Process 575 is not debuggable. Due to security restrictions, leaks can only show or save contents of readonly memory of restricted processes.
Process 575: 747950 leaks for 2294465728 total leaked bytes.
Meanwhile, my LuLu alternative to Little Snitch is barely leaking anything after running for weeks:
sudo leaks com.objective-see.lulu.extension | grep "total leaked bytes"
Password:
Process 851 is not debuggable. Due to security restrictions, leaks can only show or save contents of readonly memory of restricted processes.
Process 851: 1086 leaks for 108576 total leaked bytes.
The LuLu download page says
"Apple also broke many aspects of networking in macOS 15 (Sequoia). Until Apple releases a fix, the solution for now appears to be to disable the macOS firewall."
Presumably doing that would also work around the problem for Little Snitch?
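Tangent from the thread: the `leaks` summary lines quoted here have a stable shape, so a small Python sketch (my own, not part of any of the tools mentioned) can pull the counts out for logging or comparison:

```python
import re

def parse_leaks_summary(line: str) -> tuple[int, int]:
    """Extract (leak_count, leaked_bytes) from a `leaks` summary line."""
    m = re.search(r"(\d+) leaks for (\d+) total leaked bytes", line)
    if not m:
        raise ValueError(f"not a leaks summary line: {line!r}")
    return int(m.group(1)), int(m.group(2))

# The LuLu figure quoted above:
count, total = parse_leaks_summary(
    "Process 851: 1086 leaks for 108576 total leaked bytes."
)
print(count, total)               # 1086 108576
print(f"{total / 1024:.1f} KiB")  # 106.0 KiB
```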
I wish there were an independent unit test suite for operating systems and other proprietary software.
The suite would run the most-used apps and utilities against updates and report regressions.
So for example, the vast majority of apps on my Mac can't run because they were written for early versions of OS X and OS 9, even all the way back to System 7, when apps were still expected to run on System 4/5/6. The suite would reveal that Apple has a track record of de-prioritizing backwards compatibility and of rarely backporting bug fixes to previous OS versions.
Edit: integration test suite
You don’t need to do anything special to “reveal” that Apple doesn’t prioritize backwards compatibility. That is very well known. For example, standard practice for audio professionals is to wait a year or more to upgrade macOS, to give all the vendors a chance to fix what broke.
Even 15 years ago the common knowledge was to never upgrade to major versions of Apple software, and wait for a .2 release, at least.
However, these days it seems that even point releases only introduce new bugs in the rush to deliver late features, and rarely address existing issues.
I have to disagree. Sequoia .0 was spectacularly broken and .1 is a very noticeable improvement.
…of course I’d rather stay on Sonoma if I could go back in time…
IT departments installing MDM trashware which forces upgrades is the problem.
And the compliance-industrial complex that incentivizes/forces that behavior.
Funnily enough, that’s what the UNIX™ certification is, in some—much too limited for your purposes—sense :) See also Raymond Chen’s story of buying one of everything[1].
[1] https://devblogs.microsoft.com/oldnewthing/20050824-11/?p=34...
Eh, I agree in a sense, but I'm also ok without the same level of backwards compatibility that Windows is beleaguered by. Every new version of Windows is little more than a thin veneer of whatever they think is a popular choice for UI design that year, and with that comes a clumsy amalgamation of hugely varying settings dialogs, the classic registry, all the goop. Meanwhile on macOS, I don't expect very complex software to maintain perfect compatibility, but I can reasonably expect most of the stuff I use to carry forward 5+ years. Parallels and OmniFocus were the exceptions, but 1Password from 2012 is still kicking, Data Rescue 3 somehow still works, and I'm sure even Adobe CS6 would, even though it's from the Carbon era.
Just as well; although I loathe some of the choices Apple's made over the years, such as its own Settings app, the overall UI would be pretty recognizable if my self from 20 years ago found a time machine (pun intended). I recently bought a new Mac, and it occurred to me that it feels basically like the eMac I used in middle school all those years ago, albeit with the occasional annoyance I wouldn't have been aware of then.
Out of curiosity, I just checked, and while the CS6 installer is 32-bit, Photoshop CS6, at least, is 64-bit.
The .app icon shows the "circle slash" overlay, however, and attempting to launch it from the Finder (Sequoia 15.1 running on an Intel Mac) yields the OS-level "needs to be updated" alert without actually exec'ing the binary.
The Mach-O executable in "Contents/MacOS" loads and runs successfully when called directly from a shell prompt, however, and displays an application-generated "Some of the application components are missing…Please reinstall…" alert.
Which is actually encouraging, given that I'm attempting to run it directly from the Master Collection .dmg image without actually installing anything, which, given all the prerequisite detritus Adobe apps habitually scatter around the system when installed, I wouldn't expect to work even on a supported OS.
Less encouraging is the fact that the app-generated alert box text is blurry, suggesting the application wouldn't properly support Retina displays even if it could be cajoled into running.
Interesting experiment, thanks for the detail, I think I do still have my installers backed up somewhere, if not the actual disks.
> Less encouraging is the fact that the app-generated alert box text is blurry, suggesting the application wouldn't properly support Retina displays even if it could be cajoled into running.
This was actually the main reason I simply stopped using it (aside from not needing it professionally anymore and Adobe switching to subscription after CS6). CS6 was the last version before laptops started shipping with high-DPI screens, and Carbon (from what I understood at the time) was simply the older UI framework that was replaced as Apple switched to the more versatile Cocoa SDK. A sibling commenter suggested it was because Carbon was 32-bit only, and that seems plausible; I hadn't experimented heavily with Obj-C or Apple dev, but I'm sure the switch was a massive undertaking.
64-bit Carbon (as a port of 32-bit Carbon, which itself was an aid for porting apps from classic Mac OS to OS X) was originally loudly announced and then relatively quietly killed[1]. Not clear if any code was ever actually written, but given the announcement was at a keynote I expect that somebody, somewhere, at least judged it feasible.
[1] https://www.macrumors.com/2007/06/13/leopard-drops-carbon-64...
CS6 is after the Carbon2Cocoa effort, IIRC. No 32bit apps run on modern macOS and Carbon was infamously 32bit only.
The Android CTS is essentially this for device OEMs. https://source.android.com/docs/compatibility/cts – it's a set of tests that a customised Android implementation must pass.
Huh? In service of what? There’s not all that much inherently good about backwards compatibility, but you’re really implying that deprioritising it is a misdeed. If I wanted to use an OS that prioritised backwards compatibility more than macOS, I’d use Windows, and suffer through the downsides of that trade-off. I’m happy using an OS that balances things in a way that’s more in line with my priorities.
This isn't backwards compatibility though - the example in the post here is a major bug in an actively supported API.
Apple dropping support for old things over time is a reasonable philosophy, but Apple breaking current things unintentionally and then neither fixing nor communicating about it, primarily because they don't actively engage with their ecosystem in general, is a problematic choice on their part.
sudo leaks at.obdev.littlesnitch.networkextension | grep "total leaked bytes"
Password:
Process 310 is not debuggable. Due to security restrictions, leaks can only show or save contents of readonly memory of restricted processes.
Process 310: 314990 leaks for 967643488 total leaked bytes.
Ouch!
sudo leaks at.obdev.littlesnitch.networkextension | grep "total leaked bytes"
Password:
Process 43619 is not debuggable. Due to security restrictions, leaks can only show or save contents of readonly memory of restricted processes.
Process 43619: 2194911 leaks for 6742615664 total leaked bytes.
jesus.
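For a sense of scale, a quick Python conversion of the two leak totals quoted above from raw bytes (just arithmetic on the reported figures, nothing more):

```python
# Leak totals reported by the two `leaks` runs above, in bytes.
for process, leaked in [(310, 967_643_488), (43_619, 6_742_615_664)]:
    print(f"Process {process}: {leaked / 2**30:.2f} GiB leaked")
```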
Does Apple actually do QA? They really need to do another Snow Leopard release, where there are no new features, just bug fixes.
Definitely: They do try to avoid breaking their own first-party apps. Now third-party apps...
Just checked, I have 6.5GB of memory leak, only running Little Snitch for two days. Ouch!
Damn if only they told us yesterday before I restarted for the first time in a month. I wonder how big my memory leak would have been. I have only been online for about 11 hours (~9 of those were in hibernation) now and already at a 13MB leak.
I’ve been restarting my MacBook weekly for 2 years now. That’s far more often than I ever had to restart Windows.
I reported that to them, and at least based on the interaction, they've had no idea for the last few months. I just have to restart the apps, or the whole MacBook Air, from time to time. I had suspected lately that it wasn't their issue, since I saw others having this problem. Well… now I know. Or maybe I'm just guessing.
Yeah, I stopped using it because of that.
This must be why my system becomes increasingly unstable over time ever since I upgraded to Sequoia. I've had to reboot quite regularly.
Same for me! On my laptop, Safari will eventually stop loading web pages after a few days of uptime, guaranteed, until I reboot. I hadn't had that on my Mac Studio, but it has much more RAM to endure the leakage longer.
I generally don't sleep my macOS machines these days; as hardware has gotten faster and faster, the pain of booting up is less and less. Unless I want to be able to wake on network etc., at least.
Apple's frameworks, especially in betas, often have memory leaks.
Apple's frameworks must be perpetually in beta.
Must be all that Swift goodness they impose on us… ;)
turns out Swift is pretty difficult to use in frameworks compared to other executables
How so?
See also yesterday's "Apple’s built-in macOS firewall breaks third-party firewalls" https://obdev.at/blog/apples-built-in-macos-firewall-breaks-...
> For the time being, until Apple fixes this serious bug in macOS, we therefore highly recommend to turn off the built-in firewall of macOS when also using Little Snitch or Little Snitch Mini.
I remember back in the day when installing two firewalls or two antivirus programs on Windows would break it so badly that it had to be reinstalled. That was 20 years ago, though; one would think we'd be better at making operating systems by now.
We like to wishfully think of human systems (software, government, anything) as immune systems that accumulate knowledge in the system itself over time so that it's increasingly resilient to the systemic problems it's encountered before.
Instead, human systems require eternal vigilance from the humans inside them. Even governmental systems, which can encode knowledge into laws, rely on the eternal vigilance of judges, prosecutors, and defenders to apply that knowledge.
So GGz if you're writing a new subsystem in an OS and you're expected to learn from mistakes a team of two people made in some subsystem 20 years ago that someone quietly patched.
True, and having the benefit of hindsight, it’s easy for us to judge.
The trouble is, Apple’s feedback process is so opaque that we can never know the details. All we have is the feeling of “a simple test of macOS with a third party firewall before unleashing it to the world would have shown the problem”.
For a piece of software on which countless people rely (which macOS and iOS are), the “beta” begins after exhausting all internal means of detecting regressions and unwanted behaviour. It’s not cheap, but they can’t just dump something and expect unpaid, third-party developers to report all the bugs (while never getting a reply in that feedback app).
They can, because it’s what happens. It just sucks for those people.
I mean... sounds like we are if you only have to turn off one of the firewalls and not reinstall. I think ancient windows firewalls would routinely replace the system networking driver files, and that's why things got really messy. At least we're beyond that.
AFAIK the macOS port of OpenBSD's pf firewall is the only packet filter used by both Apple's system settings and obdev's Little Snitch. They're both GUI frontends to the same backend, but Apple supposedly also added internal "escape hatches" to their pf port because they couldn't be arsed to write/generate a proper ruleset with anchors to hook into.
The cynic in me assumes it's just teams from different silos trampling over each other in a shared code base. Given Apple's obsession with leak prevention they're probably prohibited by NDA from talking to each other.
I think this is the one that broke Time Machine for everyone with a third-party firewall:
https://news.ycombinator.com/item?id=42135148
Ouch! I have 20 days uptime and 3.2GB of memory leaks!
Eeesh.
Process 665: 874477 leaks for 2686387600 total leaked bytes.
Make it harder to use the original way, push developers to a suboptimal mechanism and deprecate the original way, then eventually deprecate and remove extensions entirely.
"See? This is why extensions are bad!"
It's 100% in Apple's culture to do so. They don't even need to do it deliberately --- just ignore the inevitable bugs that appear.
Now we know why they bumped Macs to 16GB minimum ;-)
Yet another reason why I should not be updating from Sonoma...
Kinda lousy that they tell people to open additional feedbacks when they gave no information about the behavior they are actually seeing with network extensions (leaking where? what sort of data?)
Not entirely sure why you're upset when people do exactly what Apple has been telling everyone to do for years if they want their bug fixed.
A memory leak is essentially when a program uses more and more memory because it loses track of some of it so it can't give it back to the OS. You're thinking of another kind of leak.