There are some factual "gaps" there about how good Snow Leopard was, but I understand the sentiment. As someone who's been a Mac user since System 6 and has been consistently using Macs alongside PCs _daily_ for over 20 years I can say that Apple's software quality (either in terms of polish or just plain QA) has steadily decreased.
It's just that me and other old-time switchers have stopped complaining about it and moved on (taoofmac.com, my blog, was started when I wrote a few very popular switcher guides, and even though I kept using the same domain name I see myself as a UNIX guy, not "just" a Mac user).
For me, Spotlight is no longer (anywhere) near as useful to find files (and sometimes forgets app and shortcut names it found perfectly fine 5 minutes ago), and there is no longer any way to effectively prioritize the results I want (apps, not internet garbage).
Most of the other examples in the article also apply, but to be honest I've been using GNOME in parallel for years now and I consider it to be my "forever desktop" if PC hardware can ever match Apple Silicon (or, most likely, if I want something that is _just a computer_).
> Most of the other examples in the article also apply, but to be honest I've been using GNOME in parallel for years now and I consider it to be my "forever desktop" if PC hardware can ever match Apple Silicon (or, most likely, if I want something that is _just a computer_).
I'm there as well. I've been really enjoying desktop Linux lately, but I can't go back to a non-Apple laptop at this point. There's just nothing else on the market that comes close, they all make some tradeoff I'm not willing to make - either screen, speakers, keyboard, heat/battery life/fan noise, touchpad, etc. Apple is the only one that has the entire package.
There's Asahi, but no thunderbolt yet and I'm not sure the future of that project with the lead burning out and quitting. I just want an Apple Silicon-esque laptop, no trade offs on components, that runs Linux, and there's no OEM out there that's offering that experience.
So, until that happens I'm staying on mac, and even with declining quality, it's not all that bad compared to the alternatives yet. I've learned to mostly work around/ignore the odd bugs.
Also while Apples software quality has definitely diminished over the years, Windows in the same period has utterly CRATERED. Like I get along fine with 11 for my gaming PC but with every single update one feature or another becomes notably broken.
My job gave me an expensive high-specced laptop with Windows on it. This is the first time I am stuck using Windows daily. It's W10. With Windows Defender and a bunch of windows, it starts to slowly become unusable. Today, it blue screened for me just fixing again (and again and again) the bluetooth headphones never gets automatically switched to when I turn them on. Forget about having Visual Studio open on it for an extended period of time.
Meanwhile, my 7-year old laptop with Fedora on it I type this is wonderfully snappy and stable. I started to get tempted to actually switch back to a Mac just to get some predictability and stability, but I have avoided macs for years. (And - never having to deal with constant line ending issues)
All I hear from other co-workers is how their perfectly specced laptops lag with Windows. It's freaking Stockholm Syndrome here!
Might be, still I no longer feel like baby sitting Linux on laptops as I used to, yes I had another go at it just last year I know how well it works, and I will never pay the Apple tax outside assigned work laptops.
Windows is indeed an execrable shitshow. Every aspect of it assaults the user with incompetence or outright hostility.
First is the endless badgering to log in, LOG IN, LOGGGG INNNNN with an asinine Microsoft account. If you can tolerate that and actually get the OS running, you're wading through a wonderland of UI regressions and defects.
The default hiding and disabling of options is infuriating. Try showing content from your Windows computer on a TV, for example. You plug your HDMI cable in, and you can select the TV as an external monitor in a reasonably logical manner. Great.
But wait... the sound is still coming from the laptop speakers. So you go to Sound in the system settings. Click on the drop-down for available devices. NOPE; the only device is the laptop speakers.
So you start hunting through "advanced settings" or some such BS. And buried in there you find the TV, detected all along, but DISABLED BY DEFAULT. WHY??? Not auto-selecting it for output is one thing, but why is it DISABLED and HIDDEN?
This is the kind of shit I have to talk my parents through over the phone so they can watch their PBS subscription on their TV. The sheer stupidity of today's Windows UI isn't just annoying, but it's demoralizing to everyday people who blame THEMSELVES for not being "computer-savvy" or slow learners. NO; it's Microsoft's monumental design incompetence and user-hostile behavior.
Microsoft doesn't get the relentless excoriation it deserves for its miserable user experience. There's no excuse for it.
I can't say I've ever had HDMI audio mystery-disabled when I try and use it, that's for sure a new one for me. That said the entire audio stack is an utter fucking nightmare. Selecting sound devices usually works, unless the game/software you're using either isn't set up to know about it, or isn't being told by Windows, either or. Then of course there's the fun game you play having two HDMI displays where Windows will constantly re-enable one you've disabled because it doesn't have any fucking speakers on it.
Win 11 must have some sort of contextual HDMI audio switching where it figures out exactly where you want your audio to go and then does the opposite. Because my Win 11 work laptop loves to re-enable HDMI audio and make it the active audio connection despite the fact that neither of my monitors have built in speakers.
Fedora atomic distributions in general are great. I recommend Bluefin-dx over Bazzite (they’re both GNOME-based Fedora Atomic from the same group— universal blue) for developers, because it’s really easy to install the packages that Bazzite gives you, and it comes pre-installed with Docker.
Yep Bazzite is great. But the difference between them is mostly just the packages installed. To me it’s easier to install the gaming related packages from Bazzite onto Bluefin.
I have a problem with Docker sockets while installing onto Bazzite, and didn’t care enough to look further into it.
Is it comparable to gaming on Windows? Last time I tried the performance wasn't as good for some games (Deadlock) and it took ages to compile shader (it takes 30 seconds on Windows with the same specs)
I saw long shader compile times for at least one game last month, might have been Deadlock. I have a Radeon RX 7600 & Ryzen 9 7900X3D for reference.
There is mention on the arch wiki about enabling multi-threaded compiles, but also I have read you perhaps dont even need to precompile them now and possibly get better performance as the JIT compiles via a different vulkan framework (VK_EXT_graphics_pipeline_library).
I disabled pre-caching (which effects the compile too afaict) and never noticed any stuttering, possibly past some level of compute it's inconsequential. I also noticed that sometimes the launcher would say "pre-compiling" but actually be downloading the cache which was a lot slower on my internet.
Certainly on my (very) old intel system with a GTX1060, Sekiro would try to recompile shaders every launch, pegging my system at 99% and running for an hour. I just started skipping it and never really felt it was an issue, Sekiro still ran fine.
I have some hope that Framework and AMD can fix some of those issues. Would love to try out their new desktop (because it's a simpler, more tightly integrated thing) and replace my Mac mini -- then wait for Linux power management to improve.
Linux power management is pretty good. The problem is that defaults favor desktop and server performance. On a MacBook Air 11, my custom Linux setup and Mac OS had the same battery autonomy, despite Safari being much more energy efficient.
The real problem is that, just like the grandparent post pointed out, Apple's software quality has been declining. The Tiger to Snow Leopard epoch was incredible. Apps were simple, skeumorphic, and robust.
Right now, the whole system feels a lot less coherent and robustness has declined. IMHO, there are not so many extra features worth adding. They should focus on making all software robust and secure. Robustness should come from better languages that are safe by construction. Apple can afford to invest on this due to their vertical integration.
The iron law of bureaucracy happens because humans have a finite amount of time to spend doing things. Those dedicated to bureaucratic politics spend their time doing that, so they excel at that, while those dedicated to doing the work have no time for bureaucratic politics.
It's related to why companies with great marketing and fund raising but mediocre or off-the-shelf technology often win over companies with deeper and better tech that's really innovative. Innovation and polishing takes work that subtracts from the time available for fund raising and marketing.
Great insight—thanks for sharing. It strikes me that bureaucracy is inherently self-perpetuating- once established, it rewards compliance over creativity, steadily shifting the culture until innovation becomes the exception rather than the rule.
Perhaps the real challenge isn't balancing innovation and marketing—it's creating a culture that genuinely rewards bold ideas and meaningful risk-taking.
> [Bureaucracy] rewards compliance over creativity
Imho, this is the wrong takeaway from parent's point.
Bureaucracy rewards many things that are actual work and take time. (Networking, politicking, min/max'ing OKRs)
Creativity and innovation are rarely part of the list, because by definition they're less tangible and riskier.
A couple effective methods I've seen to fight the overall trend are (a) instill a culture where people succeed but processes fail (if a risky bet fails then the process goes under the spotlight, not the person) and (b) tie rewards to results that are less min/maxable (10x vs +5%).
It seems most organizations naturally become more risk-averse as they age and grow since the business becomes more well-defined over time and there is more to lose from risky ventures. The culture has to reward meaningful risk-taking even when that risk-taking results in a loss, which can cause issues when people see the guy who lost a bunch of money getting a bonus for trying (not to mention the perverse incentives it may create).
IMHO, Mac OS X contributed decisively towards making Apple cool, which was followed by lots of boutique apps and the success of iOS. Loosing that critical mass of developers, even if it's a tiny userbase, would worry me if I was a top leader of Apple.
Apple has had a contemptuous attitude towards developers since.. the App Store? when the iPhone was out? The last two decades? They don't seem to care about this.
App Store was a big improvement for developers when it was new, relative to the alternatives.
The things it does may not seem important today, but back then even just my bandwidth costs were a significant percentage of my shareware revenue.
ObjC with manual reference counting wasn't much fun either; while we can blame Apple for choosing ObjC in the first place, they definitely improved things.
This is a ret-con. If you - as a user - were philosophically and inherently against the App Store, then it may seem that way, I guess?
The reality is that there was a long period of time where Apple built up lots of goodwill with a developer ecosystem that exceeded by many orders of magnitude the pre-iPhone OS X indie Mac developer scene.
There were many, many developers that hadn’t even touched a Mac before the iPhone came out, and were happy with Apple, and now are certainly not.
Another way to see it is that people who programmed for Mac OS already had reasons to be annoyed by Apple (e.g. 64bit Carbon). The iPhone let it get new people, who eventually found out why the pre-iPhone scene felt that way.
I disagree - if the Vision Pro had some strong use-cases then developers would hold their nose and make apps for it. The platforms that get apps are the ones where businesses see value in delivering for them. Of course businesses prefer it when making apps is easier (read: cheaper) but this is not a primary driver.
I think the potential high-return use-cases for VR and AR are (1) games, (2) telepresence robot control, (3) smart assistants that label (a) people and (b) stuff in front of you.
Unfortunately:
1) AVP is about 10x too pricy for games.
2) It's not clear if it can beat even the cheapest headsets for anything important for telepresence (higher resolution isn't always important, but can be sometimes).
Irregardless, you need the associated telepresence robot, and despite the obvious name, the closest Apple gets to iRobot is if someone bought a vaccum cleaner because Apple doesn't even have the trademark.
3) (a) is creepy, and modern AI assistants are the SOTA for (b) and yet still only "neat" rather than actually achieving the AR vision since at least Microsoft's Hololens, and because AI assistants are free apps on your phone, they can't justify a €4k headset — someone would need a fantastic proprieraty AI breakthrough to justify it.
> What’s the actual argument that will credibly convince the top leaders of Apple, to push fixing MacOS up the list of priorities?
Unrelenting bad press. People talking about nothing else but the decline of their software quality. We can already see that with the recent debacle which caused executive shuffling at the top of the company.
That shuffling was caused by Apple utterly failing to deliver a major feature, that was a key selling point for the latest generation of their hardware.
"Bad press" for their declining software quality is like people complaining there's no iPhone mini/SE anymore. Apple just doesn't give a fuck. They've joined the rest of the flock at chasing fads and quarterly bottom lines.
What was the major feature? The complete uselessness of “AI” on macOS? I updated and enabled all the AI features and I would ask Siri from my M1 and it failed every time. Would just continuously try with its annoying ping sound and never work. Blew my mind that they let this out.
Yeah I was talking about the "AI". It's such an utter failure that even Gruber has been calling it out.
It was already the same story with AirPower (the wireless charging mat). They've pre-announced it, even tried to upsell it by advertising it on the AirPods packaging. It just turned out physics is ruthless.
TBH I've been increasingly sceptical about voice assistants in the "pre-AI" era. I sold my HomePods and unsubscribed from Apple Music because Siri couldn't even find things in my library.
A few months ago, for quite a few years, Siri (in the car) would respond correctly to "Play playlist <playlist name>". Now it interprets that as of about two months ago that it should play some songs of the genre (I have a playlist named "modern").
> I sold my HomePods and unsubscribed from Apple Music because Siri couldn't even find things in my library.
I have almost the opposite problem this year. I tell the HomePod to turn the office lights on, it sometimes interprets this as a request to play music even though my library is actually empty, and the response is therefore to tell me that rather than turn on the lights.
Back in the pandemic, same problem with Alexa. Except it was in the kichen, so it said (the German equivalent of) "I can't find 'Kitchen' in your Spotify playlist" even though we didn't even have Spotify.
I think the best argument is to remind Apple that they aren't selling the OS anymore, so they don't need a new version every year. And that macOS features is not what is pushing Mac sales. People aren't buying the M series machines because of the new macOS version, they are buying it because of the hardware. The M series chips are impressive and provide some great benefits that you can't get elsewhere.
And that hardware needs to be coupled with solid software to hook and keep people on this computer. So they can take more time to create more compelling upgrades and sand off more edges.
I think they need to desync all their OS's and focus on providing better releases. There really is no benefit to spending the day updating your Mac, phone, tablet, appletv, and HomePod. Especially when there are no good reasons to update. I feel like Apple became far to addicting to habit and routine that it's become more important to keep that than deliver product. Apple Intelligence is a good example of that.
I’m pretty sure the touch target only covers the text label. Tap anywhere other than the text labels and it does nothing but close the menu. Really bizarre.
Apple is addicted to growth. It is as big as it should be, but it acts like an early stage startup always trying to build some new flashy thing to attract the next customer.
It's not Apple, it's capitalism. "Unlimited growth is the ideology of the cancer cell", yet for Apple (or any corporation), it's not good enough to sell 100,000,000 phones. Next year you must sell 105,000,000. And the year after 112,000,000 (not even 110 or your growth is stagnating).
So you get rid of removable batteries so customers have to toss their phones away more often, you gimp other feature, you spend more money on advertising than you did actually developing the product (read this bit several times until it sinks in how crazy it is, yet that's how we are with every major phone, every major movie, etc), and so on.
In 2016 RedLetterMedia did a breakdown of the movies that year, like top and bottom ten grossing movies. They stated that the advertising budget was the same as the production budget, unless they had knowledge of a different number.
I don't doubt that after 2020 the advertising budgets far outstripped the production budgets - multiple times; I am curious if that trend continues now, now that production isn't hamstrung by covid restrictions.
I'm sure everyone has seen this 100 times already but it really fits given modern advertising practice of every major company, especially in designing products to fit advertising plans.
There are also entire "industries" designed to shield people who want to find quality content from big 'A' advertising.
I love how he uses the word “craftsmanship”, something that he understood quite well (considering how close he was working with people like Bill Atkinson, Andy Hertzfeld, Burrell Smith, etc).
Today engineers have to put up a fight to do anything resembling craftsmanship.
Capitalism works this way because its customers, the investors, want it to work this way, because growth is how you get compound interest. Investors include anyone with an interest bearing bank deposit, a 401k, stocks, bonds, etc.
No growth means it would no longer be possible for an investment to appreciate.
I think of a similar thing when I see people complaining about how companies don't want to pay good wages. When you go shopping do you buy the $10 product or the $5 essentially equivalent alternative? Most people will buy the $5 one. If you do that, you're putting downward pressure on wages.
It's in your (purely economic) best interest for your wages to be high but everyone else's to be low. That's because when you're a worker you are a seller of labor, while when you're a customer you are an (indirect) buyer of labor.
Everything in economics is like this. Everything is a paradox. Everything is a feedback loop. Every transaction has two parties, and in some cases you are both parties depending on what "hat" you are wearing at the moment.
Growth isn’t necessary for high returns on equity. And it isn’t necessary for the investment to provide a return.
Equity returns ultimately come from risk premiums. (Which are small now in US equities BTW).
I’m invested in a microcap private equity fund that has returned >20-25% for years. They have high returns because they buy firms at 3-4x cashflow. You will get the high returns even with no growth. And with no increase in valuation. The returns are a function of an illiquidity premium.
With Apple explicitly, growth is expected given the valuation level. If it doesn’t grow, the share price will decline. So yes, in their case, firm is certainly under pressure to grow.
I also don’t agree with your “best interest for wages to be high and everyone else’s lower”. That is one aspect. It is more complicated. Consider Baumol Effect for starters.
I'm talking about macroeconomics, not micro. Risk premium means there is risk; not everyone gets a return at all. The entire society, as a whole, cannot experience consistent returns unless there is macroeconomic growth. If the pie is not getting bigger, someone has to be losing for someone else to gain.
Things like retirement, 401ks, etc., are society-wide institutions subject to macroeconomic rules.
The actual argument would be people voting with their wallets and moving away from the Apple ecosystem, but this something impossible at least in the USA due to these bullshit "blue bubbles"
For most of the people here they don't. In popular culture and especially among teens and non-technical twenty-somethings there's this absurd "eww green text!" thing. A blue bubble is a status symbol for some reason, even though there's lots of Android phones that cost as much as iPhones.
At this point this is not an argument anymore, it’s just a thought terminating cliche.
Expecting users to change their daily habits in order to marginally improve the operating system of a trillion dollar company feels naive and a bit disrespectful to people who actually use these machines for work.
Even developers… the vast majority of developers ignored Apple for decades (and Apple was also hostile) and it managed to grow despite that.
Might as well ask people to contribute to Gnome or whatever so in the future everyone can go somewhere better. Feels way more feasible.
But the opposite is assuming that Apple has a "responsibility" towards its existing users and has to acknowledge their expectations from them.
A sentiment which famously led Steve Jobs to respond that he doesn't understand this, because "people pay us to make that decision for them" and "If people like our products they will buy them; if they don't, they won't" [0]
So according to Steve Jobs himself, the only Apple-acknowledged way to disagree with Apple is to NOT buy their products, and by extend into the services-world of today it means STOP USING their products.
Now Steve Jobs doesn't officially run this company anymore, but I don't see any indication that this philosophy has changed in any way.
I don't think that's the opposite. The opposite is admitting that people have more than one reason to choose computers, and "voting with your wallet" only works for easily replaceable items, like groceries, clothing, etc.
Most people are not going to migrate to Android, Windows, Linux or whatever else just to make macOS marginally better.
And it's fine: marginal quality improvements of a product are not the "responsibility" of consumers.
I don't think you're taking their argument in good faith. At least my read on what's being said here is that the psyops lock-in effects that Apple uses are too strong.
It's not just "blue bubbles," but "blue bubbles" seems like a good shorthand to me. It's also things like Hand-off, or Universal Control, or getting Messages on both iPhone and Mac seamlessly, or being on the same WiFi network allowing your iPhone/Watch to work as a tv remote for the Apple TV even if you're just visiting a friend. Features that any platform can and does enable, but that do to Apples vertical can work seamlessly out of the box, across all the product lines, while securing network access in the ways most users will want, creating a continuous buy-in loop wherein the more Apple products you buy, the more incentive there is to buy exclusively Apple.
And it's a collective "you." If your entire family uses exclusively Apple products, then you'll be the only person who can't easily use eg the Apple TV in the living room, or the person "messing up" the group chats with "User reacted with Emoji Heart to [3 paragraph text message]," or the one trying to decide between competing network KVM software platforms so that you can use your tablet when your 12-yo can just set their tablet next to their laptop and get a second screen without any setup. Nevermind that these are all social engineering techniques that only exist BECAUSE Apple chose not to play nice with others, they still socially reinforce a deeper commitment to Apple products with each additional Apple product in the ecosystem.
I say this as someone "stuck in the blue bubble" with eyes open about what's going on. I'll keep picking Apple as long as they're a hardware-oriented company, because their incentives are best aligned with mine for the consumer features they are delivering (for now): consumer integration that sells hardware. It's insidious in its own way, but not like "hardware that sells eyeballs" (Google/Meta) or "business integration that sells compliance" (Microsoft).
> What’s the actual argument that will credibly convince the top leaders of Apple, to push fixing MacOS up the list of priorities?
That their own products depend on it because they developer their products in Mac. And that the professional people they pretend they cater to depend on Macs, and steadily move away.
> Robustness should come from better languages that are safe by construction.
Nahh, robustness comes from the time you can spend refining the product not from some magic property of a language. That can help but just a bit. There was no Swift in Snow Leopard. Nor there is not much Rust in Linux (often none) and even less (none) in one of the most stable OS available, FreeBSD.
They should just release a new version when the product is ready and not when the marketing says to release it.
Desktops are in S3 half the day consuming ~0 power. During use, electricity costs are so much lower than hardware costs that approximately nobody cares about or even measures the former. Servers have background tasks running at idle priority all day so the power consumption is effectively constant. Laptop and phone are the only platforms where the concept of "Linux power management" makes any sense.
My Mac mini (M1) sips ~6W idle and is completely inaudible. It acts as a desktop whenever I need it to, and as a server 24/7. I only power up my NAS (WoL) for daily backups. The rest of the homelab is for fun and experiments, but mostly gone.
"Idle" x86-64 SOHO servers still eat ~30W with carefully selected parts and when correctly tuned, >60W if you just put together random junk. "Cloud" works because of economies of scale. If there's a future where people own their stuff, minimising power draw is a key step.
Does the mini PC go from zero to eleven though? Can I play BG3, Factorio, or Minecraft on the same hardware? Can I saturate a TB3 port? Transcode video? Run an LLM or text2img? Any of that while remaining responsive, having a video call?
If I already need a powerful machine for a desktop, why would I need a second one just so it can stay up 24/7 to run Miniflux or Syncthing? Less is more.
I want Mac hardware but Linux software. The other makers build quality is horrendous. Especially in the 13inch segment which is my favorite.
Using a pretty old laptop because there is no replacement right now.
The new Ryzen AI looks really interesting!
Sadly there is no Framework shop for me to look at it and they not ship to Japan..
Thinkpad line from Lenovo. Amazing build quality, and you can order them with Linux.
I have a P1 Gen 7 and it’s fantastic. It feels premium, and it’s thin, light, powerful, has good connectivity and 4K OLED touch screen. I’d take it over Mac hardware any day.
Aren't the only Thinkpads with displays in the 4k neighborhood 16-inches? The 14-inch Macbooks are 3024*1964 and have all been like that for a while. I don't know why the PC world (and Linux ready by extension) undervalues high DPI so much, because it makes it hard to consider going back.
The screen keeps me on macbooks as well (well, and the touchpad, the speakers, and the lack of fan noise).
But it is baffling how 1920x1080 (or 1200p) are still the "standard" elsewhere. If I want an X1 carbon, the best screen you can get at 14" right now is 2880x1800 (2.8k). Spec it with 32GB of RAM and it's clocking in at $2700, for a laptop that still has a worse trackpad, worse sound, and worse screen than a 14" MBP at $2399. And the Ultra7 in the thinkpad still doesn't beat the Mac, and it'll be loud with worse battery life.
There truly is nothing else out there with the same experience as an Apple Silicon MBP or Air.
So, my only options for the foreseeable future is wait for Asahi Linux, or suck it up and deal with macOS because at this rate I don't think there will ever be a laptop with the same quality (across all components) of the mac that can run Linux. The only one that came remotely close is the Surface Laptop 7 with the Snapdragon elite, but no Linux on that.
Non-Thinkpad Lenovos have some standouts too. I'm running Debian Stable on an AMD Yoga Slim 7 from a couple of years ago and sure, it's not an Apple, but for the £800 or so I paid for it, it's a really polished machine. Loads of ports, and it's approximately performance-competitive with a Dell XPS13 from about the same time that cost literally twice as much.
The one snag I ran into was that when it was new, supporting the power modes properly needed a mainline kernel rather than the distro default. But in the grand scheme of things that's relatively trivial.
I have an M1 Macbook Pro from work and honestly I'm not tempted to get one for myself. I am tempted by the M3 and M4 beasts as AI machines, but as form factors go I'm just not sold.
The biggest issue Framework have right now is shipping. I can order a ThinkPad practically anywhere. No so with Framework - they are literally leaving money on the table from what I would assume their core segment: affluent tech savvy users trying to get off the planned obsolescence cycle.
Tell me you're from the US without telling me you're from the US.
Jokes aside, I had to wait years for Framework to finally allow shipping via a friend in Berlin. I think they ship to Sweden now—they seemed to have an unfortunate misunderstanding that they needed to produce a Swedish keyboard and translate their website before shipping here, which of course is poppycocks.
I am pretty sure that if you have reached the point that you are ordering a laptop online from a brand unknown to the general public, it means you are past the point you need the actual physical keys to match your keyboard layout on your OS settings. You could just have blank keys.
Out of curiosity, what are you basing this on? From having spoken to people who manage IT fleets, and being the person regular people ask for advice for what device to get, with the occasional exception (which Apple also had plenty of, cf. the butterfly keyboard), you get what you pay for. A 1k-1.5k+ Asus/Dell/HP/Lenovo will get you decent and good build quality.
> A 1k-1.5k+ Asus/Dell/HP/Lenovo will get you decent and good build quality.
And it still won't be on par with a $999 apple silicon air, or a MBP.
I've deployed latitudes, precisions, and thinkpads. They all still make tradeoffs that you don't have to deal with on the mac.
The X1 carbon is probably the "best" but, even with that - you are still getting a 1920x1200 screen unless you spend more than a MBP for the 2.8k display (which is still less than the 14" MBP, and costs more than an equivalent specced M4 pro). The trackpad is worse, the speakers are worse, battery life is worse, and they're loud under load.
They're all fine for a fleet where the end user isn't the purchaser, which is why they exist, but for an individual that doesn't want tradeoffs (outside of the tradeoff of having to use macOS), there's no other option on the market that comes remotely close to the mac. For someone that wants Apple silicon MBP level hardware but wants to run Linux, there are zero options.
The screen is the most egregious tradeoff though, the PC world is still adverse to HiDPI displays and even on high end models 1080p or 1200p is still the standard. I can excuse poor speakers, it is a laptop after all, if I really had to I can deal with fan noise, but I shouldn't have to spend more than a MBP to get a decent 120hz HiDPI screen with sufficient brightness and color accuracy.
At work, our Windows devs use expensive XPSs that are complete crap failing constantly, both hardware and software. As someone who used Latitudes and Precisions when these were the reliable workhorses you seem to describe, the new stuff is just outrageous. (My personal laptop is still an e6440).
My work machine is an M2 Pro MBP and except the shitty input HW (compared to the golden era of Thinkpads/Latitudes without chiclet keyboards) and MacOS being quite bad compared to Linux, it completely trounces the neighbouring Dells that constantly need repairs (mostly the USB-C ports and wireless cards failing).
Maybe if you run a fleet that's statistically true. If you're a regular person you can have incredible bad luck with specific models.
Got two "2k" Lenovos at 4 year intervals.
The first one worked fine but that model was known to have a weak hinge. Had to replace it three times.
The second one had a known problem that some units simply stop working with the internal display and the only solution is replacing the motherboard. My unit worked about a week for me. Seller refunded me instead of repairing because it was end of the line and they didn't have replacements.
Got a "2k" Asus ordered now, let's see how that goes :)
Compared to that, even the one emoji keyboard macbook pro that i had worked for years. The keyboard on those models is defective by design and kept degrading, and I still think Cook should take his dried frog pills more regularly, but the rest of the laptop is still working. Not to mention my other, older apple laptops that are still just fine(tm), just obsolete.
I think price isn't the only thing. PC gaming/consumer laptops lean pretty heavily on price to performance ratios and I think they cut build quality to do it. Business lines like Thinkpad/EliteBook tend to offer worse performance dollar for dollar but they are built better.
Consider a thinkpad or lenovog yoga pro. I don't think the difference is is that pronounced anymore, maybe it never was, but you always need to look at the premium segment. Somehow people end up comparing budget pc laptops and macbooks.
Yeah, I heard good things about it. I do a lot of gamey development stuff and x64 makes that easier. But Asahi seems to be catching up a lot recently, maybe I should look at it again! https://news.ycombinator.com/item?id=41799068
Asahi is an adventure. I am in the same camp where I got a MacBook for the hardware, but am really a Linux guy. I got really excited when the fex/muvm patches came out for Asahi, and switched to mainly booting it for a couple months. 80% of what I needed to do worked, but that 20% still wasn't there. It was mainly the little things too:
1. Display output from USB-C didn't work
2. Couldn't run Zotero
3. Couldn't compile Java bioinformatics tools
4. Container architecture mismatches led to catastrophic and hard-to-diagnose bugs
There were things that worked better, too (better task management apps, and working gamepad support come to mind). Overall, even though I only needed those things once or twice a week, the blockers added up and I erased my Asahi partition in the end.
I really appreciate the strides the Asahi project has made (no really, it's tremendous!), and while I would love to say that Linux lets me be most productive, features like Rosetta2 are really integrated that much better into MacOS so that I can't help but feel that Asahi is getting the worst of both worlds right now. I'll probably try again this summer and see what has developed.
What dou you mean with more integrated? It is a regular desktop PC with an apu (like is totally common for office PCs, just bigger) and soldered instead of upgradeable ram.
It would be kind of funny, but also very sad, if Apple guys mistook the copying of apple's worst behaviour - producing throwaway devices - as a sign of quality. Though I think we are there for years now with phones, I wouldn't expect such thinking here.
It is "integrated" in the way that the processor is an APU that has specific memory bus requirements. That's all. It is not an integrated software-hardware system that is finetuned, and that board is not any better than a a generic motherboard would be for a regular processor.
My point is that this system is not integrated in the way apple fans usually define the word. I'd claim it is not integrated at all. It is a regular PC (but with soldered ram), which is exactly like framework announced it.
There should be no need to sprinkle some apple marketing bs on that to make it attractive.
I really wish everyone would stop entertaining these borderline crackpot hypotheticals that all rely on the notion of “those damn Apple dummies not getting it!”
As someone who actually studied human computer interaction, and since I had to work with borderline unuseable macs multiple times in my career now, plus as someone seeing the utter failure of relatives in just using an iPhone (bought since "it is so much easier", now not even able to call from the car system since it is so buggy), the Apple popularity is absolutely a case where you have to look at external factors like social status. And if that translates to "the users are dummies" to you, then that's your interpretation. Plus yes, translating marketing/status concepts like a bogus "integrated" status absolutely is interesting, thus my intent to clarify whether that is really happening here (plus some criticism, admittedly).
Probably not worth it going further into this though, it will only derail.
As a former Mac user, I'm really happy with my System76 linux laptops. The only tradeoff is the terrible built-in speakers. My Lemur is lighter and has better battery life than my Macbook Air and has been bulletproof despite my ill treatment. Each of my Macs, however, have had various hardware failures or the famous keyboard recall on the horrible touchbar Macbook Pro. I also prefer matte screens to glossy, so that's a win for me, but ymmv.
* Battery life is a lie, especially since it drains almost as much battery closed as it does open.
...
Overall, I think I am probably going to switch back to a macbook after this, not being able to go a day without charging and your laptop always being on low battery is a bit anxiety inducing.
This really is exactly how I feel. There are too many tradeoffs to switch to non-Apple hardware at this point. I'd love to run Linux/BSD full-time, as many of the apps that I frequently use on my Mac are FOSS (e.g., R, PyCharm, darktable, etc.) I've been a Mac user since 2002, and Mac OS X served as my gateway to the Linux/BSD world (that, and a short-lived use of RH 6.2 on an old Dell laptop). IMO, macOS really does need a Tiger/Snow Leopard-esque release, but I'm not sure the vast majority of macOS users would even appreciate such a release.
Not quite what you're after but if you want a fanless option that runs full linux and doesn't use much battery, the new argon 40 CM5 laptop that's being built looks like it could be viable as long as you'd be happy with that much of a drop in performance and a few pi based niggles (No USB C video, only one pcie lane for the SSD, etc.)
Your loss. I haven't been able to tolerate the MacOS experience since Catalina, running GNOME with a Magic Trackpad has felt head-and-shoulders better for the past 3 years at least. Apple Silicon is neat but was never an option for native development in my workflows anyways. The software matters more to me, and MacOS has been sliding down the subscription slopware slope for years now.
I am perfectly happy to use last-gen hardware from Ebay if it runs an OS that isn't begging me to pay for subscriptions and "developer fees" annually. My dignity as a human is well worth it.
The reason that keeps me on Windows, is that you left out of your list gaming and 3D graphics on laptops.
Metal isn't really on pair with Vulkan and DirectX in terms of relevance for graphics programming, the M chips aren't up to NVidia ecosystem, SYCL, the two major compute APIs for any kind of relevant GPGPU workloads, and thus don't really matter.
And gaming, well, even though all major engines support Metal, there is a reason DirectX porting kit is now a thing.
So why pay more for a lesser experience, and then there is the whole issue macOS doesn't have native support for containers, like Windows does (their own ones), and WSL is better integrated and easier to use than Virtualization Framework.
Gaming is pretty great on Linux now. I just finished a little Elden ring session and it still blows my mind that when I close the game my Linux desktop is there behind it. No more dual booting, hopefully will never need windows for anything ever again.
You mean translating Windows and DirectX APIs is great, there is hardly a Linux gaming ecosystem.
Proton is the acknowledgment of Valve's failure to entice game studios, already targeting Vulkan/OpenGL ES/OpenSL on Android NDK, Switch (which has OpenGL 4.6/Vulkan support), or on PlayStation (Orbis OS being a FreeBSD fork) to target GNU/Linux.
There’s no such thing as “native”, all the things you’re talking about are translation layers for hardware instructions themselves, and the overhead for doing software based translation is significantly less than hardware accelerated virtual machines- and we as an industry love those.
The reason for this is because the translations are very cache friendly and happen in userland, so the performance impact is negligible, and the scheduler on Windows is so poor compared to Linux ones that it’s even common for games to perform better on Linux than on Windows.. Which is crazy when you consider the difference in quality of the GPU drivers.
I understand that you want it to “just work”, but that tends to be the experience anyway.
You can do what you want, it’s your life, but this is not a terribly good excuse. Valves “failure” is essentially rectified.
I will add though, that it’s actually Stadia that made linux gaming the most feasible, many game engines (all of the ones I worked on) were ported to Linux to run Stadia, those ports changed essential elements of the engine that would have been slow or difficult to translate; so when Proton came around quite a lot of heavy lifting had gone away. I only say this because Valve gets some credit for a lot of work our Engine programmers did to make Linux viable.
> and the scheduler on Windows is so poor compared to Linux ones that it’s even common for games to perform better on Linux than on Windows..
I play most of my games in a window and switch away a lot. A million years ago when I was still playing world of warcraft, the system overall was much more responsive on the same hardware with wow on wine on linux than with wow natively running on windows :)
> it’s actually Stadia that made linux gaming the most feasible
Stadia was the most predatory gaming offering aside from IAP games, sorry. Buy your games again on top of the subscription? Lose them when Google cancels the service? No thanks.
Nvidia's GeForce Now was a lot more honest. Pay for the GPU and streaming, access your owned games from Steam. I'm not using it any more so I don't know how honest they still are, but I did for like a year and it was fine(tm).
The fact that Stadia advanced wine compatibility is great, but technical reasons aren't the only reasons that make a service useful to your customers.
OP is talking about Google (Stadia) throwing money at the problem and incentivizing game engine companies to better support Linux. They’re not talking about pro or anticonsumer the tech was.
So how many of those ported game engines are actually making a different on GNU/Linux gaming today?
There is certainly such thing as native, one thing is the platform where the APIs were originally designed for and battled tested, and the other is other platform emulating / translating them, by reverse engineering their behaviours with various degrees of success.
Valve's luck is that so far Microsoft/XBox Gaming has decided to close an eye on Proton, and it will run out when Microsoft decides it has gone long enough.
Specifically, I worked on those games so I know what they natively support and how things transpired behind the scenes.
Proton has absolutely no hope of working without the changes we made because of stadia, the code we wrote was deeply hooked into Windows and we made more generic variants of many things.
The Division 1 PS4 release was significantly shimmed underneath compared to the win32 and xbox releases: this became much less true over time as porting the renderer to linux (specifically debian) made us genericise issues across the OS’s and when Div2 shipped we had a lot more in common across the releases; we didn’t rely on deep hooks into Microsoft APIs as much
> this became much less true over time as porting the renderer to linux (specifically debian)
Strange how you ported the renderer to Debian, and yet you couldn't even find a link to a game that has a native Linux support.
Was there ever a port?
> Proton has absolutely no hope of working without the changes
You keep saying this as the absolute truth, and yet at the time when Stadia launched Proton already had 5k working games under its belt.
Strange how Stadia is this monumental achievement without which Linux gaming wouldn't happen according to you.... and yet no one ever mentions Stadia ever contributing any code to any of the constituent parts of what makes Proton tick. Apart from the changes that engines supposedly made to work on a yet another game streaming platform.
I don’t know how to say this without being unkind.
There is a functioning version of The Division 1, Division 2, Avatar and Star Wars outlaws that run on Linux internally at Ubisoft.
Nobody will release it because it can’t be reasonably QA’d. (Stadia was also very hard to QA, but possible, as it was a stable target and development was essentially funded).
I’m not sure what your problem is; I said - as clearly as I can - that architectural changes to the engine were neccessary for proton.
I know this, for an absolute fact, because Proton was a topic when I worked on those games and it was not until Stadia (codename Yeti) was on the roadmap, and our rendering architect lost all his hair working on it - that Proton started to even function slightly.
I’m not shilling for Stadia - there’s nothing to shill for, it is dead.
Get over yourself, if you don’t like the truth then don’t start going in on me because my reality does not match your fantasy. Sometimes corporations do things accidentally that push other things forward unintentionally.
I just want to share my thanks to Stadia because I know for a concrete fucking fact that at least some of the AAA games I worked on would not function at all on Linux without Stadias commercial interference.
All I'm saying is that "it’s actually Stadia that made linux gaming the most feasible" statement is at best contentious because in reality gaming on Linux was already made (more) feasible when Stadia had only just launched.
And Stadia used the same tech without ever giving back to Proton at all (atl least nothing I can quickly discover). So the absolute vast majority of work on Proton was done by Valve which you dismissed as "when Proton came around" (it came around before Stadia) and "quite a lot of heavy lifting had gone away" (Valve did most of the heavy lifting).
That's the extent of my "problem".
> at least some of the AAA games I worked on would not function at all on Linux without Stadias commercial interference.
So, not "actually Stadia that made gaming feasible on Linux" but "because Stadia used all the same tech, and there were possible commercial incentives early on until Google completely dropped the ball, bigger studios also invested in compatibility with the tech stack"
Stadia did a lot to help by being a stable target and by being seen as commercially viable. Google also helped a lot to aid developers, not just financially.
That they didn’t contribute code to proton doesn’t factor at all, I just hate to see people not get their dues for their part in the prolification of Linux gaming- because I saw it first hand.
You are labouring under the delusion that I’ve implied Proton did nothing, no, they levied a lot of existing technology and put in a lot of polish. They were helped by Stadia, by Wine, by DXVK and others.
They didn’t do it alone, that doesn’t minimise their contribution, it contextualises them.
Also: Stadia ports of games were native, they did not use proton- it was architecture changes of the games themselves that made proton work better- not Google making proton itself function better.
That proton was running some games is a weird revisionist take, very few AAA games ran at all, those that did were super old and there was always some crazy weird bugs- proton got better but also AAA games coalesced into conforming to linux-y paradigms underneath better- so support got better much quicker than expected. You can even see this if you track the “gold” released games over years, some of the worst supported games for Proton are from 2015-16; before stadia but after game complexity started rocketing up with next game engines of the day.
Hope that helps, because honestly this conversation is like talking to a brick wall.
> They didn’t do it alone, that doesn’t minimise their contribution, it contextualises them.
Oh, you very much minimised their contribution. From "when Proton came" (again, Proton came before Stadia) to "Stadia made gaming feasible on Linux" (when Proton made it feasible before Stadia)
> Also: Stadia ports of games were native, they did not use proton- it was architecture changes of the games themselves that made proton work better- not Google making proton itself function better.
So, Stadie games were Linux ports. But as a result of this there are still literally no Linux ports. None of the tech behind Stadia ever made it back into software behind Proton. And "native stadia ports" are somehow responsible for more games that target Windows and DirectX to run better via Proton
> That proton was running some games is a weird revisionist take
Funny to hear this coming from a revisionist. I literally provided you with links you carefully ignored
--- start quote ---
A look over the ProtonDB reports for June 2019, over 5.5K games reported to work with Steam Play
> You can even see this if you track the “gold” released games over years, some of the worst supported games for Proton are from 2015-16; before stadia but after game complexity started rocketing up with next game engines of the day.
Or because the actual heavy lifting that Valve did with Proton paid off, and not the nebulous "native ports" and code that never saw the light of day.
> because honestly this conversation is like talking to a brick wall.
Unreal can target Linux, sure, but not all of the plugins you might use will, nor any of your own plugins.
Unreal is almost worse because their first party tools (UGS, Horde) will not work on Linux, so you have to treat linux as a console, and honestly the market share isn't there to justify it.
Speaking from experience, Helldivers 2 and Monster Hunter Wilds both ran better on Linux from day one before any special fixes and still do - I'm not sure what "original design and battled testing" is worth or good for if the underlying Kernel and/or OS is a mess.
Stadia's impact on gaming in general is next to zero. And given that the vast majority of gaming on Linux is happening via Proton, its impact on gaming on Linux is similarly next to zero.
What games have you made to justify this statement?
I worked closely with productions using proprietary game engines, I feel qualified in stating that Stadia had an outsized impact on our development process in a way that helped proton succeed.
That you don’t see it as an end user, is exactly my point.
You don't have to be a chef to judge what's coming out of the kitchen.
What is the objective impact of Stadia which at its height had a whopping 307 titles [1]? At the time of writing ProtonDB lists 6806 titles as "platinum, works perfectly out of the box" and 4839 games as "gold, works perfectly after tweaks". Steam Deck alone has almost 5x the number of games with "verified" status [2].
What games are being made for Linux thanks to Stadia, and don't just target DirectX and run through Proton? How many Stadia games were ported to Linux thanks to Stadia?
Also, to put things into perspective. Proton was launched in 2018. Stadia was launched in 2019.
In 2019 there were already over 5000 games that worked on Proton. [3]
In 2022 there already were more games with verified status for Steam Deck than there were games for Stadia, and 8 times more games verified to work by users [4]. Stadia shutdown was announced half a year after the article at [4].
Stadia had zero impact on gaming in general and on gaming on Linux in particular as judged by the results and objective reality. Even the games you showed as examples don't support Linux, only target Windows, and are only playable on Linux through Proton [5]
> I feel qualified in stating that Stadia had an outsized impact on our development process in a way that helped proton succeed.
> That you don’t see it as an end user, is exactly my point.
It's strange to claim things like "when Proton came along" when Proton was there before Stadia and already had over 5k games working in the year when Stadia only just launched.
It's strange to claim outsized impact on development process when there are no outcomes targeting anything even remotely close to Linux development, with studios targeting Windows as they have always done.
It's strange to claim Stadia had outsized impact when none of the work translated into any games outside Stadia. When Stadia did not contribute any significant work to the tech that is running Proton. In 2022 they even started work on their own emulation layer that went nowhere and AFAIK never contributed to anything [6]
It's strange to claim that "it's actually Stadia that made Linux gaming feasible" when there's literally no visible or measurable impact anywhere for any claim you make. Beyond "just trust me".
You are literally arguing that your ignorance is as valid as my experience. And you’re arguing that you didn't see the impact; which was kinda my entire point - there was impact beyond what was visible that propelled Proton forward.
You don’t know how the sausage is made just because you ate a hotdog.
Maybe you should consider things more carefully before making yourself look like an idiot on the internet and simultaneously raising my blood pressure.
Strange take. Proton is an acknowledgment that the windows apis are the de facto for gaming. Not sure why the runtime matters. Some games ever run better. Not sure why that’s not the ”real deal” but whatever I’m glad you’re happy with your spyware gaming OS.
If you're playing the likes of Fromsoft/Resident Evil/Kojima games on a PC, be it Windows or Linux, you're not playing on the platform those games were designed for.
The problem with your reasoning is that Windows/PC doesn't need to emulate Orbit OS and LibGNM, Sony also supports Direct X and Win32 directly on their engines.
"supports" as in I see articles in the PC gaming press about technical problems with From/Kojima games a year after I've finished said games on console with zero issues.
It's not really a failure.
Linux distribution and diverse ecosystem brings a level of complexity.
The only way to support it long term is to either having your team continuously update and release builds of the game to cater for that which is an impossible task to ask for a lot of studios.
The initial approach of runtimes did help but it's still has its limitation.
If now a studio just need to test their game under a runtime+proton the same way they would test a version of Windows to ensure it's working under Linux it's a win/win situation. Proton becomes the abstraction of the complex and diverse ecosystem of Linux which is both its strength and weakness.
Another solution would have been everybody using the exact same distribution which would have been way worse in my opinion.
And who knows, maybe one day Proton/Wine would be the Windows userland reference and Windows would just be an implementation of it :D
That’s not the goal though. The goal is to play games on Linux. If Valve’s goal was to end up with a Linux-specific graphics api for most games that run on Linux then they provably would have tried to do so.
When I was gaming on Linux, every game with a native version worked better using the Windows version in proton. I think the only exception was Factorio.
Gaming/WSL kept me on Windows for a lot of the last decade, however after Windows 10 became EOL'd and Windows started turning into ad/spyware I finally gave it up over a year ago after 25+ years on
Windows Desktops.
Anyway Linux is liberating, Fedora Desktop is great, no ads in the OS, a Software Store/Installer I actually like to use, curated by usefulness instead of scam Apps. All my Windows Steam Games I frequently use just worked, I have to login to X11 for 1 title (MK11), but everything else runs in the default Wayland desktop. Although I'll still check protondb.com before purchasing new games to make sure there'll be no issues. Thanks to Docker, JetBrains IDEs and most Daily Apps I use are cross-platform Desktop Web Apps (e.g. VS Code, Discord, Obsidian, etc) I was able to run everything I wanted to.
The command-line is also super charged in Linux starting with a GPU-accelerated Gnome terminal/ptyxis and Ghostty running Oh My Zsh that's enhanced with productivity tools like fzf, eza, bat, zoxide and starship. There's also awesome tools like lazydocker, lazygit, btop and neovim pushing the limits of what's possible in a terminal UI and distrobox which lets me easily run Ubuntu VMs to install experimental software without impacting my Fedora Desktop.
Image editors is the one area still lacking in Linux. On Windows I used Affinity Designer/Photo and Paint.NET for quick edits. On macOS I use Affinity & Pixelmator. On Linux we have to chose between Pinta (Paint.NET port), Krita and GIMP which are weaker and less intuitive alternatives. But with the new major release of GIMP 3 and having just discovered photopea.com things are starting to look up.
I hardly find anything interesting about command-line, I grew up in a time where the command line was the only way to interact with home computers, it fails on me the interest on staying stuck in up to early 1980's computing model.
Xerox PARC is the future many of us want to be in, not PDP-11 clones.
Weird flex, most commands, utilities, server software and remote tools are going to be run are going to be from the command-line. All our System Administration of remote servers uses the command-line as well since exclusively deploying to Linux for 10+ years.
Sure you can happily avoid the command-line with a Linux Desktop and GUI Apps, although as a developer I don't see how I could avoid using the terminal. Even on Windows I was using WSL a lot, it's just uncanny valley and slow compared to a real Linux terminal.
> Weird flex, most commands, utilities, server software and remote tools are going to be run are going to be from the command-line.
It's not a weird flex. Weird flex is this: "The command-line is also super charged in Linux starting with a GPU-accelerated Gnome terminal/ptyxis and Ghostty running Oh My Zsh" and then listing a bunch of obscure personal preference tools that follow trends du jour.
That’s not a flex, it requires no skill to install software, they’re just some of the better tools you can install to boost productivity in Linux terminals. I doubt they’re obscure to any Linux CLI user who spent time on improving the default OOB UX of bash terminals.
And you just alias them, so you can keep using the core utility names to use them.
There are huge interoperability advantages to CLI and TUI tools. Composing them, using script(1) on them, etc, are much simpler than the same for GUI tools. They are also much easier to rapidly iterate on.
GUIs are very useful but they are not clearly better (or worse) than CLIs.
Gaming on windows is fine, but there's no reason to use windows for anything else. Dual boot to linux for a better desktop and none of the crud that Windows 11 has in it.
I haven't been gaming since when there was huge gap between graphical possibilities and actual design (that is beginning of 3d era) - so I do not miss that. However I can see the decline in macOS, like pushing for 'apple intelligence', more and more restricting gatekeeper, iOS-ification of desktop (ie.: mentioned system settings), constant connections to AWS, etc.
But since I'm not gaming I cannot imagine going back to Windows. On the other hand I'm quite enjoying Linux...
> So why pay more for a lesser experience
...however, with few exceptions, I haven't used mouse in decade... and I haven't found anything like MBP's touchpad yet. Maybe I just need to do better research.
My ideal laptop would be the macbook trackpad, monitor and battery life stuck inside any thinkpad. Or just anything non MacOS, even Windows, in the macbook. I despise MacOS with every fiber of my being, but the hardware is damned good.
Apple's software quality (either in terms of polish or just plain QA) has steadily decreased
I think the decline of software went hand-in-hand with the decline of the native indie Mac app. They still exist, but when I started with the Mac (2007), there was a very rich ecosystem of native Mac apps. Most stood head and shoulders above their Linux and Windows counterparts.
Apple has nearly destroyed that ecosystem with: race-to-the-bottom pricing incited by the App Store; general neglect of the Mac platform (especially between ~2016 and Apple Silicon); and a messy reactionary toolkit story with Catalyst, SwiftUI, etc. The new toolkits seem to imply that Apple says that it's the end of AppKit, but most SwiftUI applications are noticeably worse.
With their messy toolkit story and general neglect, developers have started using Electron more and more. Sure, part of the popularity is cost savings, since Electron apps can be used on multiple platforms. But part of it is also that a Catalyst or SwiftUI app is not going to provide much more over an Electron app. They will also feel weirdly out of place and you become dependent on Apple working out quirks in SwiftUI. E.g. 1Password tried SwiftUI for their Mac app, but decided in the end that it was an uphill battle and switched to Electron on Mac instead.
I recently bought a ThinkPad to use besides my MacBook. Switching is much easier than 10 or 15 years ago, since 80% of the apps that I use most frequently (Slack, Obsidian, 1Password, etc.) are Electron anyway. Even fingerprint unlocking works in 1Password. I was vehemently anti-electron and still don't like it a lot, but I am happy that it makes moving to a non-Apple platform much easier.
I think most of this is just downstream of the Mac being eclipsed by the iPhone in terms of Apple’s revenue. The Mac just isn’t critical to Apple’s business like it was in 2009 when Snow Leopard came out. They would have started development on SL in 2008, when the iPhone was still a fairly niche product and there wasn’t even an App Store.
Now, ios gets the executive attention and it will generally get the best developers assigned to it, and the Mac has to live with the scraps.
Yeah I think this is the one, in terms of number of users, revenue, etc. The iPhone is more than 50% of their revenue, Mac is only ~8. Lower volume and higher price, but it doesn't come anywhere near their phone. Same with tablets, although they share an app revenue income stream with the iphone which makes up for the difference in hardware sales.
> I recently bought a ThinkPad to use besides my MacBook.
I'm on the same boat here. Something is driving me away from my MacBook M1(Pro? Don't even know). I have a gut feeling that it's macOS but can't really put a finger on it yet.
Bought a heavily used ThinkPad T480s (from 2018) and replaced almost every replaceable part of it, including the screen. Being able to replace many parts easily is a nice touch since I am using MacBooks since 2007 exclusively. Guess that's why I somehow overdid it here.
Slammed Pop!_OS 22.04 on it and I'm very pleased with the result. The first Linux desktop I actually enjoy since trying SuSE 5-something.
Pain points are teams (running in browser), bad audio quality with AirPods when using the microphone and cpu speed and heat. I guess one has to stop using Apple silicon in laptops to realize how amazing these processors are.
Intel CPUs from that era were quite bad and everyone has upped their ante since then. I was thinking about getting a second hand from ~2021-2022, but my wife convinced me to get a new one, so I got a Gen 5 T14 AMD. It has a Ryzen 7 Pro 8840U and I rarely hear the fans, mostly only when Nix has to rebuild some large packages (running NixOS unstable-small).
1Password had a beautiful native Mac app that works to this day. Even assuming SwiftUI is actually bad, why did they have to migrate at all? What was wrong with the existing app?
I'm not disagreeing with the opinions on Apple software quality, but I think the 1Password case is more down to their taking of VC money and having to give (JS) devs some busywork to rebuild something that worked perfectly well.
1Password is also now subscription only and online only. Gone are the days of a forever license and fully offline encrypted database allowing for 3rd party syncing via iCloud or others. The death of their old app went hand in hand with their race to the bottom subscription payment VC backed ecosystem. It's only time until they suffer a breach like everyone else.
Regarding Spotlight, one thing that started happening for me on Sequioa was that Finder and other apps started getting very slow to react to file changes. For example, I can save a new file to a directory, and the Finder window takes maybe 10-20 seconds before the file shows up in the list. If I navigate to a different folder and then back, the file is there. I notice the same delay in apps like IntelliJ.
I could be wrong, but apparently Spotlight is the service that drives this kind of file system watching. I think macOS has a lower-level inotify-style file system event API, which should be unaffected, but Finder and these other apps apparently use Spotlight. I really wish I had a fix, because it's just crazy having to constantly "refresh" things.
My favourite feature is when spotlight tells me that indexing is paused when I am searching for something.
You went through the effort to show some UI when something I am looking for may not be there because indexing is paused... but you didn't think to just unpause the indexing so that I can find it? I feel like I am being spit on, "Yeah, you not finding what you are looking for? I know, I'm not even trying"
I highly recommend using Alfred. I’ve been using it since before Spotlight came out, tried and then disabled Spotlight, and went back to Alfred. It’s extremely configurable but highly usable out of the box. Sort of like creating your own CLI shortcuts to open files, apps, copy things to the clipboard, etc.
I still use Quicksilver[1], the open source app that long predates Alfred and was the inspiration for it. I tried Alfred a few years ago but didn't see anything compelling enough to switch. Am I missing anything?
This KILLS me. It's so frustrating. APFS is supposed to be great at deduping files and such, but in practice it seems like it really sucks. It's bad at both saving a file to the desktop and dumping a million npm files into a directory.
Same here. Spotlight used to be my everything, i.e. I never use the dock I would always use spotlight to launch applications or navigate to folders. Now it is littered with internet garbage, takes seconds to even return any results, and the results are always useless.
Who the hell thought integrating internet search is a good idea - because "aösldkfjalsdkfjalsdkfj" just as everything else is a valid search result in Spotlight now showing me "Search for aölsdkfjöalsdfjasdlfkj in Firefox"...
Spotlight was never useful, because of an absurd and glaring design defect: It doesn't show you WHERE it found stuff. There's no path shown with hits. Same blunder in Finder's search, and you can't even optionally add "path" as a column. WTF.
So... when the hits include six identically-named files, you can't eliminate ones that you know are wrong (on a backup volume or whatever). The level of stupidity here is just mind-boggling.
Where? And how is that option displayed to the user?
I also just tried it in Spotlight and Finder, and it did nothing. Which I consider a relief, because undiscoverable bullshit is worse than the feature not existing.
macOS and iPadOS are full of those undiscoverable "if you do this combination of buttons/swipes while at full moon, something happens". As a Mac user not by choice (work issued) I hate how impossible to discover these are.
As a Mac/iOS/iPadOS user it seems that it’s almost mandatory to watch each Keynote / product announcement video if you want to keep up with new features. Lots of cool features that I only knew about by watching those videos that are completely undiscoverable otherwise.
These kinds of shortcuts are part of Apple software as a whole, and apparently have been a thing since at least OSX. These behaviors were supposed to be covered in the documentation, but I don't know how true this is nowadays.
Special mention to all text input fields in macOS having Emacs-style shortcuts.
It goes back further than that. I remember being able to buy key-combo cheat cards for System 7, and I have no reason to think the shortcuts they covered wouldn't also have been present in System 6.
I agree that discoverability could be better, but macOS has pretty consistently had hidden power user shortcuts and modifiers, to keep the basic workflow streamlined/simple for those who don't need it.
Seeing where stuff is in a search function is not a "power user" feature; it's the whole point of what you're doing.
And I don't buy the "keeping things simple" excuse for secret hotkeys in other areas. Falling back on gimmicks like undisplayed hotkeys and "long presses" and "gestures" is lazy abandonment of the design task.
I hate this "saving the user from complexity" lie. It's hypocritical: The "non-power" user isn't going to go looking for these options in the first place.
Finder search is a great example. A "non-power" user isn't going to right-click on the column headings in the results and try to add "path" as a column. So how does it help that user to deny everyone else the ability to add it?
Apple mocked IBM for needing a thick user manual back in the day. To suggest that anyone (especially anyone on this site) should have to read documentation to use perform a basic file search (in a GUI, no less) is apologism to the extreme.
In all fairness, you do need to hold down the command key to show the file location in Sequoia. It is an interesting default behavior to pretend the files location doesn't exist, mobile-centric.
No you don’t. In Finder search results, the path is always shown at the bottom. For regular Finder windows, you can optionally show the path with “View -> Show Path Bar”
In all fairness, secret hotkey BS may as well not exist. Are you supposed to mash every modifier key and every combination thereof on every screen and in every menu, looking for hidden goodies?
we have simplified the interface to just one home button and the screen interface, as well as the volume up/volume down key.
To select, just press on the item.
To hover, press and hold for at least 2 seconds.
To get a list of options, press and hold for at least 2.5 seconds, but not more than 3.5 seconds.
To delete, press and hold for 3.6 seconds, but not longer than 3.9 seconds.
To save, press and hold for 4.1 seconds. Pressing and holding for exactly 4.0 seconds activates the archive action. Pressing and holding for 4.2 or more seconds sends the item to the blocked list.
To retrieve the list of items in the blocked list, press and hold and simultaneously press the volume up and volume down key.
To delete all items in the block list, press and hold and simultaneously press the volume up key only.
To completely reset your device, press and hold and simultaneously press the volume down key only, whilst holding the device in a completely vertical plane, and rotating clock-wise and counter-clockwise, smoothly, at precise 2.35619 radians every 30 seconds.
To trigger the emergency call feature, drop the device at an acceleration of no less than 9.6m/s and no more than 9.7m/s
No: you are supposed to read the documentation to learn about power user features. Microsoft also doesn’t shove the advanced keyboard shortcuts in your face; you need to read the manual to learn stuff like this.
Is it, though? Most people don’t really have a notion of the file system, or hierarchical file structures. They drop files onto their desktop, or keep them in the downloads folder. Just ask a parent or your next-door neighbour.
That’s a bit of a problem when discussing problems of normal users with power users, because they don’t even realise how what they’re doing is actually not what normies do.
I’m inclined to agree that hotkeys in MacOS are hard to discover, but cluttering the interface with stuff many users simply do not need cannot be the correct answer.
Spotlight is unbelievable bad, especially on iOS. If I type a substring of the name of an installed app, it should find it effectively instantly (say, within 1-2 frames of the input showing up). Instead, it finds it sometimes. On occasion I need to hit backspace (removing a letter that should match) to get it to find it.
I struggle to imagine the software design that works so poorly.
I've yet to find a decent implementation of search-as-you-type anywhere, not just Spotlight. I have that same issue on Firefox, and with Windows Search, for example.
And it makes no sense whatsoever. If "foo" matches "foobar", so should "foob". I honestly don't know how the hell can they still f up such a simple piece of technology in 2025.
Windows 7 start menu search was always reliable and had predictable behavior from my experience. It can be done, just that modern software engineers' skills and career incentives no longer permit it.
Wow, I feel like I almost could have written this except I prefer Plasma/KDE to GNOME. I use Linux + Mac laptops somewhat interchangeably since 2012, and have also seen the marked decline in quality. In fact, it seems like Linux has gotten better at almost the same pace (or maybe a bit faster) than macOS has gotten worse.
The things that most frustrate me about Macs is that they've violated the never spoken but always expected "it just works" in so many ways. Things like how Thunderbolt Displays containing a USB hub which are Apple-certified handle re-connection to a Macbook, should "just work", but require fiddling every time. That's just one of numerous examples I could come up with.
Apple historically was probably the best company in the world in understanding the full depth of what "User Experience" means, and it seems like they've really retreated from this position and are regressing to the mean.
I've been using Macs since Mac OS 9, and Snow Leopard was indeed very good. It remains my favorite version of Mac OS. I actually think it was Snow Leopard that started the rush of developers to Mac as _the_ platform to use.
People don't want animojis, and they don't want other trite new features that only seem to exist because Apple feels it needs to demo something new every year.
What they want is something that just works without annoyances, distractions, failures, or complications.
Give them that and they'll break down the doors trying to get their hands on it, because it's so far from how most tech works today.
Animojis really feel like peak corporate board asking, "What do the kids like these days?" and dumping that shit into the world. Honestly ... AVERAGE age of the Apple board is 68!! This is a company that's reached some sort of corporate red giant stage where it's influence is massive but it's ability to grow is over and it's only real purpose is to generate heavy metals and seed them throughout the rest of the universe after it's eventual explosive death.
Why would it be bad business for Apple? Their business model is based on selling a holistic ecosystem. They don’t have any need to chase new features and there steady stream of high margin hardware revenue is at stake.
Spotlight straight up broke on both of my Macs after Sequoia. It can't even find exact matches in many directories marked for indexing and re-indexing did nothing. Just searching for apps under Applications doesn't seem to find all apps.
I’ve had so many issues with it as well! To the absurd level where I could not search for settings in the Settings app… People all over the net have had all kinds of issues and there’s never been any help other than „oh go and reindex”.
iOS has this problem as well. You search for a setting in the Settings app. It’ll say “doesn’t exist” (or whatever) while it’s looking for something extremely obvious (like “software update”) instead of just showing a processing icon.
Then when it does show the results, they’re usually in some terribly unhelpful order. It took me ages to try and go through the CUJ of “this app isn’t sending me notifications because I turned them off now I want them back on”
Just yesterday I was trying to find a file in Finder, using the search, and it could not find it even though I was just one directory up from the directory it was sitting in. It made no sense to me at all. Reading these stories, it’s clicking for me.
I gave up on it because of this and installed Raycast which seems a lot more reliable. I used Spotlight effectively as my launcher for apps/settings, and have the Dock completely hidden and Spotlight set to hide everything else. But when it can't even do that consistently, I have no idea how!
I can't believe I'm saying it, but I agree with you about GNOME being my forever desktop. I used to really make fun of GNOME during the 2->3 transition, which seemed so profoundly misguided, but now I love it. I don't know if they've massively improved it or if my perspective has just changed with time.
Unified button that disguises as two different icons hiding other useful options
You can only cycle windows in one direction even if you try to do some remapping
Choosing keyboard languages hides a lot of options. Once you understand you need to click on English US to see more detailed options then you get them all, UK, Canadian...
Then it's unclear which keyboard layout is currently selected and how to select one from the list you made.
I can't fathom how a DE whose is all about human machine interface guidelines whatever and supposed to be the epitome of UX can't figure out basic stuff about discoverability and clarity
True ! So why can I only remap cycling window in one direction and not the other ... ?
The volume and power icons on the top right is actually one button and hides other option like screen lightning volume and wifi etc.
If at list they had made a three vertical dots/stacked bars and is the convention for hamburger menus...
From what I heard GNOME devs do not like change and it sucks to be a GNOME extension developer, a quick google search seems to confirm that so it casts some doubt about them up-streaming any of them but maybe you know better. Has it ever happened to other extensions ?
Haven't really used MacOS or iOS more that five minutes so I can only trust you on that.
On the other hand for example, it is very easy to remap CapsLock to Escape on MacOs. Just go to Setting --> Keyboard and you easily find the option. GNOME ? No, not in settings. Wait I have to use an app called gnome-tweak ? Ok it's in "Advanced keyboard otions" --> Opens a big list of loosely classified options. Oh well it was in miscellaneous category.
I can believe that its easy to bounce off software because of a million paper cuts. But the problem with them trying to address every one of those proactively is that GNOME is a huge undertaking and they do their best to move at a fairly slow pace (now, after the 3 transition, which was akin to ripping a bandaid off ... go fast, piss the person off, but then the bandaid is gone).
I don't know if the CapsLock -> Escape switch is on a roadmap somewhere, but that is a little bananas. That said, my partner comfortably uses GNOME every day to browse the web and manage some files. Has she EVER wondered how to remap CapsLock? No. The people who do want to? Google can give you the answer pretty quickly. Not saying it's good UX, but GNOME balances a lot of use cases, and as this thread suggests, I think they've actually (with a LOT of complaining from engineers and power users) kept that balance pretty damn well to the point where I haven't been surprised by GNOME is a long time, and seems to slowly and progressively get better.
And yes, whoever jumps in here with their own papercut story, I know there is pain in not being the primary audience for software. But honestly, at least I'm in the same Venn diagram with my partner. The primary audience for macOS or iOS now appears to be ... I don't even know anymore. Used to be content creators, now it seems like even Apple doesn't actually know who uses their computers.
It's not just you, the early GNOME 3 releases sucked. It has seen a lot of gradual improvement over time. Of course there are reasonable alternatives also, such as Xfce, MATE or Cinnamon. (And these three 'alternative' desktops have also edged closer over time, sharing more and more of the underlying tech stack even as GNOME itself has continued to develop in a rather seperate direction).
Did you know, you can set your wallpapers to be continuously updating and make macs use terabytes of your network in hours or days depending on speed?
https://discussions.apple.com/thread/255329956
My biggest annoyance with recent macOS versions that most QuickLook plugins stopped working. Apparently one could re-develop them with their new framework-of-the-day, but I have no doubt a lion's share of what I'm using will just become abandonware.
At one point a few years ago, Spotlight improved enough that I could use it instead of relying on Alfred. So I deleted Alfred, and whaddaya know...a few years later Spotlight got worse and worse, making me regret that move.
I worked at Apple Retail during the Snow Leopard launch. I think I still have a boxed disk somewhere, too. I remember it was not a product I had to sell to customers. People came in asking for it.
Another highlight of that job was selling a green iPod Nano to "John Locke" from LOST
Interesting that you think that of 7.5.3 — it worked, sure, but it could be painfully slow. System 6 was preferable as an OS — MultiFinder was better than 7, at least in the first couple iterations — but much of the software I needed demanded 7. 7.6.x was the first bright spot since 7.1 fixed much of what went wrong in 7.0, & there was a ton of waiting after that. 9 just chugged along for me, for the most part, which was nice.
Loved Snow Leopard too, & was shocked by how bad Lion was in comparison. Glad they got back on track after that.
You were right I forgot about 7.6.1. I think I had a an WiredInc mpeg video card server based on System 7.5.3 for a project. So it had a particular memory burn. I suppose I need up using System 9 since all life forms were supported by carbon.
Just noticed that I was sharing my entire Safari, Spotlight and Siri search history in that menu. Why is that setting in Spotlight settings and not under Privacy/Analytics?
I've found that kind of thing is often caused by a damaged preferences file. The easy way to check that is to make another user account, and see if it happens there too.
> As someone who's been a Mac user since System 6 and has been consistently using Macs alongside PCs _daily_ for over 20 years I can say that Apple's software quality (either in terms of polish or just plain QA) has steadily decreased.
Similar for me but started in system 7.
It’s lucky for Apple that Windows has got worse faster.
Yeah I stopped using spotlight a few years ago. I didn't really notice that I stopped using it until recently. It just became useless. I reorganised my stuff carefully so I know where I put it. I think that turned out to be more powerful than hoping a search engine over the top would be able to sift through the nuances.
... I'm not sure I've ever found any GUI system search reliable enough that I've used it on-purpose (though accidentally, sometimes) on Windows, Linux, or Mac. I always just use "find" and "grep" (and ripgrep when I remember that exists and realize I just grepped a lot and will be waiting for like an hour if I don't re-run the command with rg instead). Or nothing on Windows, which is fine because I haven't used Windows for anything but launching video games in about two decades.
I've only started using gui search since using fedora. the tracker search in the activities view is fast and finds files in the home folder by name pretty well. The only shame is that the pdf-content search doesn't work in the main search interface but only when searching in the file manager.
Windows 11 LTSC one is quite good because it's so damn stupid. You can indeed hit start then just type what you want. Only does files though, by name, which is fine.
Windows 11 is beyond pale. It’s infuriatingly bad. But it could be a benefit if you do a bit of manual organizing and ignore most of its dumb features. Only use it for work, I will never use it at home.
Spotlight was bad back in the day, so I installed Alfred and started using that. Then Spotlight suddenly improved a lot, enough that it was usable for me, and I deleted Alfred. Then about five years ago something happened internally at Apple to the Spotlight team and it just got worse and worse and more difficult to use, making me regret deleting Alfred.
I wish Apple would just fix Spotlight. They don't seem to think it's worth fixing.
That is a good question. I like my dock uncluttered. I have it placed vertically on the left side, with only the apps I use every single day: Alacritty, Brave, Cursor, and Zoom. With Finder and Launchpad included, that's only six docked apps. Everything else I use Spotlight to open, so I feel the pain when the usability gets degraded or buggy.
Does not work. I happen to know a fair bit about mdutil and the like and confirmed that does exactly nothing for my particular issue. A full Spotlight index reset works temporarily, but after a while it just conks out again.
Also, I vaguely remember there being a way to _order_ results, not just disable them.
I have a vaguely related, kind of interesting story related to search indexes, but on windows instead of mac.
My C drive was was super full for some reason I couldn't understand, and Explorer couldn't tell me where the data was. There was about 100GB just unaccounted for.
I do love Gnome. If only we had hardware to run it :/.
I'm stuck on a MBP because it's the only laptop with a great screen, speakers, and battery life. Meanwhile my keyboard keys keep getting stuck after a year of usage, and OSX is garbage. Soon as there is similar hardware I can load Linux on, I'll be insta-switching.
AMD AI Max 395 (superb name) proved that x86 can get to Apple silicon perf. (even not that far in power efficiency), but there seems to be 0 devices from non-trash brands (I am not buying ASUS or HP).
I would love to finally get out of Apple ecosystem, I just don't have any decent alternatives right now. Hopefully next year.
I'm going to purchase a framework just because I value repairability. And honestly, before the m1 macbook I was using a t480s, and I'm okay with compromising on hardware, esp. having been burned with the 2016 butterfly macbook. Apart from the haptic touchpad I wouldn't miss much, other makers are finally ditching low resolution 16:9 screens and you can even find nice oleds. I'm mostly missing the polished software that's only available on macos (things like carbon copy cloner or pixelmator). But with my m1 having degraded battery and having to send it off for a week or two to the nearest service center just to get a new battery, the prospect of a repairable laptop like framework where I can just order a new battery and replace it myself is looking all the more enticing.
I personally think that it is reasonable to "want" an Apple notebook. They have great hardware, great battery life and an ecosystem where every device integrates. Only on macOS you can nicely develop software for iOS. Furthermore most vendors release software for macOS, while they don't for Linux (not only Adobe). BTW the apps I miss most on Linux is the Preview App and Apple Mail.
However I'm done with Apple. I think it's a decision - not "reasoning". That decision takes time and is painful. It's also a decision specifically against "the best" ecosystem available in favor of something "ok".
Not only they repeatedly disappointed my expectations - they just suck as a company (in my opinion). It's not about being less innovative for decreasing software quality, they have done so much for the market, that I think GNOME wouldn't even exist as it is without them... Its about sealing off every inch of their software and hardware they can. No repair without paying... Making RAM and SSD upgrades ridiculously expensive, you cannot even put default NVMe drives into a mac mini - everything is proprietary. Even their sensors have serial numbers to prevent hibernating if you change them out without "hacking" the firmware.
Hardware-wise I have high hopes for framework working with AMD - although they did not address the issues I'd suggest (speakers, lpcamm2), they're constantly improving without breaking their promises. This is hopefully not going to change when they get bigger.
OS-wise I'll stay on Linux. After a long journey going from Ubuntu to Debian to Fedora using GNOME, KDE and even NixOS with Hyprland for a short period, I gained enough knowledge required to really enjoy Linux. System76 is working on COSMIC, which could be pretty amazing, once it is released.
In case anyone would like to try my current Linux config, I'm constantly working on an "install everything" script (pretty early stage):
Yeah... probably. I forgot to mention that Apple computers are a pretty good deal if you are looking for an AI / LLM experimentation machine due to unified RAM which nearly translates 1:1 into VRAM.
What is wrong with an AMD Ryzen 9 with 16 physical cores? If you need more and you have a virtually unlimited budget, then Ryzen Threadripper is even better. Also: Is Asahi Linux an option for you?
Indeed. I complained that Apple design gets a free pass while being haunted by Steve from beyond the grave for a decade. Your comments resemble my habits except rusted sway right into cosmic desktop alpha and done.
> As someone who's been a Mac user since System 6 and has been consistently using Macs alongside PCs _daily_ for over 20 years I can say that Apple's software quality (either in terms of polish or just plain QA) has steadily decreased.
And now spotlight defaults to the whole computer even when I start a search within a folder... for items in the folder... Turned to garbage sometime in the last ~18-24 months.
Apple (at least current leadership) is programmatically degrading it's products so people would buy a new one. Who expects anything good from such a team.
I keep being tempted to write same post but named "Does all software work like shit now?", because I swear, this is not just Apple. Software in general feels more bugged as a new norm.
Most websites have an element that won't load on the first try, or a button that sometimes needs to be clicked twice because the first click did nothing.
Amazon shopping app needs two clicks every now and then, because the first one didn't do what it was supposed to do. Since 3+ years ago at least.
Spotify randomly stops syncing play status with its TV app. Been true for at least a year.
HBO app has subtitles for one of my shows out of sync and it has been for more than a year.
Games including AAA titles need few months post-release fixing before they stabilize and stop having things jerk themselves into the sky or something.
My robot vacuum app just hangs up forever once in a while and needs to be killed to work again, takes 10+ seconds after start to begin responding to taps, and it has been like that for over 2 years of owning the device.
Safari has had a bug when opening a new tab and typing "search term" too quickly, it opens URL http://search%20term instead of doing a Google search. 8 years ago I've opened a bug for that which was closed as a duplicate, and just recently experienced this bug again.
It really seems that criteria for "ready for production" is way lower now. If my first job 13+ years ago any QA noticed any of that above, the next version wouldn't be out until it is fixed. Today, if "Refresh" button or restarting the app fixes it, approved, green light, release it.
Something I found annoying at a previous big-tech work, was how the focus on top-level metrics (read, revenue-linked metrics) meant we couldn't fix things.
There were a lot of smart people, very interested in fixing things— not only because engineers tend to like fixing things, but also because we, and everyone around us, were users too.
For example, many things related to text input were broken on the site. Korean was apparently quite unusable. I wanted to fix it. A Korean manager in a core web team wanted to fix it. But we couldn't because the incentive structures dictated we should focus on other things.
It was only after a couple years, and developing a metric that linked text-input work with top-level (read, revenue-linked) metrics, that we were able to work on fixing these issues.
I find a lot of value in the effort to make incentives objective, but at a company that was already worth half a trillion dollars at the time, I just always felt there could be more room for caring about users and the product beyond the effects on the bottom-line.
This is exactly the problem. Hyper efficient (or at least trying to be) businesses have no room for craftsmanship. If you take the time to make quality software, you’ll be left behind by someone who doesn’t. Unfortunately the market doesn’t care, and therefore efficient businesses don’t either.
The only solution I know of is to have a business that’s small enough and controlled by internal forces (e.g. a founder who cares) to pay attention to craftsmanship.
You're implying that buggy software has no impact on the bottom line. I'm not so sure. Users weigh the availability of features against the quality of features. Getting bugs fixed is not necessarly the highest priority for users either. It's a trade-off.
Our use of Microsoft 365 is a pretty good example of that. I moved our company to Microsoft 365 because it had some features we wanted. Then I moved the company off Microsoft 365 because it turned out to be too buggy to be useful.
I realise that the actual users of software are not necessarily the same people making the purchasing decisions. But if productivity suffers and support costs rise then the consequences of choosing low quality software eventually filters through to purchasing decisions.
Even if buggy software has an impact on the buttom line, managers can continue pretending it doesn't and not allocate any budget to fix them. They assume bug fixes somehow will be squeezed in between the work they really value - new features or better completely new projects. Because creating something new (asking debelopers to create) is the easiest way for a manager to get a promotion. It was many years ago when I last seen a manager (with the power to set priorties and not just translate them form above) who pays more than a lip service to quality and cares about maintenance.
> You're implying that buggy software has no impact on the bottom line. I'm not so sure. Users weigh the availability of features against the quality of features.
The problem is that managers / those that determine priorities don't get the numbers, they don't see a measurable impact of buggy software. There's only two signals for that, one is error reporters - which depend on an error being generated, that is, software bug - and the other is user reporting, but only a small fraction of users will actually bother to make reports.
I think this is a benefit of open source software, as developers are more likely to provide feedback. But even then you have some software packages that are so complex and convoluted that bugs emerge as combinations of many different factors (I'm thinking of VS Code with its plugins as an example) that the bug report itself is a huge effort.
>The problem is that managers / those that determine priorities don't get the numbers, they don't see a measurable impact of buggy software.
I don't believe that. IT departments have to support users. Users complain and request support. It costs money and it affects productivity and everybody knows it.
But that's not enough. You would also have to believe that there are significantly less buggy alternatives and that the difference justifies the cost of switching. For big companies that is an incredibly high bar.
But small companies do dump software providers like my company dumped Microsoft.
[Edit] Ah, I think I misunderstood. You're looking at it from the software provider's perspctive rather than the user organisation. Got it.
> You're implying that buggy software has no impact on the bottom line. I'm not so sure.
The problem is that very little competition exists for computer operating systems. Apple, Google, and Microsoft collectively control nearly all of the consumer OS market share on both desktop and mobile. Thus, macOS just needs to be "better than Windows", and iOS just needs to be "better than Android".
> Then I moved the company off Microsoft 365 because it turned out to be too buggy to be useful.
What did you move to?
In general, Microsoft 365 is extremely successful, despite any bugs. There doesn't appear to be any imminent danger of financial failure.
Software vendors also face tradeoffs, engineering hours spent on fixing bugs vs. writing new features. From a bean counter's perspective, they can often live with the bugs.
> You're implying that buggy software has no impact on the bottom line.
I'm not implying that, and I don't think my manager was implying that either. I think rather there were 2 things going on:
1. It's often hard to connect bug-fixing to metrics.
A specific feature change can easily be linked with an increase in sales, or an increase in usage. It's much harder to measure the impact of a bugfix. How can you measure how many people are _not_ churning thanks to a change you pushed? How can you claim an increase in sales is due to a bugfix?
In your case, I'm sure some team at Microsoft has a dashboard that was updated the minute you used one of these features you bought Microsoft 365 for. How could you build something similar for a bugfix?
Bugfixes don't tend make the line go up quickly. If they make the line go up it often is a slow increase of regained users that's hard to attribute to the bugfixes alone. Usually you're trying to measure not an increase, but a "not decrease", which if possible is tricky at best. The impact is intuitively clear to anyone who uses the software, but hard to measure in a graph.
2. A ruthless prioritization of the most clearly impactful work.
I wouldn't have minded working on something less-clearly measurable which I nonetheless thought was important. But my manager does care though because their performance is an aggregate of all those measurable things the team has worked on. And their manager cares, and so on and so forth.
So at the end of the day, in broad strokes, unless the very top (which tends to be much more disconnected from triage and edge-cases) "doesn't mind" spending time on less measurable things like bugfixing, said bugfixing will be incentivized against.
I think we all know this impacts the bottom-line. Everyone knows people prefer to use software that is not buggy. But a combination of "knowing is not enough, you have to show it" and "don't work on what you know, you have to prioritize work on what is shown", makes for active disincentivizing of bug-fixing work.
such QA jobs no longer exists. Ever since the software dev world has moved to doing one's own QA during development, software has been consistently worse in quality. May be there's a correlation there!
The problem is Agile. Not the way it was intended at some point, but the way it has become through Agile consultants and SAFe. Also the fact that it's become the default for any project and that Waterfall has become a bad word.
Companies abuse Agile so they don't have to plan or think about stuff anymore. In the past decade, I haven't worked in (or seen) a single team that had had more than 2 weeks of work prepared and designed. This leads to something build 4 weeks ago needing a massive refactor, because we only just realized we would be building something conflicting.
That refactor never happens though, because it takes too much time, so we just find a way to slap the new feature on top of the old one. That then leads to a spaghetti mess and every small change introduces a ton of (un)expected issues.
Sometimes I wish we could just think about stuff for a couple of months with a team of designers before actually starting a multi-year project.
Of course, this way of working is great when you don't know what you'll be building, in an innovative start-up that might pivot 8 times before finding product-market fit. But that's not what many of us in big corp and gov are doing, yet we're using the same process.
This, 100%. Agile (properly done, for whatever value of “proper“ you choose) is fine for websites, apps, consumer facing stuff. For things that must work, in predictable fashion, for years, it’s often inappropriate.
OS work is somewhere in between, but definitely more towards the latter category.
The underlying cause of this is online software updates. Knowing you can fix bugs any time removes the release date as _the_ deadline for fixing all egregious bugs. And so the backlog of bugs keeps growing.
Depends where you look. There's been a QA process in all the (agile, some very forward-thinking) teams I've worked with for the last decade. That QA might be being done by other devs, but it's always been there.
You’re not wrong. I’ve assumed it’s a side effect of the way the industry deals with career advancement. If you’re an engineer or middle manager, you aren’t going to get a promotion or bonus if you say “we took feature X and made it more stable without introducing any new functionality”. The industry seems to favor adding new features regardless of quality so the teams that do it can stand out and make it look like they’re innovating. This isn’t how it has to be: if companies would recognize that better doesn’t necessarily mean “more stuff” or “change”, then people could get rewarded for improving quality of what already exists.
I think the financial cost of these bugs is pretty low and the cost to employ people to fix all of them is pretty high. Everywhere I’ve worked, there is a huge backlog of known issues that are agreed upon that we probably just won’t ever get to them. And we certainly aren’t going to hire new people to solve them. It’s probably because the systems we build are getting way overcomplex due to feature piling and promotion seeking complex projects to show off. If these bugs were trivial to solve, they wouldn’t exist. The fact is, these are pernicious bugs because of how complicated everything is.
I actually got penalized in my last performance review because something I shipped “wasn’t that technically complicated”. I was flabbergasted because I consider it my job to make things simpler, not harder to reason about. But you don’t get promotions for simple.
I remember software working really badly in the early 2000s, when Microsoft had an unassailable monopoly over everything. Then there were a bunch of changes: Windows started getting better with Windows 7, Firefox and then Chrome started being usable instead of IE, and Google and Apple products were generally a huge breath of fresh air.
Since then, Google and Apple products have become just as bad as Microsoft's. I think this is because the industry has moved towards an oligopoly where no one is really challenging the big players anymore, just like Microsoft in the late 1990s. The big companies compete with each other, but in oblique ways that go after revenue not users.
Few things manage to make me as angry as a link (even if shown in form of a button) which does not open in a new background tab when clicked with the MMB.
Preloading selected results in background tabs and then closing the main tab, so that I can iterate through the results of each clicked item per tab is simply so much more efficient than entering a page, hitting back, entering the next, hitting back, ...
With regards to Google Flights, I seem to recall that there was some European Digital Markets Act occurrence. Google decided to comply with it in a malicious fashion.
It's hard to argue that systemd isn't a part of modern Linux robustness! It's not the only way it could have been done, but the more declarative model is absolutely better than shell script exit codes. Daemons don't have to worry about double-fork. User-level services are incredibly valuable.
>Safari has had a bug when opening a new tab and typing "search term" too quickly, it opens URL http://search%20term instead of doing a Google search. 8 years ago I've opened a bug for that which was closed as a duplicate, and just recently experienced this bug again.
While webkit might have some much needed improvements in the past few years, it is still the behind Blink and Gecko. Safari, the browser itself. Has been awful for the past 10 years. At least on Desktop. And some of these are not issue with Webkit because other webkit browser does it better.
The Address bar is far the worst compared to Chrome ( OmniBar ) and Firefox ( I believe it used to be call Awesomebar ). I have experience the same bug you mentioned and I believe I filed it way earlier.
Opening Bookmarks with too many items continue to pause and jank for 11 years now.
Tab Overview continue to re-render all the tabs. Causing Paging and Kernel_Task CPU spike. My Kernel_Task is currently 80TB at 240 days uptime. That is 333GB of write per day. Simply killing the SSD.
And no Tab Sleeping.
Apple just doesn't give a fuck any more about their software.
My gripe is that iCloud Tabs haven’t worked right for years. Everything else that syncs in Safari works perfectly fine: tab groups, bookmarks, reading list. But iCloud Tabs, the feature that shows what you have open on other devices, is always either empty or showing things I had open literally months ago.
It works for me. But randomly not work. And I have seen that iCloud Tabs issue before. I think it was logging out and logging back in would fix it. But this will cause another issue but I cant remember what it was.
Basically the whole thing with Sync is very fickle.
On another note, Safari somehow doesn't work well when you have over 128 Tabs.
Every once in a while I think „There is no public bugtracker for closed source software — wouldn’t it be great to have something like Github issues, but for all the software that is developed behind closed doors?“
Like, at least we had a central place to vent about the exact same stuff you just listed, and who knows, in the best case, at least some companies might feel shamed into picking up issues with the most upvotes or see it as a chance to engage with their userbase more directly.
Or I‘m naïve and the most likely outcome is getting sued?
I think the risk is that unless people think that reporting a bug there might actually cause it to be fixed, few will bother to report bugs and you'll end up with mostly people just venting, thus perpetuating the cycle.
> "Does all software work like shit now?", because I swear, this is not just Apple. Software in general feels more bugged as a new norm.
I think this is just the result of an optimizing game placing profit above all else (including quality and user satisfaction) which is indeed the norm in this late stage of capitalism. You want to opt out of that? Good thing the GPL opened the way placing human freedoms front and center, and not-for-profit software stacks like KDE (for instance) keep getting better and better over time.
I use commercial OSes at work by obligation, and the turning point from which my experience as a user became better served by free software happened many years ago.
Don't get me started on Google Home. It was working good-ish for years. Lately it started to respond with "sorry, I didn't understand" no matter what I asked, happily doing it the 2nd time I asked. It became unreliable which is ironic because I can build this tool by myself now in a 24h hackathon using basic openai/anthropic apis..
Maybe we should introduce Mean Time Between Annoyance (MTBA).
Many of my appliances (dish washer, coffee maker, …) work just fine for weeks before an annoyance pops up („deep clean“, for example). Many of my applications do not. For most I could measure MBTA in minutes. Definitely with Spotlight.
I mean, just to consider TV alone (thankfully I do not use one), it takes a while for it to start up, and we are talking about a modern, new TV. Old TVs started immediately. I told my grandma to press the button and wait a bit, before trying to press the button again.
Am I the only one only who is satisfied with Mac OS X? I use Windows from time to time and as far as I can tell it is much worse when it comes to random updates and UI quirkiness.
Mac OS X is fine, that would be snow leopard for example =)
macOS on the other hand, is getting worse, I can definitely concur that spotlight is getting more and more useless. Time Machine as well. It mostly doesn’t work for me, always breaking, hanging…
You can be happy until you're hitting a bug that severely impedes your workflow.
And then you might feel annoyed when they refuse to fix it for years, and there's no recourse because it's closed software.
Generally I am pretty happy with macOS and I still believe it to be the best option for a desktop. Where I'm getting frustrated is the increase locked down nature of the OS. I get that it's for security, and that's fine for my dad, but it's starting to get in the way of me down my work.
So when you already start feeling like the operation system is preventing you from doing the things you need to do, then all the small cosmetic flaws seems more in your face.
I'm done with macOS, I've migrated to Linux for my general purpose computing. With every new release of macOS, Gatekeeper is becoming harder and harder to bypass, increasing Apple's control over what software can be run on macOS, forcing apps to be signed with an Apple Developer ID. While I'm happy they are taking security seriously, I'm seriously creeped out that macOS sends hashes of every executable I run to their cloud. It's starting to feel like a broader move away from the openness of personal computing and towards a more controlled, appliance-like software experience.
When Sequoia eliminated the ability to override Gatekeeper by control-clicking, it became clear to me that Apple is now employing a frog boiling strategy towards their ultimate goal -- more control of the software you can run on their hardware.
My group makes a custom executable to reflash a hardware device we produce. We build it for Linux and Darwin.
Trying to get the program to work with our Mac users has become harder and harder. These are all internal developers.
Enabling developer mode and allowing Terminal execution isn't enough. Disabling the quarantine bit works - sometimes - but now we're getting automated nastygrams from corporate IT threatening to kick the laptops off the network. I'm exhausted. The emergency workaround, which I tell nobody about, is way less secure than if they just let us run our own software on our own computer.
I once really urgently needed `nmap` to do some production debugging ASAP. Unfortunately, the security tools would flag this immediately on my machine, as I knew this from previous experiments. Solution - compile my own binary from sources, then quickly rename it. I assume that this "workaround" was totally fine for sec department. At least production got fixed and money kept flowing.
> At least production got fixed and money kept flowing.
You were denied the tools to get your job done. You've put yourself at risk by applying an unapproved workaround.
Never ever do this (unless you hold substantial shares). Let the company's bottom line take the fall. If that's the only thing they care about, that's your only way to make the problem visible.
Unfortunately the real world isn't black and white. Yes, according to the company policies, I should watch the world burn and do nothing, while looking at the company bleeding money due to customers SLA being broken. Of course, after submitting a ticket to get nmap approved, which takes days. Extra points if I'm on oncall, then racking that sweet incident money is great.
But the underlying SRE culture here is that, if you know what you are doing and have a functioning brain of a responsible person, you'd be forgiven a jump over the fence, if it means putting out a fire on the other side of it. We aren't kids.
There’s a middle ground. Get the appropriate stakeholders involved in the decision, including security. Let security be the ones to keep the system down, if it cones to that. Or, let the business operations folks make the decision to go over security’s head. Either way, this is not something an engineer tasked with fixing an outage should be making the decision on.
Engineers _should_ have leeway in how they resolve issues. As I read, though, you have a company policy which explicitly disallows the action you needed to take to fix the problem (if I misread, my apologies). Getting the stakeholders involved is the responsible thing to do when policies need to be broken.
Ideally, the way this kind of situation gets handled should be documented as part of a break-glass policy, so there’s no ambiguity. If that’s not the case, though, the business should get to decide, alongside the policy maker (e.g.: security), whether that policy should be broken as part of an emergency fix, and how to remediate the policy drift after the crisis.
If you’re all tight enough that you’re allowed to make these kinds of decisions in the heat of the moment, that’s great, but it should be agreed upon, and documented, beforehand.
Well I found out the hard way that company culture or values can mean nothing if you don't CYA. Granted, the shop was small enough that our team was in charge of both the security policies and ops, but still, on one unfortunate occasion I've stepped outside my area of responsibility to "do what's right" and got punished. The next time I've been in a similar situation - well, I've walked away from the fire and grabbed the popcorn.
By the way, I'm still burnt out. This work is stressful. Don't let it take away what's already scarce for you.
xattr -cr <file> should clear the "download" extended attribute and make it as if the software was compiled on the machine itself, bypassing the ever-so-annoying Gatekeeper.
For binary patching: codesign --force --deep -s - <file> (no developer ID required, "ad-hoc signing" is just updating a few hashes here and there). Note that you should otherwise not use codesign as it is the job of the linker to do it.
Very aware of the attributes, unfortunately these machines are on a global corporate network so there are layers and layers of monitoring software to prevent internal and external attacks. Changing perm bits on an OSX executable is instantly noted and sent upwards as a possible security breach.
Last time we did this I had to spend a week explaining to management that Macs could actually run software other than PowerPoint and it was necessary for our job.
The local workaround that we use is to just spin up a Linux VM and program devices from there. The less legal workaround is using WebUSB and I'm afraid to even tell the necessary people how I did it, because it's sitting out on a public-facing server.
...and there's an apple developer support person, Quinn, who appears to be heavily if not solely dedicated to helping developers do binary signing/notarization/stapling correctly.
Quinn also has their email address in their sig so people can just reach out via email without even needing an Apple account, or if they prefer more confidentiality.
As someone who actually signs, notorizes and distributes desktop apps for macOS, I can safely say their documentation is less than ideal.
Maybe because I'm using Electron framework which makes things more complicated, but I don't really understand why there's is a difference between different types of certificates (Developer ID, Apple distribution, macOS distribution) and I had to guess which one to use everytime I set it up.
Also why is notorization a completely different process from code signing, and requires completely different set of credentials from it. Seems odd to me.
> Also why is notorization a completely different process from code signing
Because they do completely different things. Signing is a proof that you were the one to write and package that software; notarisation is an online security check for malware. If I recall, you still sign but do not notarise when distributing to the Mac App Store.
OMG, this. I was working on a tool to help integrate password managers on macOS and I got completely blocked by the notarizing requirements. Some things literally cannot be built for macOS as open source software, now.
I don't really think saying documentation exists says much when Apple is notorious for having documentation that's either borderline or downright useless. It's generally the norm that some random blog post from a decade ago is more useful than their documentation, and I say this from firsthand experience.
Can you sign and notarize your own software made for internal use with your own infrastructure? If so, then this is a valid response. If not, then this is an irrelevant response because the issue is going through Apple, not the process being difficult or undocumented. If I own the device, then I should be free to decide what the sources of authority over it are.
Edit: I haven't tested it yet, but it does seem that you can sign an executable with your own certificate (self-signed or internal CA-issued) however you can't notarize it. Right now, notarization is only required for certain kinds of Apple-issued developer certificates, but that may change in the future.
Anecdotally, I was not able to find any way to notarize software for internal use, without paying for a $99 developer account. Though I would have been willing to pay, I know that others who might want to build the software wouldn’t, so I abandoned my project. I suppose I could have maintained it as open source with the developer account required to build, but it seemed disingenuous to me at the time.
> I mean, come on.
Is that really necessary? Obviously there are enough people who did not know about, or find helpful, the resources you’re referring to, that we have people complaining on Hacker News. This isn’t exactly a novice’s forum. Perhaps the problem lies with visibility and accessibility of the support resources, rather than all of the people who have seen notarization as a hurdle to getting real work done.
btw, for those who don’t want to search, Quinn’s signature states:
“
Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"
I understand that you're doing it on principle, but for a software development team, 99$/year is a really minuscule price to pay to be able to build / notarise / distribute software.
Developers pay exorbitant amount of money for much lesser value, and the idea of putting your teammates at risk to stick it to apple is kind of sad bordering with negligence from a business POV.
The principle is what matters. The amount is not the issue. The issue is that there is a cost at all. "It's so cheap" is never an excuse for charging for something that should be free. In this case, running software you have no intent to charge for, on your computer. It's as if someone started charging $0.01/month for breathable air. "But $0.01 is trivial," would not excuse it.
It costs money, and isn't free, for a reason you're not acknowledging. I don't think it's a major profit center for Apple.
It's about setting a higher floor for malicious actors than "random botnet residential IP + a captcha solving service". It's about proving some semblance of identity through a card number and a transaction that goes through without a chargeback.
As the case upthread shows, there's plenty to dislike about a system that inhibits running code built for personal use. And it's obviously neither foolproof nor without collateral damage. Reasonable people can debate if it's worth it. But it still ought be acknowledged that the motivations are closer to the reason you have to identify yourself and pay a nominal fee to drive a vehicle on public roads.
I don't buy it. Or rather, I am willing to believe that some team at Apple has convinced itself that this makes sense, but they're wrong.
In particular, the security boundaries are nonsensical. The whole model of "notarization" is that the developer of some software has convinced Apple that the software as a whole (not a specific running instance) is worthy of doing a specific thing to the system as a whole.
But this is almost useless. Should Facebook be allowed to do various things that can violate privacy and steal data? What if the app has a valid reason to sometimes do those things?
Or, more egregiously, consider something like VSCode. I run it, and the fancy Apple sandbox helpfully asks me if I want to grant access to "Documents." The answer is really "no! -- I want to grant access to the specific folders that I want this workspace to access", but MacOS isn't even close to being able to understand that. So instead, one needs to grant permission, at which point, the user is completely pwned, as VSCode is wildly insecure.
So no, I really don't believe that MacOS's security model makes its users meaningfully more secure. At best, the code signing scheme has some value for attribution after an attack occurs, but most attacks seem to involve stolen credentials, and I bet a bunch just hijack validly-notarized-but-insecure software a la the VSCode example.
Notarization is not a trusted system on macOS - or rather, notarized binaries still have a "this was downloaded from the internet" prompt, and the user is meant to make a decision on whether it is trustworthy.
Notarization does some minimal checks, but is mostly about attaching a real identity so that maliciousness has at least some real-world consequences. The most obvious being that you lose the ability to get more apps notarized.
Actually the cost is not the issue (you are paying for it one way or the other), the issue is the authorization to do such an action on your (supposedly) own hardware.
Adding signing as a requirement can easily make what was once a very simple distribution mechanism into something much more complex - now you need manage signing certificates and keys to be able to build your thing.
In contrast to this point, as long as I use Xcode and do the same thing I've always done allowing it to manage provisioning and everything else, I don't have a problem. However, I want to use CI/CD. Have you seen what kind of access you have to give fastlane? It's pretty wild. And even after giving it the keys to the kingdom, it still didn't work. Integrating apple code signing with CI/CD is really hard, full of very strange error messages and incantations to make it "work".
I don't know about fastlane, since my CI/CD is just a shell script, and signing and notarising is as hard as (checking the script) running `codesign ...` followed by `notarytool submit ... --wait`
Yes, you need to put keys on the build server for the "Developer ID Application" (which is what you need to distribute apps outside of AppStore) signature to work.
You do not need to give any special access to anything else beyond that.
Anyway, it is indeed more difficult than cross-build for Darwin from linux and call it a day.
Do you distribute OSS software which requires notarizing? If so, have you found a way to let the community build the software without a paid developer account? I would be very interested in a solution which allows OSS development, relying on protected APIs without requiring that anyone who builds the app to have a paid developer account.
You seem to be comparing a single dev sending apps to the world vs a corporate team pushing to employees (if I get parent's case right).
In most cases, just involving account management makes the corporate case 10x more of a PITA. Doing things in a corporate environment is a different game altogether.
Code signing is absolutely disgusting practically and philosophically. It has very reasonable and good intent behind it, but the practical implementations cause great suffering and sadness both for developers (cert management, cost, tools) and end-users (freedom of computing).
The tool is built deep in our CI/CD chain. The whole thing is a house of cards built on a massive pile of tinder next to an open drum of kerosene. You want me to integrate XCode into that?
Last time I tried setting up an Apple developer license inside a large corporation, one that they paid for and not tied to me or my credit card, it was also a nightmare.
Who said anything about Xcode? The codesign tool is part of macOS, not Xcode. The CLI tool for notarization is bundled with Xcode, but you don't have to use it; they have an official REST API that you can use directly.
Sure it's trivial, but it is tacit acceptance that you need permission to make a program on their platform. Permission that needs to be renewed year over year. Permission to earn a living on this platform.
Permission that can be revoked for any reason, including being compelled by someone with more power than Apple.
I migrated to Linux about a year ago too. Not the smoothest experience ever (looking at you, ath11k with device-specific quirks) but so far I am delighted. Finally, I don't have to fight my computer to do things I expect it to do.
Unfortunately, I still have to deal with macOS for work due to corporate policies.
The main problem I had with living in a Gnome desktop environment, is with the keyboard. I'm not willing to abandon my use of Emacs control+meta sequences for cursor and editing movements everywhere in the GUI. On macOS, this works because the command (super/Win on Linux/Windows) key is used for common shortcuts and the control key is free for editing shortcuts.
I spent a day or so hacking around with kanata[0], which is a kernel level keyboard remapping tool, that lets you define keyboard mapping layers in a similar way you might with QMK firmware. When I press the 'super/win/cmd' it activates a layer which maps certain sequences to their control equivalents, so I can create tabs, close windows, copy and paste (and many more) like my macOS muscle memory wants to do. Other super key sequences (like Super-L for lock desktop or Super-Tab for window cycling) are unchanged. Furthermore, when I hit the control or meta/alt/option key, it activates a layer where Emacs editing keys are emulated using the Gnome equivalents. For example, C-a and C-e are mapped to home/end, etc.
The only problem is, this is not the behavior I want in terminals or in GNU/Emacs itself. So I installed a Gnome shell extension[1] that exports information about the active window state to a DBUS endpoint. That let me write a small python daemon (managed by a systemd user service) which wakes up whenever the active window changes. Based on this info, I send a message to the TCP server that kanata (also managed by a systemd user service) provides for remote control to switch to the appropriate layer.
After doing this, and tweaking my Gnome setup for another day or so, I am just as comfortable on my Linux machine as I was on my Mac. My main applications are Emacs, Firefox, Mattermost, Slack, ChatGPT, Discord, Kitty, and Steam. My Linux box was previously my Windows gaming box (don't get me started about frog boiling on Windows) and I'm amazed that I can play all my favorite titles (Manor Lords, Hell Let Loose, Foundation, Arma Reforager) on Linux with Proton.
Love this, and I'm in the same boat. Is your configuration of kanata public at all?
I know it's mostly muscle memory, but macOS shortcuts just seem sane and consistent and that has been one of the biggest frustrations when trying to switch. I found toshy[0] which does something similar - did you try that? The goal is purely macOS key remappings in Linux, so a much smaller scope than kanata.
I didn't try toshy, I had a bad experience when I tried kinto.sh a couple of years back, and I had a pretty clear idea of how I could get what I wanted out of a fully featured keyboard remapping tool under Linux. I initially started with Kmonad, but once I found Kanata, and realized that it had a TCP interface for programmatically changing layers, I quickly switched.
I have a Kinesis 360 keyboard, and my config[0] probably won't work for other keyboards, but it can give you a starting point for your own config.
I'm convinced a DE that figures this shit out out of the box will explode in popularity. Super for the OS and DE shortcuts. Ctrl for the Terminal and readline cursor movements. It can't be impossible to bake these in as defaults.
The hashes are completely anonymized and not that intrusive. I'd rather they do it that way and have a global view of possible malware attacks than the complete free-for-all that other platforms "enjoy".
But here's my (unpopular) take as a GNOME user and using Fedora immutable distros + flatpaks -- I suspect Linux is going to go in a broadly similar direction. Maybe not soon (even flatpaks aren't universally acclaimed), but sometime.
It doesn't matter whether it is anonymized. Apple has no business collecting information about what executables I am running on my own computer, or even whether I'm running executables at all. I don't care what their stated purpose is. I don't care what they want a "global view" of. It's my computer, not theirs.
I don't even mind that they've introduced a level on the totem pole that's above root. But on my computer, -I- should be the one at that level, not Apple.
to downvoters: you can think its not fair that apple effectively holds control of your device, true, but the only way you can change things is to not buy the products. If you buy it, you accept how it is. vote with your wallets, not in some internet forum
I think it depends on what distro you're talking about. Corporate distros like RHEL and SLES are absolutely going that way. It takes a lot of effort to backport fixes, and the money's not there in desktop Linux to make it worth their while if containerization is a viable alternative. Red Hat's gotten rid of a bunch of graphical applications for RHEL 10 and stated that users can get them from Flathub as an alternative. I believe there was some consternation when CentOS Stream 10 launched without even a packaged web browser and the advice was to install Firefox from Flathub (there's a lot of use cases where that breaks stuff), but it appears they've walked that back and started providing Firefox as a traditional package.
However, less corporate distros that mostly just ship built upstream software as-is since they don't have to support it for long periods (think Arch, Fedora, Void, etc) don't have that problem, so I expect we'll continue seeing them use traditional packages.
> I believe there was some consternation when CentOS Stream 10 launched without even a packaged web browser and the advice was to install Firefox from Flathub
Ubuntu does the exact same thing with their snap repository, the Firefox apt package from Ubuntu is fake. At least Flatpak is a community-led project unlike snap.
You can limit the file system permissions of the app, like giving only access to downloads, so that if/when there’s a sandbox leak you’re fine. You can also disable various things, like webcam or mic, this way.
In addition, you can get perpetual updates to the latest version of your browser even on old, stable distros like Debian.
Running a new browser on an old distro would be a strong reason for me (if I somehow couldn't update the distro - but I can and I do.)
Regarding security, the added work and complication outweighs the added security for me. I can't really disagree with having a different preference. More security on this wild internet is better, right?
IMO it's not much added work. In KDE you can navigate to settings and edit flatpak permissions, and flatpaks are available to download via discover. I haven't noticed any weirdness for firefox or chrome.
I understand and appreciate the sentiment, but I see the intent very differently. Apple is not employing a frog boiling strategy, but rather being responsive to an increasingly sophisticated adversary.
It’s like criticism of the quality of Google search dropping. It has absolutely tanked, but it’s not because the algorithm is worse, it’s because the internet has grown orders of magnitude and most of it uses the same hyper aggressive SEO optimisation, such that the signal to noise ratio is far worse than ever before.
You can also block specific subdomains, too. Useful when I want to be able to see finance.yahoo.com items in my search results, but nothing else from the yahoo.com domain.
That rationalization ignores a lot of confounding evidence, such as other search engines being able to deliver great results and adequately keep the SEO garbage out.
That’s kinda the SEO equivalent of security by obscurity though, right? SEO spam puts a lot less effort into optimizing for other search engines, whereas Google is dealing with being the primary target of every adversarial SEO spam site.
This is a great theory but it isn't the reason. Google management made a conscious decision about five years ago to prioritise profit over search quality.
We know this because the emails came out in discovery for one of the antitrust suits.
The biggest struggle is that the original Macintosh was so simple to manage. The original concept of system extensions to expand the capabilities and the file structure built on the hierarchy with the desktop as the top level was broken with the shift to Unix.
Suddenly the users file hierarchy started wherever the Home folder was located and it became an island of user controlled environment surrounded by complexity of computer operating systems.
The result I found overall well thought out but when the desktop became just a folder I felt the Mac moved from it’s simplicity embracing the complexity that was offered by windows.
Simplicity is fine for a hobby project. An operating system having zero concern for any kind of security is a non-starter today.
It's amazing the rose tinted glasses people have about the original Macintosh environment. It was insanely janky and (unless you were ruthlessly conservative) insanely unstable by today's standards. By version 10.5 (Leopard) the modern UNIX-based MacOS was unequivocally superior to Classic MacOS in every metric other than nostalgia.
I understand the trade offs and accept them. I was trying to point out where the split is and how it won't go back. I think the point of view expressed in your comment is s distorted as the ones your derding.
I also believe that the simplicity could have security as performant. The real advantage of the Unix layer is compatibility that the Macintosh was missing.
I sincerely tried to interpret what you meant here, but I failed. I understand the words, and the fragments of every sentence, but I wasn’t able to deduce the intent of your reply.
Are you trying to say that it’s possible for a system to be both simple and secure? Absolutely that’s the case, but with a trade-off — either it needs to restrict the user’s freedom, or be fully disconnected from the outside world.
I have pondering these ideas a long time and what is needed is an intense glossing over of all the details. The original Macintosh did exactly this and was called a toy and with 128k completely useless. Alternatively my unsophisticated Mom saw the Mac 128k demoed at the mall and went into a frenzy to get that tool. She wanted to publish documents.
The threats in the world are real and the internet doesn't help. I 100% agree that a network connection needs to be kept at a distance to make things simpler.
I think the power of language used to describe a system is where simplicity begins.
What I'm working on is creating a crisp line of delineation between "local" and "public" networks.
If by default after is on the "local" network auto-discovery is secure. If things are explicitly needed a user can publish them through physical manipulation to publish to the outside world.
The outside world can now be described using classic Users and Groups which is cultural easy to understand.
I'm trying to create an environment that focuses on making those 2 things plus a third element simple to understand and physically manipulatable.
The freedom I'm looking for is available on the "local" network. The "public" network is where our data is interchanged with the outside world based on our publishing. I don't expect people to interact with this layer much. I expect people to configure it for whatever institution/organization/government.
Most of the complexity I see in computing these days is market drive demand for eyeballs/clicks/...
Actively depleting the good-will they accumulated over the years definitely makes it worse. It's that harder to give the benefit of the doubt to a company also showing the middle finger to their Devs.
> Google
Giving priority to AdSense sites, fucking around with content lengths (famously penalising short stay sites), killing advanced search options. That's just thinking about it for 10s, but to me most of it is totally of Google's making.
Of course Google's algorithm is worse. Google prioritises showing you search results that make money for Google. Google has no incentive to show you anything else.
I can't believe I even have to say this out loud. Look up enshittification.
Frustrating thing is the earlier versions worked well, it protected you from accidental things but the way to force it was clear and obvious. Now bypass is obtuse and requires enough work arounds people advise just disabling it which is also bad to normalize.
Don't disable SIP, clear the downloaded/quarantine extended attribute instead. This clears all extended attributes: xattr -cr <file> and bypasses the obnoxious GK.
The difference: in Linux it is a usability issue to be fixed, whereas on macOS it is a feature and explicit design goal to make it that way. In general, I have found that things which are difficult on Linux are so because the problem is difficult, not because the people who make my computer have paternalistic attitudes about my usage of it.
> In general, I have found that things which are difficult on Linux are so because the problem is difficult [...].
Hard disagree. Audio mixing is not difficult[1]. The Linux kernel guys were right - it does not belong in the kernel. The userspace story however, has been a complete shitshow for decades. I think Pipewire mostly fixed that? Not sure, sometimes I still have to log out and back in to fix audio.
The funniest part? It's been working in the BSDs all along. I recommend reading the source of sndiod[1].
What's even worse? Probably systemd. I try not to hold a strong opinion - I tolerate it, the way I tolerate traffic when taking a walk. The technical issue however is several orders of magnitude simpler - again, the BSDs come to mind, but you can also write a complete PID1 program in about 20 lines of straightforward C[2]. I don't mind the commands being unfamiliar (they're already all different in almost every OS family); it's that the entire package is dreadfully large in scope, opaque, and I find it more difficult to interact with than anything else in this landscape.
I agree PulseAudio, Pipewire, ALSA, etc. are a pretty big shit show in Linux and have been for some time. From what I understand there are a few stories there with various levels of screw ups, but at no point was this situation the goal, and we are moving closer to an easy to use system that "just works" for these needs.
However, it's worth noting that audio experts doing high grade mixing in production are using these systems quite effectively and have been for a long time. It's similar to Blender in that regard with it always having the "guts" of doing great things, but only the experts that knew the correct spells to cast were able to use it effectively before the UI/UX was improved with 2.x and later I believe.
I work in live media production. I would never consider doing any mixing on Linux - just like I wouldn't consider putting Docker containers on a Mac to serve live HTTP traffic.
There are indeed always exceptions to generalizations, as you've pointed out. Though pulseaudio always trudged along fine for me (not like audio had always worked for me on other systems), and pipewire works perfectly.
So it’s purely ideological without any real world difference?
Are most people better off with Apple defaults?
And it’s not because the problem is “difficult”. It’s because for 20 years it has been claimed that this will be the “year of Linux on the Desktop” and it’s never been good enough for most people.
It’s perfectly fine. KDE and Gnome are both now more cohesive, more intuitive, and less buggy than either Windows or MacOS.
The problem with Linux is that, while it’s very good, it’s different.
Nobody actually cares how intuitive something is, at least not in absolute. People will still say Windows is intuitive. Pretty much nothing in Windows, from the registry to COM to IIS to setting/control panel/computer management, is intuitive. But they know how to use it and are used to that particular brand of buggy inconsistency.
Linux desktops have been high quality for a long time now. The reality is you, and others, measure quality as “how much is it like windows” or “how much of it is like macOS”. When that’s your metric, Linux will always come up short, just by definition.
If I pick up a Linux laptop right now, how well will it handle low latency audio? How well will it handle power management? My graphics hardware? Getting WiFi to work reliably is still an issue.
Can I get a high performance Linux laptop with good battery life, fast graphics, that runs cool and silent?
Yes, I just bought one a few months ago actually. A new lunar lake laptop. It gets 12 hours of battery life and has plenty performance for programming, plus 32 gigs of ram. It’s under 3 pounds and the screen is OLED.
And yes, everything works. On bleeding edge 2 month old hardware.
I even use thunderbolt 4 to connect my external displays and peripherals. Not only does it work, but it’s pleasant. KDE has a settings panel for thunderbolt. I can even change my monitor brightness in KDE settings. No OSD required!
But wait, there’s more! I’m running 2 1440p monitors at 240hz and the system never even hiccups.
But wait, there’s more more! The battery settings are really advanced so I can change the power profile, maximum charge, everything.
The only thing I’m unsure about in your comment is “low latency audio”. It seems low latency to me, but I’m not an audio engineer.
I can certainly get a Framework (Fedora and Ubuntu officially supported), throw my prefered Bluefin-Framework image in and get working
Battery life around 7 hours is the average I see reported, Fast/Silent will depend on the model, but I don't see the issue really
Upgradability and easeness of battery replacement are a plus
I just picked framework because they were first to come to mind, but I think Dell has a nice Linux story, Tuxedo also comes to mind
7 hours battery life is less than half of what I get on my MacBook Air. That wouldn’t last me on my ATL - HNL flight I took last year or my MCO - LHR 10 hour flight I’m taking this year.
These are the typical reviews I see around the Framework
> Getting WiFi to work reliably is still an issue.
This should not be an issue. I have hardware that varies a lot and I literally buy random wifi dongles for $1, $4, $5, Amazon, AliExpress, etc. and they have all just worked on first plugin. I can easily take my phone and tether it to my PC using USB-C and it appears in my Gnome network list and just starts using it for Internet.
> how well will it handle low latency audio
Pretty well you can use OBS to verify this. There are plenty of settings if you want to tune that.
> My graphics hardware?
Just ignore Nvidia and move on. Sure they might figure it out one day, I gave up a decade ago and I use Intel integrated or AMD dedicated for GPUs. Nvidia does "work" for most purposes but it will cause you a headache eventually and those are not worth $400 to me.
> How well will it handle power management?
I enjoy the basic controls that Gnome provides that give me a simple way to basically say "go all out" or "save some battery" etc. There are finer grain controls available and I have used commands in the past to push crappy hardware to it's limits before I chucked it (old Intel iGPUs)
> Can I get a high performance Linux laptop with good battery life, fast graphics, that runs cool and silent?
You can get ones that are specifically marketed for this purpose. Tuxedo is one that specializes in this and obviously System76 also do. These have a higher price point than a regular Dell system, which IMO is the better option in some ways. Dell sells more systems and has more users and it will "just work". They sold Linux systems for years and still do I believe.
Regarding "running silent" this is a gripe I have, not that it runs loud but some laptops have custom RGB crap and sometimes in Linux I don't have access to the extra functionality to customize my lighting or manually set my fans to ramp up etc. There are projects that aim to do this, but I have not looked into them beyond the most basic `fancontrol` built in command.
I think once you expand the scope to "most people" it might become impossible to say what the correct answer for that large of a group is. In the past their value add might have been more compelling and their feature lock not as draconian. It appears some people think that has changed over time.
That isn't the denotation of my post; I was not characterizing Linux as a whole, but only responding to your specific (unsound) analogy. It works better for me, for a number of reasons including that above. Perhaps it will work better for you, as well. :)
The second part of your post is incoherent to me, I can't tell what you're trying to say.
I don’t know if it’s just me, but i want more Gatekeeper, not less - help me stay safer. Or is it a security theatre? Malware producers can sign things just fine?
The more Gatekeeper, the more used people get to clicking OK without considering what it means. No amount of software can prevent the social engineering of an actual malware that tells the user to just click that OK button that they already have to do on a regular basis. Less is more here. It's why Windows tuned down their UAC after Vista.
It is not a consent prompt. You get a choice on whether to trash the binary or quit.
To run a non-motorized app requires you to open a separate app, navigate to the security section and select that you want to authorize the app to run.
Apple does not have any desire to make distribution of non-notarized binaries commercially viable.
And we've seen this change across all browsers. There no longer is a "continue" prompt for TLS issues. The result is, way fewer maintained sites go months with an expired certificate.
Predefined value on current macOS's Gatekeeper is "move to Bin" instead of OK. Other option is Done - which cancels opening action. If you want to bypass that, you need to go to system settings > privacy & security and manually allow particular app there.
> I am not suggesting Apple has fallen behind Windows or Android. Changing a setting on Windows 11 can often involve a journey through three or four different interface designs, artifacts of half-implemented changes dating back to the last century. Whenever I find myself stuck outside of Appleland, I am eager to return “home,” flaws and all.
Hard agree with this. I sometimes have to boot up a windows laptop to play Minecraft with the kiddo, and it never stops reminding me how little I know about Windows now, how counter-intuitive everything is, how everything feels designed for a user whose mind I cannot comprehend.
I was fully braced for Windows 11 being awful when I installed it recently but that hasn't been my experience at all. If anything it's just a slightly more polished Windows 10.
Probably helps that I installed the IoT LTSC version, but still, apart from the task bar being stupidly in the middle (thankfully there's an option to move it to the left), I've had zero issues.
I even added a network printer and it found it quickly, and added it quickly and successfully, which is a feat I don't think I've seen happen on any OS ever.
The context menu is a clear improvement on the old one (which you can still get to with one click).
As someone who stopped using windows about 7 years ago, and only recently used it last weekend, my eyes probably glossed over the fact that some buttons were laid out
horizontally.
Except for when the placement of the icon strip with the trashcan symbol changes to the bottom of the context menu because of the location of the context menu on the screen. Bonkers. No idea why the UI committee would’ve okayed that one.
Also, every time you run something in Windows (whether it's part of the OS or an App) it can be a trip down memory lane, UI-wise. Oooh, this dialog is 2015 vintage! This dialog is styled like Windows 8! This one is from the XP era! Ohh, and that rarified dialog has controls that have not been changed since Windows 95!
If you want a super bad audio-related journey, try fixing external speakers connected to a Linux box. It's abysmal, and 99% of it can only be done via the CLI. Nothing wrong with that... but for something so normal I expected more ease-of-use.
I disagree with that. As an occasional user of MacOS, the new Settings app is quite bewildering. There are just as many dials as in Windows and sometimes requires a trip to ChatGPT.
And for reasons I don't understand, why is the window itself not resizable?
I'm a Windows fan (I actually really like 11) so I'm a bit biased, but I just dove back into macOS since 2014 and the settings app is truly terrible. The built in search barely works and the layout is so damn confusing. God forbid I install some remote desktop software, now I have to go to accessibility settings 5 simes and approve some permission that is strategically buried for only what I can tell as a way to thwart "normies" from enabling something via obfuscation.
It would be fine if the settings available were actually useful or at least could bring me to some tool that does it better. I get no meaningful report of what's eating my batter and why every time I open my MacBook it's dead. And if I want to change the actual resolution of my display I'm given just a list of scaling options pretending to be resolutions. Oh, want to set a specific resolution or refresh rate? You have to do some stupid kinger king foo of option control something _before_ you click on this dialog. I get the criticism about the Windows settings app and legacy power tools (I think this has largely been solved anyway), at least they exist and allow me some iota of control over my computer
It is resizable vertically but not horizontally as it doesn't make sense to resize the window horizontally considering the content of settings details panel (the right part of the settings window), you would end up with a lot of empty space if you were able to resize it horizontally.
You could say the same thing about the Windows Settings app, but it resizes in every way and it's very much size adaptable. In other words, UI components resize or become visible/invisible depending on the width.
Search feels to me like a good compromise between memorizing terminal commands (including the correct set of parameters to do what you want) and navigating through a UI to find what you're looking for.
Search is fine as a one-off thing, but if you repeatedly have to use search to find some common setting, that's a clear UX fail.
To be fair, it's hard to say whether the Settings app is more broken in Windows or macOS these days. I think I'd have to give the crown to macOS here on account of search itself being more broken.
Why is it a UI fail? Honestly search as the default way of going to settings is my favorite development in modern OS design, I no longer need to memorize 3-6 deep menu trees to find a trivial setting.
For example:
I prefer keeping my hands on the keyboard, and typing cmd+space followed by mouse is so much faster than finding the right pixels to click through in menu trees when I want to adjust my mouse sensitivity.
I didn't say that search itself is a UX fail. It's not; it's great!
The UI fail is if search is required to find the setting every time you need it, because categorization and/or navigation is broken otherwise.
As to keeping your hands on the keyboard, that's an argument for having proper keyboard support in any UI, complete with hotkeys and shortcuts. The big difference between these and search is that the former is (if properly done), consistent and predictable. So e.g. when the app adds new things in the future, your existing key sequences will still do the same thing they did before.
To take your specific examples, if I do Cmd+Space, "mouse", Enter on my system, it will bring up LinearMouse, not system mouse settings.
Disagree with this. I use the search for everything. It’s just so much quicker than even a well designed UI.
On my iphone, I have one page of apps, everything else in the app drawer, and use the search all the time. It often gets what I want in one or two chars.
then Mac fails as hard as windows. there’s a reason search exists in the settings app on both MacOS and iOS. and there are plenty of settings that require “default write …” or editing some plist file or worse
I admit I honestly have no idea where the system settings are located as I haven't pressed the start button in ages, but the same applies to MacOS as I would use spotlight there as well.
Using search as a UI means you can only find things that you know exist, but there are plenty of important settings that I've only discovered by actually navigating through the UI.
The point of this comment thread is that important Windows settings are scattered throughout many different interfaces beyond just the Settings app, and you can never be sure where to find what you're looking for, which results in a poor user experience. Off the top of my head, you have the Settings app, Control Panel, Device Manager, System Configuration, and Network and Sharing Center.
I recently discovered that I can change audio settings on a mac by using the opt+volume shortcut and it takes me directly to the sound panel. Now if I could only make it stay on the built-in microphone instead of always switching to the worse sounding airpods one.
> You can access most settings by Windows + "yourquery".
The search doesn't even work all the time. Sometimes it won't do fuzzy search, sometimes typing "bluetooth settings" will do a Bing search, some other time it will open a PDF, and so on.
It's fine if you stay away from the consumer releases. Windows 11 LTSC (based on 24H2) feels like windows 7. Most of the stuff you had to futz with powertoys and GPOs back then. That hasn't changed. I quite like it. It has been utterly boring compared to my recent Apple experiences.
I... think so? Whichever one works with Microsoft Realms, which is the $2/month solution I settled on after somewhat-getting a self hosted server to run for a little bit on my desktop.
I figured that I make a six-figure salary as a software developer, I can afford $2/month so that I don't have to fucking become a sysadmin for a game server my child depends on.
There are two editions, Java and Bedrock. Java is the original, available on PC and Mac, and supports programming-like technical play and mods. Bedrock is Microsoft’s reimplementation, available on all devices except Mac, and supports emotes and microtransactions. Other than that they’re largely the same game, and buying either gives you both versions. Realms supports both, but a server is one or the other, not both. There are also other managed hosting providers for Minecraft (both versions), but Realms is probably easier and cheaper for you. Java version has performance problems, but mostly because Microsoft’s code is inefficient, there are a few mods (also written in Java) that everybody uses to fix performance without affecting gameplay.
Hey, if we're already complaining about Microsoft products, can someone explain why the Bedrock and Java versions of Minecraft have not been made cross-compatible in the TEN YEARS since the Mojang acquisition?
(... speaking as another dad just trying to play with my kid.)
What does cross compatible mean in this context? They are two different games written in two different languages. I mean, they look like they are the same game, but they are not. Making one compatible with the other is a Herculean task. If not impossible.
I'm talking about network compatibility, so that a Bedrock client can join a Java server and vice versa. It's clearly somewhat possible because GeyserMC[1] exists. It's just ridiculous that it's a third-party addon.
I’d imagine mostly due to a lack of incentive on microsoft’s part. Like minecraft is literally the biggest video game to ever exist with, making 2 entirely separate code bases work while keeping all the features the same and preserving compatibility with over a decades worth of mods just so the mostly separate java and bedrock communities can play with each other is just not worth the risk. So many people play minecraft in so many different ways means that making even minor changes in gameplay can be huge sources of controversy, let alone major infrastructure changes.
They still exist separately today because the modding scene is completely different for them. Minecraft Java is the original and has a huge modding community based on decompiling and patching the game. Those mods are all incompatible with Bedrock because Bedrock is a separate reimplementation of the game for performance or whatever.
Every article about some issue with Apple MUST also include an anecdote about how you couldn't use Windows one time and how it's still worse than Mac.
It's the rule lest someone think you made a bad decision and you're regretting it. Even though it's an OS targeted for your grandmother, you must not let them see weakness.
At this point it's a joke. Either critique Apple or admit you can't without also bringing up some other OS. It's weird.
Agree. Apple needs to clean up shop - MacOS has been egregiously worsening year over year. Some features like Universal Control and Continuity Camera are legitimately awesome, but they do not make up for the INSANELY slow System Settings app that gets harder to navigate with each release and which has >2s wait times for the right pane to respond to a change in the left pane. Steve Jobs would have fired the person responsible for that overhaul three years ago, it's embarrassing. Messages too needs a ground-up rewrite. Getting more elaborate emoji tapbacks doesn't make up for fundamental instability and poor syncing behavior. C'mon!
Absolutely. I love the work they have been doing on the backend, like PQ3 [1], but it just doesn't work for me when the Stickers and Emojis extensions on Mac leak several GBs of RAM and I have to terminate it several times a day to free up memory.
Another thing I dislike is that it stores the whole message history on the device. It's nice to have at times, but I send a lot of photos, which adds up in storage over time. I pay for iCloud, and store my messages there. Why does my Mac need to hold every single photo I have ever sent?
Local iMessage storage is debilitating. I have over 90GB of iMessage history that I don't want deleted. The keep messages for x days removes it from iCloud and the Mac though. Why?
System Settings is awful. Whoever decided to hide tons of settings inside innocuous "(i)" non-buttons should be kept far away from UX design. It's the hamburger menu of macOS.
> Getting more elaborate emoji tapbacks doesn't make up for fundamental instability and poor syncing behavior. C'mon!
Oh but you forgot about the “catch up” button they added 2 releases ago that takes you to the last unread message! …
… but only if said last message is within the N most recent messages, in the messages which are already “fetched” from local storage. If it’s more unread messages than that, the button is nowhere to be found.
Like they said “ok we can implement a catch up button but it’ll be hard to solve due to how we do paging.” “Ok we just won’t put the button on screen if we have to page then. Save the hard problem for the next release.” Then they just forgot about it.
One thing that has been slowly creeping in is a little bit of a Microsoft-like "you will use our feature", like launching apple music every time I hit headphone controls, or nagging me to turn on reactions every time I start a video call. In some ways that's more annoying than the outright bugs, as they could choose not to be that way and market themselves as not being that way.
I feel your pain. I hate pushy upsells and promos. Also the cluttered settings App "Remember to setup Apple Pay" promos. I do value user education. They need to consolidate all of the feature promo services into a revised Tips tool that allow users to engage with new features at their own pace.
As a former Apple employee that left in part due to declining software quality (back in 2015!), and the relentless focus on big flashy features for the next yearly release cycle, I could not agree more.
I recently had to do a full reinstall of macOS on my Mac Studio due to some intermittent networking issue that, for the life of me, I could not pin down. Post-reinstall, everything's fine.
I've explained in another thread how this kind of thing happens. It may be the same at other large companies.
Bugs come in (via Radar) and are routed to the team responsible. Ever since Jobs came back (and Apple became valuable again) it has also become very much top-down with the engineers, for better or worse, not calling the shots.
Just an obvious example — there are of course no engineers in the decision to make a "Snow Leopard" release or not. That is a "marketing" decision (well, probably Federighi). But further, even for an engineering team, they're probably not going to be able to make that decision even for their own component(s) either. Again, marketing.
So meetings are held and as it gets close to time to think about the NMOS (next major OS) the team is told what features they will implement. Do you think fix bugs is a feature? How about pay down technical debt? Nope, never.
Fixing bugs is just expected, like breathing I guess. And technical debt ... do what you can given your workload and deliverables. Trust me, many engineers (perhaps especially the older ones) want to both fix bugs and refactor code to get rid of technical debt. But there is simply not the cycles to do so.
And then what is even more insipid, the day the OS ships, every single bug in Radar still assigned to a team, still in Analyze, becomes a much much harder sell for the next OS. Because, you know, you already shipped with it ... must not be that bad.
I'd love to see a bug-fix-only Mac OS release. But I suspect that every time the possibility has come up, something like, I don't know, LLMs burst on the scene and there's a scramble.
> Ever since Jobs came back (and Apple became valuable again) it has also become very much top-down with the engineers, for better or worse, not calling the shots. Just an obvious example — there are of course no engineers in the decision to make a "Snow Leopard" release or not.
It's unclear how much explanatory value this has, because the Snow Leopard that everyone is pining for was during the Jobs era. After all, an Apple that goes bankrupt and out of business isn't going to make any software updates.
I find a stark difference between the Jobs era and the Cook era. Under Jobs, the early Mac OS X updates (Puma and Jaguar) came fast and furious, but then the schedule slowed considerably. Panther was 14 months, Tiger 18, Leopard 30 (delayed due to iPhone), Snow Leopard 22 months, Lion 23. Mountain Lion was the first release after the death of Jobs and came only 12 months after Lion. Thereafter, every Mac OS update came yearly, give or take a few months. That's a drastic change in release schedule.
Yeah, I should be careful to not make it appear as though there were so clear a delineation when Jobs returned. His software engineering team got to work reshaping MacOS (as we know it now) but he seemed to this software engineer to be focused on hardware and "strategies" initially.
Aqua, the new UI, came down from above soon enough. Drawers, toolbars were new UI elements that arrived. In time Jobs' designers were going through the shipping apps with these new UI elements with changes for the engineers to implement.
Certainly by the time the iPhone had arrived the transition to marketing (and design) calling the shots was complete.
Apropos Drawers: The may have looked a little bit silly back then but today almost every Mac app main windows has a big grey sidebar, so that in Exposé view almost all windows look the same. Drawers got an unfair rap, I think.
It's crazy that marketing hasn't worked out that quality and reliability can be spun as a feature. In fact, I remember with OS X, that was the baseline word-of-mouth feature when the comparison was made with Windows at the time.
> Just an obvious example — there are of course no engineers in the decision to make a "Snow Leopard" release or not. That is a "marketing" decision.
I think it is more that the decision to SAY Snow Leopard was a bug fix-only release was a marketing one. The reality is that release also sported things like 64-bit Intel ports of all apps, added Grand Central Dispatch (e.g. an entirely new code concurrency system) and included a from-scratch Finder rewrite.
I always saw these releases (I bundle Mountain Lion in) were all about trying to rein in excessively long release cycles. Short release cycles tend to not have enough time to introduce new bugs, while extended release cycles create a sense of urgency to get code in under the wire.
Now, release cycles have moved to be staged across a fairly predictable annual calendar. If there's an issue where features are getting pushed out 6 months or a year earlier than they should, that is a management and incentives problem.
I don't even know what these big flashy features are anymore. Every year I get asked by staff "Can I upgrade to <latest major Mac OS>" and every time I tell them they can, but they won't see anything different. There's not even big architectural changes under the hood to improve stability or performance.
Short of it being a requirement to use the latest version of Xcode (once they bump the minimum in the following Feburary), and security updates stopping, there's been very little reason to actually upgrade.
>As a former Apple employee that left in part due to declining software quality (back in 2015!), and the relentless focus on big flashy features for the next yearly release cycle, I could not agree more.
Oh Thank You so much. 2013 I was already questioning on some of the features it keeps adding that were useless. Yosemite with continuity was the only useful feature in the past 10 years.
Yes. relentless focus on big flashy features for the next yearly release cycle was exactly what I felt like it was. And that was the big reason why I dislike Craig Federighi.
Edit: Thinking about it more, former Apple employee that worked during 2005 - 2010 is probably a lot more prestige than post 2015.
- Ever since I've updated to the latest iOS 18, my watch complications(weather doodad) stop working randomly because they just lose the location services permission. Then in settings, the location services permission list acts like the weather app isn't installed.
- The new Mail app now automatically classifies your email, but still gives you the "All Mail" option. But the unread count badge on the app only works off of what they classify as your "Priority" mail. There's a setting to change that, so that it shows you the unread count of ALL mail, not just priority mail, but when you change that setting nothing changes. This is my biggest problem with new iOS.
- Keyboard sometimes doesn't get out the way any more when it should.
These are just off the top of my head. It used to be such a nice, polished experience. Their competition was just outclassed. Now, when my phone dies I'm going to have a good look at all the other options.
> - Keyboard sometimes doesn't get out the way any more when it should.
Depends on where you were seeing this of course, but this could very well be an app problem instead of a system problem.
Native UIKit/SwiftUI do a little bit of keyboard management for “free”, but there are many circumstances where it falls on the developer’s shoulders to do this. For cross platform frameworks, some do keyboard management others don’t even try. For web apps it’s a coin toss and depends on which of the gazillion ways the dev built their app.
It’s not actually that hard, usually just a matter of making sure that your scrolling content either resizes to match the keyboard-shrunken viewport or adding bottom padding equivalent to the height of the keyboard and then and adjusting scroll position accordingly, but it’s not unusual to see this partially or fully absent, especially on poorly built cheapest-bidder-contracted apps.
In modern UIKit it's as simple as constraining to the keyboard layout guide. That gives you full animation support for free as well, no more need to listen for the notification and manually set up animations with the same timing and curve. On iPads the keyboard guide can even help you avoid the split keyboard, it's really nice.
Of course SwiftUI gives you almost none of this control, forcing you to hope the magic automatic support works how you expect.
But then neither help you with any of the other interactions, like any background dimming you may want, or tapping away from the keyboard to dismiss. That has to be done manually.
Absolutely. And turning off Siri's "Learn from this app" should not require the user to navigate to every single app's menu, when Siri has a top level page in Settings.
The division of per-app vs app list in general is bad.
I think they should just throw in the towel and duplicate settings. Meaning, we can turn off Siri learning from an app or from the Siri page. Or we can turn off banners from the app or the notifications page.
my iPhone gets into a state lately where a pane will suddenly lose the the ability to _scroll_. it can happen in any app, but I see it a lot in Safari. Like, what is even happening, this is a fundamental UI interaction. The only way to fix it is to close the tab or force-quit the app. Super weird.
I don't think that's quite right. Snow Leopard was a lot of changes to a lot of the OS code base and wasn't great out of the gate, taking multiple dot releases, like all large-scale software updates do, to stabilize and bugfix enough to be "good."
There is no silver bullet, just a lot of lead ones and the answer to Apple's quality problem is to begin baking QA back into the process in a meaningful way after letting it atrophy for the last decade or so.
Hire more humans and rely less on automation. Trust your developers, QA, and user support folks and the feedback they push up the chain of command. Fix bugs as the arise instead of assigning them to "future" or whatever. Don't release features until they're sufficient stable.
This is all basic stuff for a software company, stuff that Apple seems to have forgotten under the leadership of that glorified accountant, Cook.
> the answer to Apple's quality problem is to begin baking QA back into the process in a meaningful way after letting it atrophy for the last decade or so.
As a former Apple employee of 13 years: Apple knows about the bugs. QA isn’t the problem.
A lot of people complain that their radar for some obvious bug isn’t getting noticed, and conclude that Apple must not be QA’ing, or not dogfooding their own product. This isn’t the case at all. I guarantee the bugs you care about are well known, and QA has already spotted them.
The reality is, they just don’t care. The train leaves the station in September. You’re either on it or you’re not. If you spent the year rewriting some subsystem, and it’s July and you have this huge list of bugs, there’s a go/no-go decision, and the answer is nearly always “go” (because no-go would mean reverting a ton of other stuff too, and that carries its own regression risk, etc.)
So instead there’s just an amount of bugginess that’s deemed acceptable. And so the software is released, everybody slaps high-fives, and the remaining bugs are punted to next year, where they will sit forever, because once we do one release with a known bug, it couldn’t be that important, right? After all, we shipped with it! Future/P2, never to be seen again.
An attempt was made to remedy this by pushing deadlines earlier in the cycle, to make room for more QA time, but that just introduced more perverse incentives: people started landing big features in later dot-releases where there’s less scrutiny, and even more tolerance for bugs.
The honest answer is that Apple needs to start giving a damn about the quality of what they’re pushing. As Steve once said at a pretty famous internal meeting, “you should be mad at your teammates for letting each other down like this”. And heads need to roll. I can only hope that they’re realizing this now, but I don’t feel like the culture under Tim works this way. People’s feelings are way too important, and necessary changes don't get made.
I think some people would be surprised how effective reaching out to apple is for squashing bugs. Three times now I've been assigned an engineer to pin point the bug I was experiencing, after which it was fixed in the next dot release.
By all means people should complain on forums (why not?), but a forum post complaining about some years-old bug isn't going to be anywhere near as effective as contacting apple's support or filing a bug report.
I'm not a developer, I'm just a regular user - so if I can get all this special treatment, so can you.
Yes, I am very surprised to hear that you've had such success with reporting bugs to Apple. That is very unlike my experience. I've had exactly one macOS bug that I reported fixed, and that required going to a WWDC lab, talking to a person on the relevant team in person, and having them dig the bug report out of the backlog for a completely unrelated team that it was incorrectly assigned to.
They would be surprised because it's not true, those years-old bugs in the forums have been reported many times to the official bug tracker, with reference number sometimes posted in those very forums.
Interesting. Apple podcasters frequently rant about what a black hole Apple's Radar bug system is. We're talking hours-long rants in some cases. Luck of the draw, maybe? I'm not doubting you, just surprised to read it.
(It feels similar to how those same podcasters absolutely blast Apple Intelligence, while non-tech users I've heard from seem to love it.)
Adding to this, a solution might be enabling continuous releases and leaning into release channels could help in terms of getting more out to users.
In practice it's a challenge because the OS bundles a lot of separate things into releases, namely Safari changes are tied to OS changes which are tied to Apple Pay features which are tied to so on and so on.
It would require a lot of feature flagging and extra complexity which may reduce complexity.
Another way is to start un-bundling releases and fundamentally re-thinking how the dependency graph is structured.
I think they’re painted into a corner with WWDC. Everything has to be a crowd pleasing brain busting wow drop each year. I’m certain there are teams that design their entire workflow around the yearly wwdc. It honestly feels like an executive leadership problem to solve.
If that is a significant part of the problem, then moving WWDC from an in-person keynote attended mostly by nerds and glanced at by the media to an overproduced movie geared at the media and ordinary consumers first probably didn't help. They could've gone back to a stage presentation after COVID, but some of that transition had already been happening prior to that (I recall an increase in how many jokes/bits they were doing in the late 2010's, although that could just be my perception).
Appreciate the sentiment, but in my humble opinion, seems like they should lean into creating even better automated testing, because adding all the new bugs to their suite of automated tests would be a more certain way to decrease their chance of happening again.
But, in a sense, this still incorporates your idea, because the devs and QA must be given the mandate of finding these bugs, and also towards making the automated tests cover the bug's related test cases (as well as charged with improving the test code itself, which is often in a mediocre state in most code bases I've seen at least).
Well, you can only win playing the stock market (Wall St. is Cook's only real customer) for so long while your products deteriorate. Financializing Apple and eliminating its technical prowess opens the door for the someone else with contemporary technical strength to take Apple's users.
Snow Leopard was macOS moving so slowly people thought Apple were abandoning the Mac.
Apple changed how they tied OS updates to hardware sales in this era and this left a lot of Macs on Snow Leopard for half a decade. So people remember that last point update – which was as close to a low-term-stability release as Apple has ever had.
But to get there, Snow Leopard received 15 updates over 2 years and it was really just point updates to Leopard so it was more like 29 updates over 4 years without a major user facing feature. And this was after Leopard itself took over 2 years to develop.
If Apple did nothing but polish features and reduce bugs for 6 years, people would proclaim them dead. And they might actually be dead since their entire sales model is tied to cycles of development, promotion and delivery. For those of us who remember Apple getting stuck on System 7 between 1990 and 1997 and how the company nearly collapsed in that era: it would be a delay almost on that scale.
It didn’t have anything to do with Sarbanes-Oxley (that was iPhone/iPod touch updates), Apple just charged for OS updates back then.
Snow Leopard was notably cheaper than Leopard ($30 vs $130), Lion was $30 on the App Store, Mountain Lion was $20, then Mavericks and everything after have been free.
Snow Leopard did have a long life though, it was the last OS that could run PowerPC apps, also the last to run on the original 32-bit Core Duo Intel Macs.
This is an interesting idea, and I am actually curious what Apple is going to do going forward. A "Snow Leopard"-esque release would be nice, but I think what would be better is an LTS release. Historically, you get a new Mac and you usually only get 5-6 years before they drop your model from the latest release. This has always made some sense to me, as after 4-6 years, you do start to feel it.
I bought an M1 Max that is now almost 4 years old and it still feels new to me. I can't really imagine a change that would happen in the next 2 years that would make this thing feel slow where an M3 would feel sufficient, so I'm curious to see if Apple really does just go hardcore on forced obsolescence going forward. I have a few M series devies now, from M1 to M3, and I honestly cannot tell the difference other than export times for video.
I can imagine some kind of architecture change that might come with an M6 or something that would force an upgrade path, but I can't see any reason other than just forcing upgrades to drop support between M1-M5. Maybe if there is a really hard push next year into 8K video? Never even tried to edit 8K, so I don't know. I'm guessing an M1 might feel sluggish?
Trying to use Wan2.1 to generate AI video or other various LLM or StableDiffusion style stuff is slow compared to other other platforms. I don't know how much of that is because the code is not optimized for M1+ Max (Activity Monitor shows lots of GPU usage) or how much of it is it's just not up to the competition. Friends on 4070 Windows PC are getting results many X faster and 4070 perf iss not even close to 4090
I don't feel like they ever used forced obsolescence with Mac's. When they dropped support for the latest OS on your machine it was usually because it couldn't run it. I recently updated some older Mac's and even a couple of OS's before support was dropped things got really sluggish. I imagine with the Apple Silicon machines the OS support will stretch longer than it has on the Intel ones. Maybe the higher prices are a hint they expect people to keep the machines in use for longer than before.
> I think what would be better is an LTS release. Historically, you get a new Mac and you usually only get 5-6 years before they drop your model from the latest release
In fairness, Apple to do tend to continue to release critical security patches for older versions.
I suspect that it will be AI features that push Apple into deprecating older hardware. But I also hope that the M series hardware will be supported a bit longer than the intel hardware was. Time will tell.
I don't have any Macs or iPhones that can even run the latest software anymore. My absolute newest Mac is stuck on Ventura 13.7. On the other hand, I can get the bleeding edge version of any Linux distribution out there and run it on decades-old hardware.
Unfortunately, “decades old hardware” doesn’t give me the combination of speed, quietness, battery life and the ability to use my laptop on my lap without so much heat that it puts me at risk for never having any little Scarfaces.
Using an x86 laptop in 2025 is like using a flip phone.
> I bought an M1 Max that is now almost 4 years old and it still feels new to me.
How are the keycaps doing? Mine looked awful after about 2 years of relatively light use, developing really obvious ugly shiny patches (particularly bad on the space bar), quite a letdown on an otherwise great machine.
(Realised that you can actually buy replacements and swap them yourself, via the self-service repair store, so have replaced them once, but am starting to notice shiny patches again on the new set)
Still better than the butterfly debacle of 2016-2019. I have one for work that spends 99.9% of its life docked to a real keyboard and it still has keys that only work sporadically. Some of these keys probably have < 10,000 actuations on them.
Not OP but have the same Mac. Every key is shiny. Doesn't really bother me though because I touch type. Also clearly I favor hitting space with my right hand because only the right side is shiny.
If you have AppleCare they will basically rebuild your MacBook for ~$200. I got MBP M1 Max usb ports and top case replaced and a bunch of other stuff I didn’t even ask for but they replaced with new stuff. Felt like a new machine when I got it back.
They need to somehow start marketing effectively to gamers, because the GPU in your M1 Max is shit. Sure, it’s fine for mostly-2D UIs and the occasional WebGL widget, but for AAA gaming it’s just dogshit.
'Gaming laptops' with more powerful GPUs are generally awful, though. Even ignoring the state of Win11.
Yes, they can theoretically perform better, but only when plugged into mains power, and creating so much heat and fan noise that the experience really isn't good.
Don't think there's anything out there that will outperform the GPU of an M-series Mac without consuming way more power and producing problematic levels of heat+noise.
Sure, but this is another avenue to onboard people to the upgrade train. Sure your display is great, your CPU is great, the speakers are great. But the AAA graphics scale up every year and there are often big performance cliffs for new features on old hardware.
Interesting take. I'm mostly not affected by that because I use except from the OS itself nearly no Apple software to be not trapped in the Apple golden cage ever. No photos, no Apple mail, no Apple maps, no Notes etc etc. and/but I also use no iPhone. But system settings is awful, at least I can search there to not wrap my head around it.
I actually see progress in things that matter for me as software dev like virtualisation and Docker support. And with frameworks like MLX I can even run image generation tools like FLUX locally on my Mac (search for mflux). Amazing! And Apple Silicone is a screamer... still cannot believe I have the fastest single core PC on Earth in my laptop.
I only thing I use is the calendar to see my personal and work Google calendars aggregated at the same time.
So far I'm happy with macOS. If the whole graphics industry (Adobe etc) would support Linux more I would even switch away to Linux but because I'm dealing with photography, color correction and a little video too I will never switch to Linux (the graphics system quality in macOS is way too good). Windows is unfortunately no go too because of the built-in spyware and ads in the OS (like WTF).
I consider Apple Intelligence also as a sort of spyware. I don't want to activate it ever (but it gets auto activated after updates) and I don't want it to download its stuff and waste space. If people want to use it: fine, but if I personally opt out, I opt out fully Apple!
> system settings is awful, at least I can search there to not wrap my head around it
When it works. Last time I typed “keyboard” in the system settings app, the keyboard settings weren’t part of the results. Ditto “mouse” or “trackpad”. Settings search has been utterly broken on around half of the dot releases for me. If it works, it’s only temporary and then it’s back to not working on the next update (or even reboot.)
Working for two companies I see how in the small one people manually test their changes, try to break them, even having in-code tests. At the big corpo - noone cares. Tests are green? Release to prod, close ticket and take another. Clients complain? There are 5-6 layers of people before such complain can come back to the team.
I wouldn't agree with "less glitchy" than Windows. Currently Win10 is the best one if it comes to stability, but Microsoft is already killing support for it. Windows 11 have problems even with typing into Start Menu search - basic functionality. Randomly takes input or not. So I think we are lowering the bar and the market agrees how low it should go.
It absolutely does. There are so man quality of life issues that plague the platform that don't get addressed year after year. I'm sick of albums syncing to my phone losing their artwork. With Sequoia, I'm sick of running multiple network extensions (you know like Tailscale and Little Snitch) causing network issues.
When I started using OS X, one of the biggest draws for me was first-class native keyboard shortcuts support that was consistently followed and applied by all apps (first party and otherwise). So you could be sure that a shortcut for search across all contexts (global) would work just as well as the shortcut for a contextual search within any app. No one writes great third-party native apps anymore and even Apple's own apps completely disregard this part of their heritage. Just try searching across the AppStore, Apple Music, and the legacy Finder.
For newer Apple apps, sometimes the keyboard shortcuts simply don't exist. I believe part of the problem here is the deprecation of AppleScript, which means there's no incentive to spend time on consistency, and the other part has to do with organizational indifference towards all the wonderful UX innovations from the past.
What Apple has successfully accomplished, in collaboration with other 'big tech' companies is drastically reducing user expectations from their software. I wouldn't completely blame the AppStore's forced race to the bottom for this alone. There is still a huge market for tasteful apps that cost more (even sometimes with obnoxious subscriptions), but if even Apple isn't leading by example, why waste time on it if you could just build another simple note-taking app.
…and speaking of Snow versions, bring back those cool welcome videos when you first purchase a Mac!
I miss those. Unfortunately, since Apple doesn't do the whole space theme anymore, you'd probably get some really boring drone shots of California at best before a Setup Assistant faded into view from behind a Redwood or something.
With the hopes that Apple engineers are scanning this discussion:
- Using the iPhone to scan documents from Finder has recently stopped working on the second scan. I need to restart my phone to get it to work again.
- iPhone mirroring is terrible: laggy, UI glitches, drops click events, scrolling is a nightmare. This is when it actually even manages to connect.
- Often, with Airpods on, lowering the volume, shutting down the iPhone display and putting it in my pocket quickly enough will entirely turn off volume. If you happen to increase the volume instead, you'll get blasted with maximum volume in your ears.
- Use vertical tabs on Safari for one day. You'll see it actually crash a few times. Not to mention the UI glitches.
- Open the App Store on macOS. It first opens empty, then the UI controls show up, then it flickers the entire UI. I am convinced it's a Web app.
- In System Settings, most of the sections you click have a delay in rendering. Nothing feels snappy in that app. I can actually click 3 sections quick enough for the second to never even be rendered.
- Sometimes dragging an application from the Dock popup menu into the Trash does nothing, even though it appears to have worked. I often find that it wasn't deleted at all, that I have to open Applications folder in Finder and hit Cmd-Backspace to delete it.
Good idea. I’ll add some that have annoyed me for years just in case:
- On iOS, the alarms app breaks down once you get to ~250 alarms. You can try to add/delete alarms and it’ll appear like they changed, but the change wont be saved. I can’t use the alarms app now and can’t fix it as I can’t delete alarms. By the way, would be nice to reuse alarms when creating at the same time as an existing alarm so you don’t end up with 250+ alarms in the first place.
- On iOS, the notes app breaks down in long documents (~10 pages of text with bullet points). When writing beyond that, some text will sometimes disappear only to reappear when you type some more. Other times, the cursor disappears. This only happens in long documents. All English text, mainly bullet points, often with some text pasted in.
It’s shocking to me that my iPhone 11 Pro can play gorgeous 3D video games, but can’t handle 250 alarms or 10 pages of text..
I feel like they're trying to build too many platforms most of which have become quite large. macOS, iOS, iPad OS, visionOS, watchOS, tvOS. The fact all of these systems are quite tightly linked in terms of features/syncing makes it difficult to navigate. If you want to ship every single year you need more developers, but that might make the collaboration between the systems more difficult. They need to move away from the one year cycle. It's a stupidly short period of time to ship a whole OS (or 6 whole OS's). If you want to keep them all in sync switch to two year cycles and decouple some of the apps from the core OS (e.g. Music, Safari, etc) so they can be updated as necessary outside of the cycle.
The platform teams at Apple don't really work that way. My (limited) understanding is that they share a fair amount of core code, but each went their own way for a while and have recently started getting nudged back together from a UI perspective -- unfortunately, the iOS style guide seems to have won, and many decades of desktop UX is being thrown out with the bath water.
Apple needs to make all of the accessory apps (photos, music, news, maps, mail, etc..) uninstallable and able to be added later if needed through their app store.
Every MacOS update brings along this bloatware that is not easily removed.
That's possible in EU (prob also in EEA). Of all you mention only Photos shows a dramatic sliding modal asking you if you want to remove it; data library will stay but features like hidden, recently deleted photos and "memories" won't be available.
Most mac os feature updates are just updates to all that bundled stuff. I use none of that. I use my laptop to run various OSS developer tools, browsers, etc. 99% of it is available on Linux. And I have moved my workflow to a Linux laptop a few years ago. I went back for performance reasons; not for feature reasons. I can do that again. There's nothing really stopping me. But I like the Apple hardware.
This is also the reason that I don't mind the current version of Mac OS. Yes everything you mentioned is a bit meh. Which is part of why I don't use any of those applications. So I don't care. I've disabled Siri. Never used Facetime. Maps, Numbers, and all the other of the dozens of things they bundle: I never touch any of it. I don't need that stuff and when I do, I use alternatives. I have an Android phone so all of the IOS integration stuff is redundant to me as well. They've not locked me into their ecosystem. And I like it like that. I don't allow myself to be locked in.
As a work horse for doing development MacOS is still a fine OS. It does the job. Most updates of the last 10 years or so have been minor window dressing that you barely notice, some under the hood changes, and misc tweaks that mostly fall into the "whatever" category for me. For me the annoying thing is just having to sit through these lengthy updates. I keep postponing them because it's never convenient to take an hours long break when it prompts me.
And I don't really get much out of these updates. To be honest, I can barely tell apart the different versions of their OS. The main notable visual change seems to be the desktop background. Which is usually hidden by applications. So I rarely look at it.
That way I can go and buy a MacBook Pro (13"?), or a MacBook Max Pro (15"), or a MacBook Mini (11"), or a normal iPad Mini Ultra, or an iMac Mini (21"?), or a Watch Pro, or a Mac Max Ultra etc.
I agree. It seems ridiculous that an app like Messages is considered so much part of the OS given what it does. I don’t use it, I don’t care about it, but it seems like it could be a regular app that updates independently of the OS, along with Maps, Notes, and so on. So many macOS “upgrades” nowadays seem to be Apple tinkering with such apps rather than the actual OS experience.
I did said similar thing weeks ago regarding all shenanigans with W11: every software should include two paths of installation/OOBE: default "express" where vendor shoves you "the experience" and customized "expert" where you select features YOU want. And either way allows you to change system afterwards. We had that in Windows years ago but then it was removed; some Linux distributions do offer package selection beyond the default set.
There's no need for a separated core version - just give back control to the user. But honestly, I don't know what would need to happen so we could get it - it feels like it's a lost cause against corporations. There's of course Apple-EU situation where you can remove applications, set defaults, install additional app stores but this is still limited to that market and happen way too late and too slow.
Why do you want to destroy computing for everybody else just to make a very small group of hackers pleased? Can't you be satisfied with Linux or Windows or BSD? Let normal people have at least one platform that is usable for them.
How is letting users to disable bloat a bad feature?
For example, can you remove Chess from MacOS? Nope! Why? What I found on Reddit, it seems because it's integral part of MacOS somehow and I am a bad person for even asking, somehow.
I'm still waiting for them to remove Launchpad (which seems like a half assed step towards unifying desktop and tablet systems), and I've yet to meet anyone that uses their weird new desktop management system, the thing with the windows on the left side. That just reminds me of the GUI experiments they did in the 2000s, with 3d environments and whatever Ubuntu (or gnome/kde/whichever) tried to do.
I'm hoping they're gathering usage analytics and will overhaul unused features.
Caveat, I'm probably not their average user, I do almost everything via Spotlight. I don't even use the bottom menu thing, it automatically hides and I only use it when I accidentally hid a window.
Launchpad has to be one of the worst ways to find and open applications on a Mac. That new window-managing system is honestly so unintuitive, so bad, and so bizarre that its mere existence feels like some sort of practical joke.
I wish that in the next version of macOS, they would strip away all those useless features and systems that they've shoehorned over the past two decades and have the OS look like how Panther or Tiger did, while taking up less than 10 GB of space on the puny SSDs that they ship their machines with.
I appreciate that they did UI experimentation and stuff but... not to the end user's expense. I wonder if anyone at Apple themselves actually use these features.
> In the 22 years since I became a “switcher”, this is the worst state I can remember Apple’s platforms being in.
Indeed, I remember three times when Apple went a bit overboard on the feature front, but dialed it back and made some of the most stable and useful OS versions:
OS 8.5/8.6 pushed a bunch of features and were the last big pushes pre-OSX, but then OS 9 fixed a TON of bugs, and added a few smaller quality of life improvements that made running 'Classic' Mac OS pretty good, for those who were stuck on it for the transitional years.
Mac OS X 10.0 rewrote _everything_, and especially 10.0 was _dog_ slow, with all the new Quartz graphics stuff in an era where GPU accelerated 3D display widgets wasn't quite prevalent. 10.1 patched in a bunch of missing features (like DVD Player—it was still a pretty useful tool back then), and fixed a couple of the most glaring problems... but 10.4 Tiger was the first OS X release that was 'fast' enough OS X was a joy to use in the same way OS 9 was at the time. At least on newer Macs.
And then of course Snow Leopard, which is the subject of the OP.
macOS 13/14/15 have progressively added more little bugs I track in my https://github.com/geerlingguy/mac-dev-playbook project; anything from little networking bugs to weird preferences that can't be automated, or don't even work at all when you try toggling them.
That's besides the absolute _disaster_ that is modern System Preferences. Until the 'great iOSification' a few years back, Apple's System Preferences and preference pane were actually a pleasure to use, and I could usually remember where to go visually, with a nice search assistant.
Now... it's hit or miss if I can even find a setting :(
Settings is not that bad. It's _awful_, yes, since it broke the panel design we had since the NeXT days, but for me the real annoyance is the way Apple progressively, inexorably broke desktop automation to a point where they now effectively painted themselves into a corner regarding having enough of a foundation to make Apple Intelligence useful (https://taoofmac.com/space/blog/2025/03/14/1830).
That said, I expect things to get worse as they manage to converge their multiple platforms in exactly the wrong way (by dumbing them down across the board even as people keep hoping they'll make iPad OS more useful, etc.).
But at least we still have Safari, Apple Silicon is pretty amazing and I can survive inside Terminal and vim. For now.
Mac OS X versions before Jaguar supported GPU accelerated applications, but the windows were composited in software which caused severe performance problems. Jaguar introduced something called Quartz Extreme, where the windows are treated as OpenGL surfaces and the window contents are textures mapped onto the surfaces. This made OS X significantly smoother on computers with a fast enough GPU and enough VRAM to support it, as the CPU didn't have to spend a bunch of time copying all the window contents to the framebuffer.
I don't think it does. Long term Apple user here (since 2007). I'm typing this on a 5 year old pile of junk with Windows 11 LTSC on it. The (M4) Mac is sitting next to me acting as an SMB server until I can be bothered to get all my stuff out of it. It's just tiring using a Mac these days. It's difficult to explain but everything feels slightly frustrating. The nice things are really nice. The whole experience is quite nice. Until you hit a problem. Then it's a complete pit of pain and misery and there just aren't enough ways out of it.
Had a few issues with iCloud syncing and data loss as well and what with being based in the UK and the general problems with geopolitics and the cloud I figured I'd try and get as much stuff out of iCloud as possible. Well there's not much advantage now. Most of it is in the ecosystem tie in, not the hardware. And on top of that the provisioned services such as Apple Music are just pain for me on a daily basis. My entire music catalogue disappeared in a puff of smoke when I was offline for nearly a week. The one thing I wanted it for!
So back to the PC. I ran out of disk space on the (soldered in SSD) Mac. I can't delete anything and macOS has leaked out about 20gb suddenly. I don't know what this is other than about 5 gig of it is Apple Intelligence despite telling it to fuck off. So it's late Friday afternoon and I need to get something done so I can have a clear weekend. I dig in the junk cupboard and find a couple of hard disks but no way of connecting them to the USB-C only Mac. Amazon solutions aren't available for delivery until Sunday. There upon I discovered the kids' "covid work PC" for when they were home studying. Despite the acceptable 16Gb of RAM it only had a meagre 256Gb disk in it. No worries. Opened it up and there's a hole for an SSD in it. It now has +500Gb SSD. Brilliant. On goes windows 11 LTSC. I'm back up running R in under an hour and have transferred all the data over.
I never went back. It feels better here. This thing is a swiss army knife. And extension of me. Not the other way round like on the Mac. The Mac feels like it feeds off me: both cash and energy. Apple need to fix that.
That would be medium-term user. Long-term would be people like me that have been using it since 1984.
I have a collection of Macs going all the way back to 1984. Even the newest one hasn't been turned on in three years.
My daily driver is Windows Server 2016. But it has VMware Workstation so there are lots of virtual machines for my work, including Linux.
I am so tempted by the new M4s. Amazing piece of technology. So sad about the operating system though. Every year I say I'll wait for a quality Linux port.
The “Snow Leopard effect” is more about the transition to Intel from PowerPC than the OS itself.
And maybe I’m a minority but the latest macOS is not worse than previous editions, for instance I use Sequoia on a M1 Mac but also 10.4 Tiger and OS 9.2.2 on a PowerMac G4 (MDD, 2x 1.2Ghz with 2Go of RAM) and the stability is not worse on Sequoia than Tiger or 9.2.2, in fact I have encountered more crashes in 9.2.2 and Tiger than Sequoia and all macOS 11+ (except Big Sur who has rough edge on beginning on M1 device)
TL;DR What people remember fondly is not Mac OS X 10.6.0, which was in fact very buggy, and buggier than 10.5.8, but rather later versions of Snow Leopard after almost 2 full years of bug fixes.
The yearly release cycle is the problem. Apple needs "another Snow Leopard" only in the sense that I mentioned above, "almost 2 full years of bug fixes", although at this point, Apple has more than 2 years of technical debt.
Thank you, the nostalgia for a 15-year-old OS release, which absolutely was not great out of gate, is strange.
My recommendation for people who don't absolutely need the latest features: Upgrade to the previous version of macOS when the new version is released. Sequoia is incredibly reliable 7 (soon to be 8) updates in.
I disagree. From the article: "The same year Apple launched the iPhone, it unveiled a massive upgrade to Mac OS X known as Leopard, sporting “300 New Features.” Two years later, it did something almost unheard of: it released Snow Leopard, an upgrade all about how little it added and how much it took away. Apple needs to make it snow again. Snow Leopard did what it was made to do. It was one of the most solid software releases Apple ever put out." This gives the impression that it was solid out of the gate, which it was not. And the next paragraph specifically mentions "2009’s Snow Leopard". But later Snow Leopard releases were in 2011.
Grandempire is right on my overall sense in the piece, though perhaps I should have made its ore explicit. I actually faired quite well with 10.6.0. But, the lack of push for a yearly set of headlining features did allow the OS to age quite well in the years after, too. It's the drumbeat of what 10 stunning new features will be unveiled each WWDC for each platform that means past features rarely get the continued polishing they need to shine.
In my own experience, I have noticed that Apple's software 'breaks' more on older hardware, be that Mac's, iPhones or iPads. For all the credit apple gets for supporting older devices, those devices are definitely not treated as first class citizens. For example, the touch keyboard on my (work) iPhone 12 Pro works decidedly worse than on my (private) iPhone 16 Pro. The error rate is much worse, and I believe it's due to the amount of useless features that get added with each new installment of iOS.
Whether that's intentional or not (I believe it is), Apple should focus more on delivering a stable experience, on both new and old devices.
I echo the sentiment a lot of people have already expressed.
That is, using Apple products is like being a junkie. You need to use their products because there is no real alternative, but you feel kind of dirty because of their practices.To me, that sounds like it should be a huge red flag for Apple execs.
Adding my comment as reply as well as it is relevant:
---
I've been holding over and running 10.5 on my iMac 2019, but then in the beginning of the year had to upgrade to Sequoia (due to software dependencies).
Of course this is just a correlation, not necessary a causation, but within a month the iMac's internal SSD was corrupted to the point that it was unrecoverable, and my 40GB RAM corrupted.
So, yeah, at the very least not sure how much testing went into Sequoia for non Mac Silicon macs.
Quite disappointing considering how long a normal Mac's lifetime used to be, which also justified its high initial hardware price.
This is not only applied to Apple's software. The entire software and hardware market including iPhone, Samsung Galaxy, Windows, etc. is pressured to release new products with more and more features every year, advertising those new features to facilitate sales. The result is, what was once a simple and cool product has become heavily bloated with unneccessary features.
The Nero Burning ROM/ACDSee disease is how I like to call that. These were simple once too but quickly degraded on quality, got bloated with stuff nobody ask in the first place
I think it's different from country to country, here in Sweden I think iMessage is reasonably popular, and people here generally go to Facebook Messenger rather than WhatsApp when it comes to cross-platform communication.
iMessage is a very limited and glitchy app when it comes down to it.
Just from the top of my head: no E2EE by default. Gifies are restricted (and censored). Reactions are clumsy (there are two rows of different kinds of emojis to choose from now). Adding photos or sharing location is complicated compared to Signal or Whatsapp.
Search is ... well, I hope you don't really need to find anything. Delayed notifications on macOS for no apparent reason, and in 2025 you can still end up with multiple entries for the same contact...
If we look at OSX Releases [1] ; from OS X 10.10 Yosemite in 2014. The only useful feature for me was Universal Clipboard. That is 10 years of macOS and that was about the only user features.
While the 10 years have some security, performance, drivers, file system, refactoring going on. Most of the user features were useless.
And I spend 90% of my time inside Safari, and yet Desktop Safari is still shit after all these years.
I am not excited about 99% of new macOS user features. Most of them are features for features sake. Just continue the macOS engineering work, and for once pour more resources into Safari and allow Safari support on older Mac system.
This alone says a lot about Apple's software "prowess", i.e perennial customer hostility combined with clear incompetence, (in which their "core" customer base has by now becomes participants in some kind of Stockholm syndrome scenario), that an attempted de-shittification of their OS is being hailed as (nostalgia tinted?) greatness :)
> Apple’s iMessage and SMS tool is an essential app for communication for me and, I suspect, the vast majority of Apple users.
For the majority American Apple users, sure. But I myself hardly ever remember that this app exists.
The thing that drove me nuts in particular in Sonoma though, is their "improved" text fields. Where it would show the stupid little popup with the active keyboard layout icon next to the cursor. Clearly made by someone who doesn't actually need to use multiple keyboard layouts (gosh do I envy those people). But at least I could disable it with a defaults write command.
Oh and Mail, yes, it would sometimes stubbornly refuse to load new messages, or delay them by minutes. It worked fine the previous 10 years. It would've been free to just leave it alone.
> Oh and Mail, yes, it would sometimes stubbornly refuse to load new messages, or delay them by minutes. It worked fine the previous 10 years. It would've been free to just leave it alone.
Oh man, Mail is almost comically bad, to the point that I occasionally miss messages from people since they're drowned in crap. A native version of Google Inbox that is not Google-owned would be enough for me. (or whatever version/implementation that integrates nicely with my devices)
> But I myself hardly ever remember that this app exists.
As a counterpoint, I myself use it everyday. I’m not American and most people I know don’t have iMessage. I still prefer it to using SMS from the phone. And yes, I do agree with the author that the app is buggy.
But I also never chat over SMS with actual people. It's just not done any more by anyone I know. The last time I sent an SMS was probably several years ago. It's 99.9% various confirmation codes and other notifications for me.
Last week, I switched to a Mac for the first time in my life after using Windows and Linux for around 30 years. Naturally, I hate a lot of things due to old habits, and the shortcuts constantly confuse me. But what really surprises me is the number of obvious bugs in common workflows. At least five times a day, I catch myself thinking, "There's no way this is actually broken." I didn't expect macOS to be even buggier than Windows.
That said, the hardware and the absence of Windows' user-hostile nonsense bring me endless joy. I don't think I'll go back to a PC (the Mac feels like a different class of quality) but to be honest, I expected more.
Off the top of my head I'm mainly thinking about `cut and paste` in Finder, that's a very common one people complain about, but other than that I'm curious what you're referring to if it's happening five times a day with new things, any chance you could outline some examples?
Examples just from today: Window snapping (or whatever it's called on mac) stops working until restart, keyboard type detection gets broken because it thinks my mouse is a keyboard so suddenly " and > are replaced, title bar disappears then the apple logo is halfway off screen when it randomly comes back.
I took a Mac at my current job since I really don't like Windows and I figured I would probably be able to hack it. I use Linux for all my personal stuff, all I need is bash and a browser, yeah?
Pfft. Nothing works, and a patronizing, laggy OS that actively tries to fight me at every step because it knows better than me.
What a joy. I'm sticking with Ubuntu/Fedora and having to figure out a driver issue every once in a while.
It is literally insane that when I search for Photos on iOS I can't zoom in to make the thumbnails bigger. As an approaching mid-40s person this is untenable, even worse that it DOES let you zoom in prior to search.
Photos definitely regressed in last release. I like change and new things (AI searching/tagging photos is extremely useful) but when they changed it I realized how important my muscle memory was for that app and features like pulldown iCloud sync/status seems to be gone and other small things changed in annoying ways.
The lack of filters in things like Photos or the iCloud version baffles me. Tools that would be effective and far more useful than half of what they add instead.
I find it incredibly developer hostile as an OS now. I don’t want to have to type a password in to use a debugger. I want to be able to download software and run it as I want, whoever wrote it, without them having to sign it. All that does is push people away from supporting Macs, particularly if they’re learning and don’t want to shell out £99 for a developer license. And you can see that because the Mac ecosystem has become dramatically less varied and stagnant.
For me, I hiched my wagon to the Apple team, years ago, and have held on, through some truly disastrous times.
I can't predict whether or not they will get past this, but I'll keep hanging on, anyway.
The code quality (the bits they let us see), however, seems to be going downhill, as is the quality of the documentation. These are things that always held up, in the past.
It's fairly discouraging. I suspect the quality of their hires has been going down. I'm not sure what it is, they want, but it doesn't seem to be quality.
Hard to disagree. You would think for a company obsessed with performance per watt and battery life that every release would be as fast if not faster that its predecessor and more efficient to boot.
What Apple needs, is to fix that weird bug where my mouse cursor stops responding to what it's hovering over. How does something so fundamentally broken make its way into an OS?
I like how my cursor randomly enlarges itself for like it's moving between a high-DPI and low-DPI mode despite being entirely on one display and nowhere near the edge between my internal and external. It gets big for maybe 1 second and goes back to normal.
> It gets big for maybe 1 second and goes back to normal.
That sounds like the "shake to locate" behavior and I would be totally lost(heh) without it since I have 2 4K monitors plus the onboard display, and that black cursor gets lost very easily. Shake the mouse, get big cursor, find cursor, be happy
Oh, neat. I had no idea it was intentional but I can see how that would be a useful feature. It felt like a bug to me because it felt disconnected from any intentional cursor movement on my part.
I don't think it's just an Apple thing. I think it's just a big company thing. For example, the YouTube app has so many errors in the very common path, such as opening comments on channels and so on. I think after a while big companies simply become hollow from the inside and self-combust. Just like large animals have a cancer protection gene, I think there is a max size companies can get before they sell combust and they do not have a cancer preventing gene.
The article is spot on and articulated my feelings exactly. I too became a loyal Apple supporter nearly 20 years ago because "it just works". Sadly, I no longer feel this way. The operating systems on my Macbook, iphone and iPad have consistently gotten worse with each update over the last 5-10 years. Apple is losing the magic on the software front.
> I am not suggesting Apple has fallen behind Windows or Android. Changing a setting on Windows 11 can often involve a journey through three
Love how you can't find a critique of Apple without the person feeling the need to throw shade at Windows. They need to constantly reassure themselves and other fanboys it was the right decision.
And for an OS that's geared to your grandmother it sure does seem to shit the bed often.
You could argue that macOS development is too slow, not too fast and in need of a maintenance year.
Basic OS features have fallen way behind in term of UX - and of vision. Managing files and searching for information have become a chore compared to most internet- or llm-based services. Even a bug-free Finder or faster Spotlight would not bridge that gap.
All apps listed in the article feel similarly lost behind - Mail, Messages, Photos. The only exception is System Settings that does definitely need a snow version.
This is obviously true for other platforms as well.
We are possibly lacking a leap forward. Not faster horses, electric cars.
An obvious root cause of this is the lack of newcomers to the OS again. It's an oligopole that has no interest making things much improved.
> In an era when people still paid money for operating system upgrades every few years (anyone else remember standing in line for Windows 95?)
No, and I would have been too young to purchase it.
But I'd be surprised at the idea of massive demand for an upgrade to Windows 95. What we did was buy a new computer that had Windows 95 on it. Computers used to go out of date very quickly.
We kept our older computer that ran DOS. (It had Windows 3.1 installed, but the only reason you'd start that was if you wanted to play Solitaire.) It continued to run DOS just fine.
Believe it or not, it was a huge deal. I went to two launch parties -- one hosted by CompUSA and one by a local place -- the night it came out. This wasn't in Silicon Valley, but in the U.S. Midwest (St. Louis, MO). Hundreds of people stood in line at midnight to get the first copies at the two places I went to and the same thing happened all over the world. (Of course, CompUSA also had a whole display of upgrades to get your computer running better if it wasn't ready for Windows 95.)
The same sort of late night excitement existed around each early Mac OS X release, incidentally.
Well, computers were expensive enough many of us just kept upgrading parts to make something like Windows 95 work. My 486 slowly morphed into a Pentium system with largely the same parts over the course of three years during that time. But Windows 95 worked great on my 486 -- it felt like a great upgrade at the time.
I know it doesn’t affect a lot of people, but pasting in hex mode in Calculator broke in Sequoia. Previously, any number pasted in hex mode was treated as hex (as expected) even if the number consisted only of decimal characters (say 20, which would be decimal 32). Now numbers pasted with only decimal characters are treated as decimal (pasting “20” turns into 0x14) and numbers with with at least one alpha hex char is treated as hex. The workaround is to prefix the number with “0x”, but that’s not always practical.
Snow leopard was my favorite operating system ever. I used it on my first real computer, a horrible Asus eeepc netbook and it worked flawlessly. Best hackintosh I've ever used. Of course I used it on official hardware as well but it brings back fond memories.
It's not even that Snow Leopard was so great. It's that what came immediately after was so poor. Lion was noticeably janky. Things seemed to improve again with Mavericks becoming quite stable after numerous point releases. Then there was a glimmer of hope around High Sierra/Mojave/Catalina, but since then it's been steadily downhill.
Huh, I was actually on this page a few years ago, but iOS and MacOS quality has been super solid for me this past year. Anyone else feel this way? Judging by the nodding comments maybe I’m just the outlier?
I've been using OSX / macOS since 2002. I've not really had many issues if any that I can remember or found noticeable (or knew they were attributable to the OS). I can't really use Windows or Linux because I've been quite accustomed to the incredibly useful accessibility tools that come with OSX / macOS which are first class, and probably worth a whole lot more than I paid for the hardware.
Wow, that 2013 WWDC video is so incredibly impactful. I had no idea I was going to experience what I did when I hit play. It resonates with me so strongly, I honestly wasn't ready for it.
You're right. When I first watched it, I was under no doubt they lived and breathed that philosophy. It matched my perception of their output 100%. Watching it again now, I'm reminded of how I used to feel and how much things have changed.
Yes. I remember it strongly hitting me back then, but rewatching added even more punch. I still agree with the philosophy, but this time I was also wistful for when I could say Apple agreed with it, too.
I have two Apple TV 4Ks and both started dropping Bluetooth connections every minute or so after a recent upgrade.
My headphones will cut out and when I go to pause the video I’ll be clicking frantically because the remote isn’t working either. Or I’ll be in the menu and the remote will pin to the left or right and scroll to the end of some massive YouTube list.
Reboots, resets, nothing fixes it.
My Apple Watch regularly has a glitched Home Screen too.
I defended Apple’s quality recently, right before everything started breaking for me.
I thought it was because of my MacBook pro still being an Intel One and thought that nobody at Apple cares about that anymore. But it seems also the M family suffers from it.
Mine doesn't really sleep. It's always warmish despite all my best efforts to make it actually sleep. It's always plugged in, so no biggie, but it's annoying as hell.
Reddit wisdom says it's because of my usb peripherals, but it's just a webcam, mouse, keyboard, and a yubikey.
Absolutely agree. This week, I can't get Chrome to connect to local servers.
ERR_ADDRESS_UNREACHABLE it says.
Yes, I said Yes to the new permission. Yes the check mark is on in Privacy, I mean all 20 of them that say "Google Chrome". Yes I toggled it off and on. Yes I rebooted. Still have to use a different browser to access my own local server because there is a new privacy feature that... doesn't work.
> Yes, they work and are still smoother and less glitchy than Windows 11, but they feel like software developed by people who don’t actually use that software
I would have to agree here (and Apple also don't seem to assess feedback for their GUI changes), but unfortunately this thread is already on a software quality meta tangent rather than listing individual annoyances so here's my short list in the hope actual bugs can be discussed:
- window focus management broken: when you minimize or close a window, another random window of that app you're closing the window of is put into front even when that window is minimized; or other completely unrelated apps get focus
- index/Spotlight not showing file locations (full paths) after searching; the fsck?
- gestures being introduced that do stuff that you hit inadvertently and leave you in a state where you don't know how to undo its effects such as the super-annoying "fullscreen" mode when dragging windows around or
pressing Command-F since Sequoia. Requires you to fscking
research how to leave fullscreen mode (while not as cringe as
Windows help "communities", the level of talking past another
is getting there, options being discussed that don't exist in Sequoia's Dock/Desktop settings)
- update or feature nagging (I don't care I could use my iPhone as a webcam right now, go away)
- sometimes difficult to find mouse pointer on large screens
- older problem but I know at least one person on the verge
of leaving Mac OS because of it after 20+ years of loyalty (or outright fanboyism tbh): in a German locale, you can't switch off PC gender-neutral language which is not only pushy and annoying but also space-inefficient as fsck
Although I do not mind the way that window management works on macos, recently I had a mildly infuriating situation. I was doing Cmd+Z to undo something, not sure which app and it didn't work so I pressed it a couple of times instinctively. But although my target app was visible and on top, it was Finder that was actually in focus - accidentally I triggered undo in Finder. I think I managed to undelete a file and something else, but I'm not sure. Not sure if there is a way to find a log of actions. That's something I would love to see in all desktop systems, a history of user actions. Also having undo/redo shortcuts in Finder is potentially destructive, what if I move some files from an SD card, reformat it in camera, and then accidentally hit undo in Finder?
> moving the Mac to a new processor architecture (for the second of three times)
Four times kinda — maybe five if you want to count PPC32 and PPC64 separately but I usually don't since the Intel transition happened so soon afterward that there is really no PPC64 lineage to speak of.
I definitely count 32-bit and 64-bit Intel separately though due to the number of years taken to transition, all of the annoying early-Intel-Mac 32-bit EFI issues, and the need to manually opt in to the 64-bit kernel on many machines. In fact Snow Leopard was the first OS to let you do so! The “no new features” tagline was snappy but it's really not true at all :p
“Mac OS X Snow Leopard and above could only be installed on Macs with Intel processors, and introduced a number of fully 64-bit Cocoa applications (e.g. QuickTime X), and most applications were recreated to use the 64-bit x86-64 architecture (although iTunes was a notable exception to this!) This meant these applications could run in 32-bit mode on machines with 32-bit processors, and in 64-bit mode on machines with 64-bit processors. And yes, the kernel was updated so that it could run in 64-bit mode on some limited hardware, and only by default on Mac Pros, while other newer Macs were capable of running the kernel bit did not do so by default.”
The surprising thing to me, is that I have been a Mac user since forever, I think Leopard was my first OS. Things have barely changed since then, there have been some subtle redesigns since then, but the desktop has remained largely static.
I don't understand why macOS has so many issues. I still encounter memory leaks and have to kill Finder or Dock every few days lest it eat all my memory.
No you're not. This thread feels like a nostalgic crowd who idealizes the past and falls into the “it was better in the old days” trap, letting go of the myriad of things that improved since then.
Look snow leaopard actually added the hugest feature ever, grand central dispatch.
That’s what billy always dreamed about (concurrent windows) and that is what rust craze is about, adding the same to Linux/windows.
Like Apple said, snow leopard was under the hood changes.
So don’t you worry you will get your snow sequoia, the ai reorganisation is exactly that.
How does Apple Music not have an equivalent to Spotify Connect?! Renders Apple Music unusable, and no, we're not talking about Airplay, and no, we're not downloading iTunes Remote (can't believe it still even has 'iTunes' in its name!).
> I could walk item by item through System Settings and point out many equally inexplicable decisions. Did anyone at Apple really believe a Mac user’s life would be better if common features were buried deep in menus?
I have to agree with this, System Settings seems very inconsistent (design) and has terrible information architecture / organization.
Speaking of, I just bought a brand new M4 Air. The thing is amazing, except I swear that Command-Tab does not work consistently sometimes, it just does nothing and I have to press it again. It's baffling, has anyone had this before? Never had this issue on any computer in the past 20 years, it's strange.
Linux Mint is quite good, I've got it on a few machines. So far it's the most stable, easily-updatable and "gets out of the way" Linux distro I've used in years.
Another example was High Sierra. They completely swapped out the file system on that release, focusing primarily on under-the-hood changes, and imo was also one of the most stable macOS releases to date.
Everyone keeps making this comment but the article’s about the idea of a maintenance release more than about Snow Leopard. It was a good idea that’s stuck around in the dev community, something we’ve been talking about forever, so it’s basically a meme at this point to say “Snow Leopard release”
Open Launchpad and then try to click the spotlight icon in the menubar after. Even after returning to the desktop the system won’t allow me to click its own menubar. Reproducible on all machines, drives me insane.
I don’t use pretty much any of the features he listed.
But Sequoia has made some M1 Pros run poorly in my environment. It’s unacceptable the amount of resources it takes to do basic stuff that we got right of 30 years ago.
Snow Leopard was released while Steve Jobs was on medical leave. It was driven (as far as I can recall) by Bertrand Serlet. Rumour has it that Steve was furious about the "no new features" marketing when he returned from his medical leave.
This might be the main issue with software at Apple getting so much worse. Bertrand knew what he was doing. Apple's (and NeXT's) OS used to be an OS, not a collection of toy apps.
Sigh. I don't get the sentiment and the whole debate here. The author is clearly nitpicking (he is the first person who uses messages after all). But honestly, complaints about "arrange" screens button?
Nevertheless, he is probably right. Only the people who went through working on Windows, Linux both on cheap and expensive machines while dealing with all the "baggage" these environments bring can tolerate MacOS with leniency. I will never come back to anything else until I see a competitive offer from just anyone because what Apple offers is:
* Fast, silent, extremely energy efficient devices with excellent screens and audio.
* The font rendering. I honestly can't believe people who professionally work with text all their lives never mention it here. MacOS had and continues to have the best fonts and font rendering that is.
* Solid build that lasts (I own MacBook Pro and MS Surface Book 2 both from 2019 so I see how they age).
* A device that is ready to work when you open a lid or touch a keyboard button without any "waking up from sleep/hibernation" or freezing due to buggy video drivers and inability to work with GPU in hybrid mode OUT-OF-THE-BOX in 2025.
The above-mentioned is more than enough for me to tolerate any MacOS issues and the ones mentioned in the article are just laughable.
Apple offers you the full package that allows cross-device integration while Win/Linux users still rely on the Google stack or other third party "workarounds". Yes, no surprises here -- owning the hardware and software stack is a massive advantage.
> * The font rendering. I honestly can't believe people who professionally work with text all their lives never mention it here. MacOS had and continues to have the best fonts and font rendering that is.
Linux has significantly better font rendering than macOS these days if you're on a 140 or less PPI screen. Linux still does subpixel AA and text looks razor sharp, while Apple pretends very large monitors like my 140ppi 57" don't exist.
Power Mac G5 systems sold in 2006 were abandoned by Snow Leopard in 2009.
Apple could conceivably abandon intel Mac Pro systems sold in 2023 by releasing an Apple Silicon-only macOS in 2026, but three years still seems a bit aggressive.
BTW, there is an (earlier) example of Snow Leopard in the Microsoft ecosystem -- that would be Windows XP, which similarly avoided major new subsystems and new applications built-into-the-OS, but was remarkably fast and stable for its time.
I never tried the pre-SP XP, but even SP1 wasn't too terrible compared to what it was competing against (there was a lot of perfectly-usable 95/98/ME still around with their lack of privilege separation, and 2K was mostly better only if comparing at the same amount of RAM but XP machines were newer and often had more).
For me, service packs were largely a question of "do we want to tie up the phone line for however many hours just because Microsoft wants to rearrange a UI layout?"
Apple needs to restore primacy to the UI. MacOS and iOS used to feel non-blocking with a UI that would always respond regardless of how long a remote or long-running background task required.
Now iOS and MacOS feel sluggish and slothlike, waiting on IO, typically from a remote call. The webdevs have taken over.
Yes they need to remove cruft, and also re-hire the ruthless UI Nazis who would enforce 120hz responsiveness at all cost.
Can a publicly traded company be sued if they allocate more resources to QA? Could an activist investor argue that cleaning up Mac OSX is a waste of time because people will buy the computer anyway?
I long for a modern NeXTStep-like OS. A polished, consistent, solid operating system that is lean, clean and simply focused on getting things done. It should be predictable in every way and never get in your way. None of this SwiftUI bullshit, Animoji, AI or blurry UI. sigh
I'm enjoying Sorbet Leopard on my 20 year old Dual Core PowerPC tower. Mostly just messing around with old versions of Max making weird sounds.. but when I do interact with the OS it feels great and responsive and a joy to use. Modern MacOS can feel that way if you turn off a lot of crap. I don't even sync my accounts to the OS anymore.
The first thing I do when I reinstall macOS is to disable most of the ”features”, services, and apps Apple added over the last decade. I can’t imagine how cluttered my digital life would be if I’d depend on all those useless toys Apple stuffed into the OS and abandons a few years later (looking at you, Dashboard).
My initial wish for Apple was to make macOS as bulletproof, lightweight, and bug-free as possible. But now I just want to use Linux on my M1 MacBook because of all the bullshit that’s going on in the US right now. It’s only a matter of time until the Trump administration will start to dismantle the American technology sector, beginning with the softening of encryption and the death of Advanced Data Protection I currently rely on on iCloud. Mark my words.
Like I’ve said in a couple of comments before in other threads, I’d love to switch to Asahi but without native disk encryption I just can’t. If my laptop gets stolen, all my files would be visible to the thief, and that’s a risk I’m not willing to make.
The myth of Snow Leopard is strong (while in reality a lot of fundamental things people still complaint about weren't fixed), so Apple can just as well do nothing better and hope a new myth will emerge sticking to some other current name...
what are some good alternatives to mac os? there some features like image/text copy-paste being cross device that are insanely useful that make it hard to switch
There are three main choices and they are all compromised in their own way. You just need to figure out what is important to you and what isn't.
What you shouldn't do is take too much notice of posts like these, I've read through the whole thing and haven't had any of the issues mentioned. I've also not seen a mention of the issues I do have. HN has a negative tone, it seems we like to whinge.
I think we're at the point where it'd be better if OS's were just a thin platform, and the updates to user facing features came piecemeal to different apps instead. IE, update Finder or Safari but leave the core functionality alone outside of bug fixes or very rare major upgrades. I'm so sick of having to update my OS every year.
They neither have the financial and time capacity left required for high quality, nor do they have the engineers and management to enable it anymore.
„Just get it out somehow.“
„Fixing bugs is not a KPI for our promotions and salary increases.“
Old stuff is practically abandoned. No one knows how to fix it anymore and it’s replaced instead, at best. Disdain for legacy. The only thing management gets excited about is the next shiny thing, currently tacking AI onto everything.
Can you name big companies where this did not happen?
No AI please for the love of the spaghetti monster. I'm so sick of having this shit shoveled into anything I'm trying to do these days. Disabling Siri all these years was bad enough.
So far Apple has kept it as a toggle in the settings, but it's easy bloat for it to keep spreading. Does anyone need AI in a text editor? No.
While I appreciate the sentiment, I think the single best use case for LLMs today is drafting text, so a text editor sounds like home for an AI assistant.
Where I fall on this is. "What is the tool for?" I still default to nano in the terminal for basic editing. Him and eMacs are entire ecosystems when all I need is a chisel.
In the general sense, notepad and TextEdit should just be less nerdy nano's. They always have been and that's what they were meant for.
If you need something to write reports, a book, etc then you use MS Docs, Google Docs, or whatever Apple provides. Those are the tools where adding AI might be useful as a feature, like the ribbon in Office.
Yawn, there is some variant of this story after every os release.
The articles specific gripes with macOS are Mail, Messages, and System settings. Fixing those does not require a ‘no new features’ (which was always BS) major release.
I am totally invested in the Apple ecosystem, which on principle I'm against (closed systems never sat right with me), but at the time (beginning ~2015) the products and services were so well integrated and genuinely improved my life it was hard to see how things could ever get this bad. I'd still never (ever) go back to Windows, and Linux doesn't have the same feel or ease of setup as macOS, but I am genuinely, deeply concerned about this trajectory for Apple. Albeit super opinionated, but I feel that macOS was the saviour of modern aesthetic computing especially when Windows started its rapid decline post 7. I’m fine trading some frustration—like extra steps for untrusted software—if it keeps macOS secure and fast, free of Windows-level adware or telemetry. But right now, macOS has never been in a worse state.
I recently emailed Tim expressing the same concerns as the article and regarding specific issues with Messages and Mail resource usage and was surprised to get a response from Craig requesting more information and sysdiagnose files, but this is where feedback ended unfortunately.
The current state of the macOS UI is atrocious, devices don't all need the same button shape or menu UX flow across all devices as they are inherently interacted with differently. A Mac isn’t an iPad — why force the same rounded buttons and simplified menus on both? They’re interacted with differently: keyboard and mouse versus touch. I have no idea why this is so difficult for execs to understand or important for them to change. Software teams at Apple are so lucky to have the Apple Silicon innovation on the hardware side, Intel Macs would catch fire on boot-up running any of the latest releases given how atrocious the resource usage is.
While I'm here whinging, the iOS swipe keyboard is garbage (almost totally unusable now) where before it was perfect with the innovative predictive hit-box expansion pioneered by Ken Kocienda. I think that's now been replaced with AI prediction which in 2025 I don't understand why it can be so embarrassingly bad. I had to upgrade to the iPhone Max recently to hit the letters properly. Also Apple I never want to tell someone to "duck off".
Initially I was understanding, but quite frankly now I'm just pissed that it has gotten to this stage, and there is no indication of resolution from execs about these issues.
I’m starting to worry that Apple could go off the deep-end - the way of Microsoft - coasting on hardware sales while letting software quality slide (albeit seeming intentional from Microsoft's side of the fence). I get it — software isn’t where the money is, hardware drives the business - but the two are inseparable BY DESIGN. When macOS struggles with basic functionality, it undermines the value of the Mac itself.
Author calls out some truly irritating defects, and Messages is rife with them. But there are bigger ones in that application on both Mac OS and iOS.
Topping the shitlist has to be the inexplicable splitting of group threads for random people in the group, even when everyone is using an iPhone. Suddenly someone in the group gets the messages by him or herself and can't reply to the group. And this occasionally also happens in one-on-one threads: I've had years-old (maybe decades-old) threads suddenly split off into a new one with a friend of mine for no apparent reason.
There's some fundamental incompetence in Message's design, and I'm sure that the addition of RCS has made it worse because it was slapped onto a rotten core.
Oh yeah, then there's the way Messages (or, to be fair, iOS) loses all of your contacts' names if you travel outside the country. This is another brain-dead defect: Just because you're in a new country code, your iPhone suddenly can't associate U.S. numbers with your contacts. How the hell does this go unfixed for one major iOS revision, let alone 15+ years?
Oh yeah, then there's the way Calendar "helpfully" changes the times on your appointments when you travel... meaning that you'll miss all of them if you travel east, because your phone will move them hours later. I mean... who lives like that? I you're going to London on business and the next day you have a meeting at 10 a.m., your iPhone will "helpfully" change that meeting to, say, 5 p.m.
So when the author muses about whether Apple developers ever actually use this stuff in the real world, the only logical answer is no. Or they just take so little interest in the functional quality of their product that they just check in some grossly defective trash and call it a day... and refuse to fix it year after year.
Or... they're not given time and resources to fix it. I'm pretty gentle when filing bugs about Xcode, because I'm sure they are understaffed. But at this point, the neglect has (or should have) exhausted every developer's patience.
Which brings us to a bit of hypocrisy in the post: "Apple is clearly behind on the AI arms race"
NO. Apple's sad capitulation to armchair "industry observers" and "analysts" has contributed greatly to the very defects the guy complains about. Apple should not have jumped on the "AI" hype in the first place. It does not serve Apple's product line or market. They are not a search company or gatekeeper to huge swaths of the Internet. If they wanted to quietly improve Siri and get it RIGHT, fine. But now they're embarrassed, and resources that should have been spent on QA have been squandered on bullshit "AI" that failed.
Snow leopard was, as you said, necessary in anticipation of the architecture change.
Now there's no such change, but instead AI, this weird new cross-cutting but fuzzy function touching everything that no one has ever used reliably at the scale of Apple devices. AI is impossible to reliably test, and all-too-easy to get embarrassing results. I'm glad Apple recently tamped expectations.
The relatively loose concurrency model in Apple's ARM has made it rival the network in introducing new failure modes Many quality issues cited have their root causes in those two sources of indeterminacy.
Amplifying these are the organizational boundaries driving software flaws. Siri as a separate organization with its own network-dependent stack is just not viable for scattering AI. Boosting revenue with iCloud services makes all roads run through the servers in Rome, amplifying network and backend reliability issues. I also suspect outsourcing quality and the maintenance of legacy software has reduced the internal quality signal and cemented technical boundaries, as the delegates protect their work streams and play quality theater. The yearly on-schedule cadence makes things worse because they can always play for time and wait for the next train.
And frankly (to borrow a concept from Java land), Apple might be reaching peak complexity. With hundreds of apps sporting tens of settings, there is simply no way to have a fast-path to the few things different people need. Deep linking is a workaround, but it's up to the app or user to figure that out. (And it makes me livid: I can't count how many important calls I've missed by failing to turn off "Silence unknown callers", with the Phone app settings buried 3 layers deep ON MY PHONE)
A short-term solution I think is not a rewrite but concierge UI setup: come to the store, tell the "geniuses" exactly what you need, and make shortcuts + myUI or whatever is necessary to enable them to make it happen. Then automate that process with AI.
That's something they can deliver continuously. Their geniuses can drive feature-development, and it can be rolled out to stores weekly and -- heavens! -- rolled back just as quickly. Customers and employees get the excitement of seeing their feature in action.
The model of sensitive form-factor designers working in quiet respectful collaboration to produce new curves every year is just wrong for today's needs. All those people standing around at Apple stores should instead be spending an hour or more with each existing customer designing new features, and they should be rewarded for features that take, and especially for features that AI can incorporate.
On the development side, any one should be able to contribute to any new feature, and be rewarded for it. At least for this work, there would be no more silos, and no massive work streams creating moral hazards.
The goal is to make software and a software development process that scales and adapts. It may start at 5% of new UI features, but I hope it infects and challenges the entire organization and stack.
Granted, it will take a famously hub organization and turn it into a web of hubs, but that in itself may be necessary for Apple to build the next generation of managers.
Look for how today's challenges can help you build tomorrow's organizations.
Apple has gone from
Company I loved to the one I hate! They are the new Microsoft! They have hired a bunch of idiots in their security team who are driving their user base insane! They can completely lock you out of all your devices with no recourse! I am starting to move away from this pathetic company”s products!
There are some factual "gaps" there about how good Snow Leopard was, but I understand the sentiment. As someone who's been a Mac user since System 6 and has been consistently using Macs alongside PCs _daily_ for over 20 years I can say that Apple's software quality (either in terms of polish or just plain QA) has steadily decreased.
It's just that me and other old-time switchers have stopped complaining about it and moved on (taoofmac.com, my blog, was started when I wrote a few very popular switcher guides, and even though I kept using the same domain name I see myself as a UNIX guy, not "just" a Mac user).
For me, Spotlight is no longer (anywhere) near as useful to find files (and sometimes forgets app and shortcut names it found perfectly fine 5 minutes ago), and there is no longer any way to effectively prioritize the results I want (apps, not internet garbage).
Most of the other examples in the article also apply, but to be honest I've been using GNOME in parallel for years now and I consider it to be my "forever desktop" if PC hardware can ever match Apple Silicon (or, most likely, if I want something that is _just a computer_).
> Most of the other examples in the article also apply, but to be honest I've been using GNOME in parallel for years now and I consider it to be my "forever desktop" if PC hardware can ever match Apple Silicon (or, most likely, if I want something that is _just a computer_).
I'm there as well. I've been really enjoying desktop Linux lately, but I can't go back to a non-Apple laptop at this point. There's just nothing else on the market that comes close, they all make some tradeoff I'm not willing to make - either screen, speakers, keyboard, heat/battery life/fan noise, touchpad, etc. Apple is the only one that has the entire package.
There's Asahi, but no thunderbolt yet and I'm not sure the future of that project with the lead burning out and quitting. I just want an Apple Silicon-esque laptop, no trade offs on components, that runs Linux, and there's no OEM out there that's offering that experience.
So, until that happens I'm staying on mac, and even with declining quality, it's not all that bad compared to the alternatives yet. I've learned to mostly work around/ignore the odd bugs.
Also while Apples software quality has definitely diminished over the years, Windows in the same period has utterly CRATERED. Like I get along fine with 11 for my gaming PC but with every single update one feature or another becomes notably broken.
My job gave me an expensive high-specced laptop with Windows on it. This is the first time I am stuck using Windows daily. It's W10. With Windows Defender and a bunch of windows, it starts to slowly become unusable. Today, it blue screened for me just fixing again (and again and again) the bluetooth headphones never gets automatically switched to when I turn them on. Forget about having Visual Studio open on it for an extended period of time.
Meanwhile, my 7-year old laptop with Fedora on it I type this is wonderfully snappy and stable. I started to get tempted to actually switch back to a Mac just to get some predictability and stability, but I have avoided macs for years. (And - never having to deal with constant line ending issues)
All I hear from other co-workers is how their perfectly specced laptops lag with Windows. It's freaking Stockholm Syndrome here!
Might be, still I no longer feel like baby sitting Linux on laptops as I used to, yes I had another go at it just last year I know how well it works, and I will never pay the Apple tax outside assigned work laptops.
Windows is indeed an execrable shitshow. Every aspect of it assaults the user with incompetence or outright hostility.
First is the endless badgering to log in, LOG IN, LOGGGG INNNNN with an asinine Microsoft account. If you can tolerate that and actually get the OS running, you're wading through a wonderland of UI regressions and defects.
The default hiding and disabling of options is infuriating. Try showing content from your Windows computer on a TV, for example. You plug your HDMI cable in, and you can select the TV as an external monitor in a reasonably logical manner. Great.
But wait... the sound is still coming from the laptop speakers. So you go to Sound in the system settings. Click on the drop-down for available devices. NOPE; the only device is the laptop speakers.
So you start hunting through "advanced settings" or some such BS. And buried in there you find the TV, detected all along, but DISABLED BY DEFAULT. WHY??? Not auto-selecting it for output is one thing, but why is it DISABLED and HIDDEN?
This is the kind of shit I have to talk my parents through over the phone so they can watch their PBS subscription on their TV. The sheer stupidity of today's Windows UI isn't just annoying, but it's demoralizing to everyday people who blame THEMSELVES for not being "computer-savvy" or slow learners. NO; it's Microsoft's monumental design incompetence and user-hostile behavior.
Microsoft doesn't get the relentless excoriation it deserves for its miserable user experience. There's no excuse for it.
I can't say I've ever had HDMI audio mystery-disabled when I try and use it, that's for sure a new one for me. That said the entire audio stack is an utter fucking nightmare. Selecting sound devices usually works, unless the game/software you're using either isn't set up to know about it, or isn't being told by Windows, either or. Then of course there's the fun game you play having two HDMI displays where Windows will constantly re-enable one you've disabled because it doesn't have any fucking speakers on it.
Win 11 must have some sort of contextual HDMI audio switching where it figures out exactly where you want your audio to go and then does the opposite. Because my Win 11 work laptop loves to re-enable HDMI audio and make it the active audio connection despite the fact that neither of my monitors have built in speakers.
I thought I was the only one having issues with HDMI sound, I usually unplug a few times before it switches the audio to the TV speakers.
Wow, it’s almost like Linux in the late 1990s/early 2000s now!
One word: Bazzite.
Fedora atomic distributions in general are great. I recommend Bluefin-dx over Bazzite (they’re both GNOME-based Fedora Atomic from the same group— universal blue) for developers, because it’s really easy to install the packages that Bazzite gives you, and it comes pre-installed with Docker.
I use Bluefin-dx as well, but pointed out Bazzite due to the mention of gaming. It's been rock solid for me for that use case.
Yep Bazzite is great. But the difference between them is mostly just the packages installed. To me it’s easier to install the gaming related packages from Bazzite onto Bluefin.
I have a problem with Docker sockets while installing onto Bazzite, and didn’t care enough to look further into it.
Is it comparable to gaming on Windows? Last time I tried the performance wasn't as good for some games (Deadlock) and it took ages to compile shader (it takes 30 seconds on Windows with the same specs)
I saw long shader compile times for at least one game last month, might have been Deadlock. I have a Radeon RX 7600 & Ryzen 9 7900X3D for reference.
There is mention on the arch wiki about enabling multi-threaded compiles, but also I have read you perhaps dont even need to precompile them now and possibly get better performance as the JIT compiles via a different vulkan framework (VK_EXT_graphics_pipeline_library).
I disabled pre-caching (which effects the compile too afaict) and never noticed any stuttering, possibly past some level of compute it's inconsequential. I also noticed that sometimes the launcher would say "pre-compiling" but actually be downloading the cache which was a lot slower on my internet.
Certainly on my (very) old intel system with a GTX1060, Sekiro would try to recompile shaders every launch, pegging my system at 99% and running for an hour. I just started skipping it and never really felt it was an issue, Sekiro still ran fine.
It’s comparable for nearly every game I’ve tried, and takes less than 30 seconds to compile Vulcan shaders on my rig.
That said, I think anything with kernel-level anti-cheat either does not run or runs poorly.
I also recommend Bluefin-DX. Been running it for about a year now and love it.
That’s not a word
Wiki tells me: https://en.wikipedia.org/wiki/Bazzite_Linux
More: https://en.wikipedia.org/wiki/Bazzite"That's" is two words, contracted.
I have some hope that Framework and AMD can fix some of those issues. Would love to try out their new desktop (because it's a simpler, more tightly integrated thing) and replace my Mac mini -- then wait for Linux power management to improve.
Linux power management is pretty good. The problem is that defaults favor desktop and server performance. On a MacBook Air 11, my custom Linux setup and Mac OS had the same battery autonomy, despite Safari being much more energy efficient.
The real problem is that, just like the grandparent post pointed out, Apple's software quality has been declining. The Tiger to Snow Leopard epoch was incredible. Apps were simple, skeumorphic, and robust.
Right now, the whole system feels a lot less coherent and robustness has declined. IMHO, there are not so many extra features worth adding. They should focus on making all software robust and secure. Robustness should come from better languages that are safe by construction. Apple can afford to invest on this due to their vertical integration.
https://en.wikipedia.org/wiki/Jerry_Pournelle#Pournelle's_ir...
The iron law of bureaucracy happens because humans have a finite amount of time to spend doing things. Those dedicated to bureaucratic politics spend their time doing that, so they excel at that, while those dedicated to doing the work have no time for bureaucratic politics.
It's related to why companies with great marketing and fund raising but mediocre or off-the-shelf technology often win over companies with deeper and better tech that's really innovative. Innovation and polishing takes work that subtracts from the time available for fund raising and marketing.
Great insight—thanks for sharing. It strikes me that bureaucracy is inherently self-perpetuating- once established, it rewards compliance over creativity, steadily shifting the culture until innovation becomes the exception rather than the rule.
Perhaps the real challenge isn't balancing innovation and marketing—it's creating a culture that genuinely rewards bold ideas and meaningful risk-taking.
> [Bureaucracy] rewards compliance over creativity
Imho, this is the wrong takeaway from parent's point.
Bureaucracy rewards many things that are actual work and take time. (Networking, politicking, min/max'ing OKRs)
Creativity and innovation are rarely part of the list, because by definition they're less tangible and riskier.
A couple effective methods I've seen to fight the overall trend are (a) instill a culture where people succeed but processes fail (if a risky bet fails then the process goes under the spotlight, not the person) and (b) tie rewards to results that are less min/maxable (10x vs +5%).
It seems most organizations naturally become more risk-averse as they age and grow since the business becomes more well-defined over time and there is more to lose from risky ventures. The culture has to reward meaningful risk-taking even when that risk-taking results in a loss, which can cause issues when people see the guy who lost a bunch of money getting a bonus for trying (not to mention the perverse incentives it may create).
What’s the actual argument that will credibly convince the top leaders of Apple, to push fixing MacOS up the list of priorities?
Because right now it’s clearly so far down, beneath dozens of other priorities, that expecting it to just happen one day seems futile.
IMHO, Mac OS X contributed decisively towards making Apple cool, which was followed by lots of boutique apps and the success of iOS. Loosing that critical mass of developers, even if it's a tiny userbase, would worry me if I was a top leader of Apple.
Apple has had a contemptuous attitude towards developers since.. the App Store? when the iPhone was out? The last two decades? They don't seem to care about this.
App Store was a big improvement for developers when it was new, relative to the alternatives.
The things it does may not seem important today, but back then even just my bandwidth costs were a significant percentage of my shareware revenue.
ObjC with manual reference counting wasn't much fun either; while we can blame Apple for choosing ObjC in the first place, they definitely improved things.
Apple was incentivized to deliver a polished App Store DX when it first released, because it meant apps which meant iPhone sales.
Now that the platform is cemented, they don't have an incentive to cater to developers.
This is a ret-con. If you - as a user - were philosophically and inherently against the App Store, then it may seem that way, I guess?
The reality is that there was a long period of time where Apple built up lots of goodwill with a developer ecosystem that exceeded by many orders of magnitude the pre-iPhone OS X indie Mac developer scene.
There were many, many developers that hadn’t even touched a Mac before the iPhone came out, and were happy with Apple, and now are certainly not.
>This is a ret-con...
Another way to see it is that people who programmed for Mac OS already had reasons to be annoyed by Apple (e.g. 64bit Carbon). The iPhone let it get new people, who eventually found out why the pre-iPhone scene felt that way.
And that’s a huge part of the reason why the Vision Pro will never take off.
I disagree - if the Vision Pro had some strong use-cases then developers would hold their nose and make apps for it. The platforms that get apps are the ones where businesses see value in delivering for them. Of course businesses prefer it when making apps is easier (read: cheaper) but this is not a primary driver.
I think the potential high-return use-cases for VR and AR are (1) games, (2) telepresence robot control, (3) smart assistants that label (a) people and (b) stuff in front of you.
Unfortunately:
1) AVP is about 10x too pricy for games.
2) It's not clear if it can beat even the cheapest headsets for anything important for telepresence (higher resolution isn't always important, but can be sometimes).
Irregardless, you need the associated telepresence robot, and despite the obvious name, the closest Apple gets to iRobot is if someone bought a vaccum cleaner because Apple doesn't even have the trademark.
3) (a) is creepy, and modern AI assistants are the SOTA for (b) and yet still only "neat" rather than actually achieving the AR vision since at least Microsoft's Hololens, and because AI assistants are free apps on your phone, they can't justify a €4k headset — someone would need a fantastic proprieraty AI breakthrough to justify it.
They stopped caring about developers when they dropped the price of the developer program and no longer gave you a T-shirt for being one.
> What’s the actual argument that will credibly convince the top leaders of Apple, to push fixing MacOS up the list of priorities?
Unrelenting bad press. People talking about nothing else but the decline of their software quality. We can already see that with the recent debacle which caused executive shuffling at the top of the company.
That shuffling was caused by Apple utterly failing to deliver a major feature, that was a key selling point for the latest generation of their hardware.
"Bad press" for their declining software quality is like people complaining there's no iPhone mini/SE anymore. Apple just doesn't give a fuck. They've joined the rest of the flock at chasing fads and quarterly bottom lines.
What was the major feature? The complete uselessness of “AI” on macOS? I updated and enabled all the AI features and I would ask Siri from my M1 and it failed every time. Would just continuously try with its annoying ping sound and never work. Blew my mind that they let this out.
Yeah I was talking about the "AI". It's such an utter failure that even Gruber has been calling it out.
It was already the same story with AirPower (the wireless charging mat). They've pre-announced it, even tried to upsell it by advertising it on the AirPods packaging. It just turned out physics is ruthless.
TBH I've been increasingly sceptical about voice assistants in the "pre-AI" era. I sold my HomePods and unsubscribed from Apple Music because Siri couldn't even find things in my library.
A few months ago, for quite a few years, Siri (in the car) would respond correctly to "Play playlist <playlist name>". Now it interprets that as of about two months ago that it should play some songs of the genre (I have a playlist named "modern").
No idea what changed, but it sucks.
> I sold my HomePods and unsubscribed from Apple Music because Siri couldn't even find things in my library.
I have almost the opposite problem this year. I tell the HomePod to turn the office lights on, it sometimes interprets this as a request to play music even though my library is actually empty, and the response is therefore to tell me that rather than turn on the lights.
Back in the pandemic, same problem with Alexa. Except it was in the kichen, so it said (the German equivalent of) "I can't find 'Kitchen' in your Spotify playlist" even though we didn't even have Spotify.
The decreasing effectiveness of machine-local search is just developers fucking up integrations and indexing.
This is a solved problem since ~1970 -- they're just not spending enough time on it.
I think the best argument is to remind Apple that they aren't selling the OS anymore, so they don't need a new version every year. And that macOS features is not what is pushing Mac sales. People aren't buying the M series machines because of the new macOS version, they are buying it because of the hardware. The M series chips are impressive and provide some great benefits that you can't get elsewhere.
And that hardware needs to be coupled with solid software to hook and keep people on this computer. So they can take more time to create more compelling upgrades and sand off more edges.
I think they need to desync all their OS's and focus on providing better releases. There really is no benefit to spending the day updating your Mac, phone, tablet, appletv, and HomePod. Especially when there are no good reasons to update. I feel like Apple became far to addicting to habit and routine that it's become more important to keep that than deliver product. Apple Intelligence is a good example of that.
It's all their OS software. The Messages app on 18.3 will just... not open the menu to send a photo attachment about ~10% of the time now...
I’m pretty sure the touch target only covers the text label. Tap anywhere other than the text labels and it does nothing but close the menu. Really bizarre.
Ya and it’s something in maybe the top 3 of most used user actions. Really indefensible
Apple is addicted to growth. It is as big as it should be, but it acts like an early stage startup always trying to build some new flashy thing to attract the next customer.
It's not Apple, it's capitalism. "Unlimited growth is the ideology of the cancer cell", yet for Apple (or any corporation), it's not good enough to sell 100,000,000 phones. Next year you must sell 105,000,000. And the year after 112,000,000 (not even 110 or your growth is stagnating).
So you get rid of removable batteries so customers have to toss their phones away more often, you gimp other feature, you spend more money on advertising than you did actually developing the product (read this bit several times until it sinks in how crazy it is, yet that's how we are with every major phone, every major movie, etc), and so on.
In 2016 RedLetterMedia did a breakdown of the movies that year, like top and bottom ten grossing movies. They stated that the advertising budget was the same as the production budget, unless they had knowledge of a different number.
I don't doubt that after 2020 the advertising budgets far outstripped the production budgets - multiple times; I am curious if that trend continues now, now that production isn't hamstrung by covid restrictions.
I'm sure everyone has seen this 100 times already but it really fits given modern advertising practice of every major company, especially in designing products to fit advertising plans.
There are also entire "industries" designed to shield people who want to find quality content from big 'A' advertising.
https://www.youtube.com/watch?v=tGKsbt5wii0 For context John Sculley said "Apple was the marketing company of the decade" in the 80s and Kicked Jobs out of Apple
I love how he uses the word “craftsmanship”, something that he understood quite well (considering how close he was working with people like Bill Atkinson, Andy Hertzfeld, Burrell Smith, etc).
Today engineers have to put up a fight to do anything resembling craftsmanship.
Do you want to retire?
Capitalism works this way because its customers, the investors, want it to work this way, because growth is how you get compound interest. Investors include anyone with an interest bearing bank deposit, a 401k, stocks, bonds, etc.
No growth means it would no longer be possible for an investment to appreciate.
I think of a similar thing when I see people complaining about how companies don't want to pay good wages. When you go shopping do you buy the $10 product or the $5 essentially equivalent alternative? Most people will buy the $5 one. If you do that, you're putting downward pressure on wages.
It's in your (purely economic) best interest for your wages to be high but everyone else's to be low. That's because when you're a worker you are a seller of labor, while when you're a customer you are an (indirect) buyer of labor.
Everything in economics is like this. Everything is a paradox. Everything is a feedback loop. Every transaction has two parties, and in some cases you are both parties depending on what "hat" you are wearing at the moment.
Growth isn’t necessary for high returns on equity. And it isn’t necessary for the investment to provide a return.
Equity returns ultimately come from risk premiums. (Which are small now in US equities BTW).
I’m invested in a microcap private equity fund that has returned >20-25% for years. They have high returns because they buy firms at 3-4x cashflow. You will get the high returns even with no growth. And with no increase in valuation. The returns are a function of an illiquidity premium.
With Apple explicitly, growth is expected given the valuation level. If it doesn’t grow, the share price will decline. So yes, in their case, firm is certainly under pressure to grow.
I also don’t agree with your “best interest for wages to be high and everyone else’s lower”. That is one aspect. It is more complicated. Consider Baumol Effect for starters.
I'm talking about macroeconomics, not micro. Risk premium means there is risk; not everyone gets a return at all. The entire society, as a whole, cannot experience consistent returns unless there is macroeconomic growth. If the pie is not getting bigger, someone has to be losing for someone else to gain.
Things like retirement, 401ks, etc., are society-wide institutions subject to macroeconomic rules.
I buy the $10 one because the margin has to come from somewhere. 9/10, the more expensive product is better.
Okay, if you are so confident in your convictions, convince enough Apple shareholders of this.
Why should I? That's not my job.
b- was probably hinting that the confidence of your conviction may be unsupportable?
It's also not my job to prove my conviction to aggressive internet people.
Regardless: that kind of message doesn't feel like HN-worthy productive discussion.
The actual argument would be people voting with their wallets and moving away from the Apple ecosystem, but this something impossible at least in the USA due to these bullshit "blue bubbles"
How do blue bubbles make any difference?
For most of the people here they don't. In popular culture and especially among teens and non-technical twenty-somethings there's this absurd "eww green text!" thing. A blue bubble is a status symbol for some reason, even though there's lots of Android phones that cost as much as iPhones.
At this point this is not an argument anymore, it’s just a thought terminating cliche.
Expecting users to change their daily habits in order to marginally improve the operating system of a trillion dollar company feels naive and a bit disrespectful to people who actually use these machines for work.
Even developers… the vast majority of developers ignored Apple for decades (and Apple was also hostile) and it managed to grow despite that.
Might as well ask people to contribute to Gnome or whatever so in the future everyone can go somewhere better. Feels way more feasible.
But the opposite is assuming that Apple has a "responsibility" towards its existing users and has to acknowledge their expectations from them.
A sentiment which famously led Steve Jobs to respond that he doesn't understand this, because "people pay us to make that decision for them" and "If people like our products they will buy them; if they don't, they won't" [0]
So according to Steve Jobs himself, the only Apple-acknowledged way to disagree with Apple is to NOT buy their products, and by extend into the services-world of today it means STOP USING their products.
Now Steve Jobs doesn't officially run this company anymore, but I don't see any indication that this philosophy has changed in any way.
[0]: https://www.youtube.com/watch?v=i5f8bqYYwps&t=772s
I don't think that's the opposite. The opposite is admitting that people have more than one reason to choose computers, and "voting with your wallet" only works for easily replaceable items, like groceries, clothing, etc.
Most people are not going to migrate to Android, Windows, Linux or whatever else just to make macOS marginally better.
And it's fine: marginal quality improvements of a product are not the "responsibility" of consumers.
This is absurd. You quite clearly don’t experience “blue bubble” envy yourself, because you’ve so obviously corrupted the sentiment, and argument.
Nobody is saying “gosh, macOS is so damned unstable, but I’ve gotta use it, because…blue bubbles on my iPhone?
You’ve just read some story about a company you already hate and are parroting it.
I don't think you're taking their argument in good faith. At least my read on what's being said here is that the psyops lock-in effects that Apple uses are too strong.
It's not just "blue bubbles," but "blue bubbles" seems like a good shorthand to me. It's also things like Hand-off, or Universal Control, or getting Messages on both iPhone and Mac seamlessly, or being on the same WiFi network allowing your iPhone/Watch to work as a tv remote for the Apple TV even if you're just visiting a friend. Features that any platform can and does enable, but that do to Apples vertical can work seamlessly out of the box, across all the product lines, while securing network access in the ways most users will want, creating a continuous buy-in loop wherein the more Apple products you buy, the more incentive there is to buy exclusively Apple.
And it's a collective "you." If your entire family uses exclusively Apple products, then you'll be the only person who can't easily use eg the Apple TV in the living room, or the person "messing up" the group chats with "User reacted with Emoji Heart to [3 paragraph text message]," or the one trying to decide between competing network KVM software platforms so that you can use your tablet when your 12-yo can just set their tablet next to their laptop and get a second screen without any setup. Nevermind that these are all social engineering techniques that only exist BECAUSE Apple chose not to play nice with others, they still socially reinforce a deeper commitment to Apple products with each additional Apple product in the ecosystem.
I say this as someone "stuck in the blue bubble" with eyes open about what's going on. I'll keep picking Apple as long as they're a hardware-oriented company, because their incentives are best aligned with mine for the consumer features they are delivering (for now): consumer integration that sells hardware. It's insidious in its own way, but not like "hardware that sells eyeballs" (Google/Meta) or "business integration that sells compliance" (Microsoft).
Probably nothing as it seems the major push is to get iPad OS and macOS in parity (and I assume to retire macOS completely)
> What’s the actual argument that will credibly convince the top leaders of Apple, to push fixing MacOS up the list of priorities?
That their own products depend on it because they developer their products in Mac. And that the professional people they pretend they cater to depend on Macs, and steadily move away.
> Robustness should come from better languages that are safe by construction.
Nahh, robustness comes from the time you can spend refining the product not from some magic property of a language. That can help but just a bit. There was no Swift in Snow Leopard. Nor there is not much Rust in Linux (often none) and even less (none) in one of the most stable OS available, FreeBSD.
They should just release a new version when the product is ready and not when the marketing says to release it.
> Linux power management is pretty good
> defaults favor desktop and server performance
Desktops are in S3 half the day consuming ~0 power. During use, electricity costs are so much lower than hardware costs that approximately nobody cares about or even measures the former. Servers have background tasks running at idle priority all day so the power consumption is effectively constant. Laptop and phone are the only platforms where the concept of "Linux power management" makes any sense.
My Mac mini (M1) sips ~6W idle and is completely inaudible. It acts as a desktop whenever I need it to, and as a server 24/7. I only power up my NAS (WoL) for daily backups. The rest of the homelab is for fun and experiments, but mostly gone.
"Idle" x86-64 SOHO servers still eat ~30W with carefully selected parts and when correctly tuned, >60W if you just put together random junk. "Cloud" works because of economies of scale. If there's a future where people own their stuff, minimising power draw is a key step.
The low power draw is definitely not exclusive to Macs, a similar x86 mini PC with Linux will also draw around 5W idle.
Does the mini PC go from zero to eleven though? Can I play BG3, Factorio, or Minecraft on the same hardware? Can I saturate a TB3 port? Transcode video? Run an LLM or text2img? Any of that while remaining responsive, having a video call?
If I already need a powerful machine for a desktop, why would I need a second one just so it can stay up 24/7 to run Miniflux or Syncthing? Less is more.
Yes; for about $1000. Eg:
https://www.bee-link.com/products/beelink-ser9-ai-9-hx-370
I have the ser-8 model, and can confirm everything works under Linux. This one has an 80 TOPS AI thing, since you asked about llms.
Huh, pretty cool. Would you mind submitting your scores to Geekbench? Can you also test idle power? Genuinely interested.
Currently also looking at Framework+AMD.
I want Mac hardware but Linux software. The other makers build quality is horrendous. Especially in the 13inch segment which is my favorite. Using a pretty old laptop because there is no replacement right now.
The new Ryzen AI looks really interesting! Sadly there is no Framework shop for me to look at it and they not ship to Japan..
Thinkpad line from Lenovo. Amazing build quality, and you can order them with Linux.
I have a P1 Gen 7 and it’s fantastic. It feels premium, and it’s thin, light, powerful, has good connectivity and 4K OLED touch screen. I’d take it over Mac hardware any day.
Aren't the only Thinkpads with displays in the 4k neighborhood 16-inches? The 14-inch Macbooks are 3024*1964 and have all been like that for a while. I don't know why the PC world (and Linux ready by extension) undervalues high DPI so much, because it makes it hard to consider going back.
The screen keeps me on macbooks as well (well, and the touchpad, the speakers, and the lack of fan noise).
But it is baffling how 1920x1080 (or 1200p) are still the "standard" elsewhere. If I want an X1 carbon, the best screen you can get at 14" right now is 2880x1800 (2.8k). Spec it with 32GB of RAM and it's clocking in at $2700, for a laptop that still has a worse trackpad, worse sound, and worse screen than a 14" MBP at $2399. And the Ultra7 in the thinkpad still doesn't beat the Mac, and it'll be loud with worse battery life.
There truly is nothing else out there with the same experience as an Apple Silicon MBP or Air.
So, my only options for the foreseeable future is wait for Asahi Linux, or suck it up and deal with macOS because at this rate I don't think there will ever be a laptop with the same quality (across all components) of the mac that can run Linux. The only one that came remotely close is the Surface Laptop 7 with the Snapdragon elite, but no Linux on that.
Non-Thinkpad Lenovos have some standouts too. I'm running Debian Stable on an AMD Yoga Slim 7 from a couple of years ago and sure, it's not an Apple, but for the £800 or so I paid for it, it's a really polished machine. Loads of ports, and it's approximately performance-competitive with a Dell XPS13 from about the same time that cost literally twice as much.
The one snag I ran into was that when it was new, supporting the power modes properly needed a mainline kernel rather than the distro default. But in the grand scheme of things that's relatively trivial.
I have an M1 Macbook Pro from work and honestly I'm not tempted to get one for myself. I am tempted by the M3 and M4 beasts as AI machines, but as form factors go I'm just not sold.
The biggest issue Framework have right now is shipping. I can order a ThinkPad practically anywhere. No so with Framework - they are literally leaving money on the table from what I would assume their core segment: affluent tech savvy users trying to get off the planned obsolescence cycle.
I'm not sure I follow. Your complaint is that Framework only sells direct and not through retailers?
No, there is a lot of us who live in countries that framework doesn't ship to.
And if you use a mail forwarder, they deny your warranty.
ditto (insert sad puppy face here)
Tell me you're from the US without telling me you're from the US.
Jokes aside, I had to wait years for Framework to finally allow shipping via a friend in Berlin. I think they ship to Sweden now—they seemed to have an unfortunate misunderstanding that they needed to produce a Swedish keyboard and translate their website before shipping here, which of course is poppycocks.
I am pretty sure that if you have reached the point that you are ordering a laptop online from a brand unknown to the general public, it means you are past the point you need the actual physical keys to match your keyboard layout on your OS settings. You could just have blank keys.
> The other makers build quality is horrendous
Out of curiosity, what are you basing this on? From having spoken to people who manage IT fleets, and being the person regular people ask for advice for what device to get, with the occasional exception (which Apple also had plenty of, cf. the butterfly keyboard), you get what you pay for. A 1k-1.5k+ Asus/Dell/HP/Lenovo will get you decent and good build quality.
The cheapest $500 Acer won't.
> A 1k-1.5k+ Asus/Dell/HP/Lenovo will get you decent and good build quality.
And it still won't be on par with a $999 apple silicon air, or a MBP.
I've deployed latitudes, precisions, and thinkpads. They all still make tradeoffs that you don't have to deal with on the mac.
The X1 carbon is probably the "best" but, even with that - you are still getting a 1920x1200 screen unless you spend more than a MBP for the 2.8k display (which is still less than the 14" MBP, and costs more than an equivalent specced M4 pro). The trackpad is worse, the speakers are worse, battery life is worse, and they're loud under load.
They're all fine for a fleet where the end user isn't the purchaser, which is why they exist, but for an individual that doesn't want tradeoffs (outside of the tradeoff of having to use macOS), there's no other option on the market that comes remotely close to the mac. For someone that wants Apple silicon MBP level hardware but wants to run Linux, there are zero options.
The screen is the most egregious tradeoff though, the PC world is still adverse to HiDPI displays and even on high end models 1080p or 1200p is still the standard. I can excuse poor speakers, it is a laptop after all, if I really had to I can deal with fan noise, but I shouldn't have to spend more than a MBP to get a decent 120hz HiDPI screen with sufficient brightness and color accuracy.
Fully agree. Asahi is pretty good now though, and the list of missing features continues to shrink.
At work, our Windows devs use expensive XPSs that are complete crap failing constantly, both hardware and software. As someone who used Latitudes and Precisions when these were the reliable workhorses you seem to describe, the new stuff is just outrageous. (My personal laptop is still an e6440).
My work machine is an M2 Pro MBP and except the shitty input HW (compared to the golden era of Thinkpads/Latitudes without chiclet keyboards) and MacOS being quite bad compared to Linux, it completely trounces the neighbouring Dells that constantly need repairs (mostly the USB-C ports and wireless cards failing).
Maybe if you run a fleet that's statistically true. If you're a regular person you can have incredible bad luck with specific models.
Got two "2k" Lenovos at 4 year intervals.
The first one worked fine but that model was known to have a weak hinge. Had to replace it three times.
The second one had a known problem that some units simply stop working with the internal display and the only solution is replacing the motherboard. My unit worked about a week for me. Seller refunded me instead of repairing because it was end of the line and they didn't have replacements.
Got a "2k" Asus ordered now, let's see how that goes :)
Compared to that, even the one emoji keyboard macbook pro that i had worked for years. The keyboard on those models is defective by design and kept degrading, and I still think Cook should take his dried frog pills more regularly, but the rest of the laptop is still working. Not to mention my other, older apple laptops that are still just fine(tm), just obsolete.
I think price isn't the only thing. PC gaming/consumer laptops lean pretty heavily on price to performance ratios and I think they cut build quality to do it. Business lines like Thinkpad/EliteBook tend to offer worse performance dollar for dollar but they are built better.
Yeah, I happen to need a laptop for that niche between gaming laptop and high end workstation.
Where's a Thinkpad that can run Maya comfortably for a student? AFAIK they only have models with Quadros that have anything but student prices.
So I'm stuck with "gaming" models.
Besides my daughter likes the bling :) If only they could sell me something that doesn't die in a week...
Consider a thinkpad or lenovog yoga pro. I don't think the difference is is that pronounced anymore, maybe it never was, but you always need to look at the premium segment. Somehow people end up comparing budget pc laptops and macbooks.
Asahi?
Yeah, I heard good things about it. I do a lot of gamey development stuff and x64 makes that easier. But Asahi seems to be catching up a lot recently, maybe I should look at it again! https://news.ycombinator.com/item?id=41799068
Asahi is an adventure. I am in the same camp where I got a MacBook for the hardware, but am really a Linux guy. I got really excited when the fex/muvm patches came out for Asahi, and switched to mainly booting it for a couple months. 80% of what I needed to do worked, but that 20% still wasn't there. It was mainly the little things too:
1. Display output from USB-C didn't work 2. Couldn't run Zotero 3. Couldn't compile Java bioinformatics tools 4. Container architecture mismatches led to catastrophic and hard-to-diagnose bugs
There were things that worked better, too (better task management apps, and working gamepad support come to mind). Overall, even though I only needed those things once or twice a week, the blockers added up and I erased my Asahi partition in the end.
I really appreciate the strides the Asahi project has made (no really, it's tremendous!), and while I would love to say that Linux lets me be most productive, features like Rosetta2 are really integrated that much better into MacOS so that I can't help but feel that Asahi is getting the worst of both worlds right now. I'll probably try again this summer and see what has developed.
What dou you mean with more integrated? It is a regular desktop PC with an apu (like is totally common for office PCs, just bigger) and soldered instead of upgradeable ram.
It would be kind of funny, but also very sad, if Apple guys mistook the copying of apple's worst behaviour - producing throwaway devices - as a sign of quality. Though I think we are there for years now with phones, I wouldn't expect such thinking here.
It is fully designed around the limitations of that particular APU and makes the best of it, without being a generic motherboard.
It is "integrated" in the way that the processor is an APU that has specific memory bus requirements. That's all. It is not an integrated software-hardware system that is finetuned, and that board is not any better than a a generic motherboard would be for a regular processor.
My point is that this system is not integrated in the way apple fans usually define the word. I'd claim it is not integrated at all. It is a regular PC (but with soldered ram), which is exactly like framework announced it.
There should be no need to sprinkle some apple marketing bs on that to make it attractive.
I really wish everyone would stop entertaining these borderline crackpot hypotheticals that all rely on the notion of “those damn Apple dummies not getting it!”
It’s absurd.
Thanks, what a nice characterization.
As someone who actually studied human computer interaction, and since I had to work with borderline unuseable macs multiple times in my career now, plus as someone seeing the utter failure of relatives in just using an iPhone (bought since "it is so much easier", now not even able to call from the car system since it is so buggy), the Apple popularity is absolutely a case where you have to look at external factors like social status. And if that translates to "the users are dummies" to you, then that's your interpretation. Plus yes, translating marketing/status concepts like a bogus "integrated" status absolutely is interesting, thus my intent to clarify whether that is really happening here (plus some criticism, admittedly).
Probably not worth it going further into this though, it will only derail.
As a former Mac user, I'm really happy with my System76 linux laptops. The only tradeoff is the terrible built-in speakers. My Lemur is lighter and has better battery life than my Macbook Air and has been bulletproof despite my ill treatment. Each of my Macs, however, have had various hardware failures or the famous keyboard recall on the horrible touchbar Macbook Pro. I also prefer matte screens to glossy, so that's a win for me, but ymmv.
The screen quality is why I didn't get a system76 laptop the last time I did a refresh a couple years ago.
I have found this old comment:
* Battery life is a lie, especially since it drains almost as much battery closed as it does open.
...
Overall, I think I am probably going to switch back to a macbook after this, not being able to go a day without charging and your laptop always being on low battery is a bit anxiety inducing.
https://news.ycombinator.com/item?id=38206173
They must have a bug, because my System76 laptops drain way less with the lid closed than my Macbooks.
This really is exactly how I feel. There are too many tradeoffs to switch to non-Apple hardware at this point. I'd love to run Linux/BSD full-time, as many of the apps that I frequently use on my Mac are FOSS (e.g., R, PyCharm, darktable, etc.) I've been a Mac user since 2002, and Mac OS X served as my gateway to the Linux/BSD world (that, and a short-lived use of RH 6.2 on an old Dell laptop). IMO, macOS really does need a Tiger/Snow Leopard-esque release, but I'm not sure the vast majority of macOS users would even appreciate such a release.
The newer XPS 13 comes with snapdragon x elite now (Qualcomm's answer to Apple silicon). Curious if anybody here runs Linux on one of those
I feel like this is the first real step towards a Mac like experience on a Linux system.
That is highly unlikely to happen in the near future (say 2 years).
It's still waiting for good linux support.
Not quite what you're after but if you want a fanless option that runs full linux and doesn't use much battery, the new argon 40 CM5 laptop that's being built looks like it could be viable as long as you'd be happy with that much of a drop in performance and a few pi based niggles (No USB C video, only one pcie lane for the SSD, etc.)
https://liliputing.com/argon40-is-making-a-raspberry-pi-cm5-...
Your loss. I haven't been able to tolerate the MacOS experience since Catalina, running GNOME with a Magic Trackpad has felt head-and-shoulders better for the past 3 years at least. Apple Silicon is neat but was never an option for native development in my workflows anyways. The software matters more to me, and MacOS has been sliding down the subscription slopware slope for years now.
I am perfectly happy to use last-gen hardware from Ebay if it runs an OS that isn't begging me to pay for subscriptions and "developer fees" annually. My dignity as a human is well worth it.
The reason that keeps me on Windows, is that you left out of your list gaming and 3D graphics on laptops.
Metal isn't really on pair with Vulkan and DirectX in terms of relevance for graphics programming, the M chips aren't up to NVidia ecosystem, SYCL, the two major compute APIs for any kind of relevant GPGPU workloads, and thus don't really matter.
And gaming, well, even though all major engines support Metal, there is a reason DirectX porting kit is now a thing.
So why pay more for a lesser experience, and then there is the whole issue macOS doesn't have native support for containers, like Windows does (their own ones), and WSL is better integrated and easier to use than Virtualization Framework.
Gaming is pretty great on Linux now. I just finished a little Elden ring session and it still blows my mind that when I close the game my Linux desktop is there behind it. No more dual booting, hopefully will never need windows for anything ever again.
You mean translating Windows and DirectX APIs is great, there is hardly a Linux gaming ecosystem.
Proton is the acknowledgment of Valve's failure to entice game studios, already targeting Vulkan/OpenGL ES/OpenSL on Android NDK, Switch (which has OpenGL 4.6/Vulkan support), or on PlayStation (Orbis OS being a FreeBSD fork) to target GNU/Linux.
I rather have the real deal, not translations.
As an actual dyed in the wool game developer.
There’s no such thing as “native”, all the things you’re talking about are translation layers for hardware instructions themselves, and the overhead for doing software based translation is significantly less than hardware accelerated virtual machines- and we as an industry love those.
The reason for this is because the translations are very cache friendly and happen in userland, so the performance impact is negligible, and the scheduler on Windows is so poor compared to Linux ones that it’s even common for games to perform better on Linux than on Windows.. Which is crazy when you consider the difference in quality of the GPU drivers.
I understand that you want it to “just work”, but that tends to be the experience anyway.
You can do what you want, it’s your life, but this is not a terribly good excuse. Valves “failure” is essentially rectified.
I will add though, that it’s actually Stadia that made linux gaming the most feasible, many game engines (all of the ones I worked on) were ported to Linux to run Stadia, those ports changed essential elements of the engine that would have been slow or difficult to translate; so when Proton came around quite a lot of heavy lifting had gone away. I only say this because Valve gets some credit for a lot of work our Engine programmers did to make Linux viable.
> and the scheduler on Windows is so poor compared to Linux ones that it’s even common for games to perform better on Linux than on Windows..
I play most of my games in a window and switch away a lot. A million years ago when I was still playing world of warcraft, the system overall was much more responsive on the same hardware with wow on wine on linux than with wow natively running on windows :)
> it’s actually Stadia that made linux gaming the most feasible
Stadia was the most predatory gaming offering aside from IAP games, sorry. Buy your games again on top of the subscription? Lose them when Google cancels the service? No thanks.
Nvidia's GeForce Now was a lot more honest. Pay for the GPU and streaming, access your owned games from Steam. I'm not using it any more so I don't know how honest they still are, but I did for like a year and it was fine(tm).
The fact that Stadia advanced wine compatibility is great, but technical reasons aren't the only reasons that make a service useful to your customers.
OP is talking about Google (Stadia) throwing money at the problem and incentivizing game engine companies to better support Linux. They’re not talking about pro or anticonsumer the tech was.
I know and even agree with them. I'm also surprised that Stadia was useful for something...
So how many of those ported game engines are actually making a different on GNU/Linux gaming today?
There is certainly such thing as native, one thing is the platform where the APIs were originally designed for and battled tested, and the other is other platform emulating / translating them, by reverse engineering their behaviours with various degrees of success.
Valve's luck is that so far Microsoft/XBox Gaming has decided to close an eye on Proton, and it will run out when Microsoft decides it has gone long enough.
> So how many of those ported game engines are actually making a different on GNU/Linux gaming today?
Not sure, Unreal Engine is pretty popular though and Snowdrop is increasingly common for Ubisoft titles.
https://www.protondb.com/app/2842040 https://www.protondb.com/app/2840770/ https://www.protondb.com/app/365590
> https://www.protondb.com/app/2842040
Star Wars Outlaws
Natively Supports: Windows only
> https://www.protondb.com/app/2840770/
Avatar: Frontiers of Pandora
Natively Supports: Windows only
> https://www.protondb.com/app/365590
Tom Clancy’s The Division
Natively Supports: Windows only
----
You were saying?
I know, I worked on those games.
Specifically, I worked on those games so I know what they natively support and how things transpired behind the scenes.
Proton has absolutely no hope of working without the changes we made because of stadia, the code we wrote was deeply hooked into Windows and we made more generic variants of many things.
The Division 1 PS4 release was significantly shimmed underneath compared to the win32 and xbox releases: this became much less true over time as porting the renderer to linux (specifically debian) made us genericise issues across the OS’s and when Div2 shipped we had a lot more in common across the releases; we didn’t rely on deep hooks into Microsoft APIs as much
> this became much less true over time as porting the renderer to linux (specifically debian)
Strange how you ported the renderer to Debian, and yet you couldn't even find a link to a game that has a native Linux support.
Was there ever a port?
> Proton has absolutely no hope of working without the changes
You keep saying this as the absolute truth, and yet at the time when Stadia launched Proton already had 5k working games under its belt.
Strange how Stadia is this monumental achievement without which Linux gaming wouldn't happen according to you.... and yet no one ever mentions Stadia ever contributing any code to any of the constituent parts of what makes Proton tick. Apart from the changes that engines supposedly made to work on a yet another game streaming platform.
I don’t know how to say this without being unkind.
There is a functioning version of The Division 1, Division 2, Avatar and Star Wars outlaws that run on Linux internally at Ubisoft.
Nobody will release it because it can’t be reasonably QA’d. (Stadia was also very hard to QA, but possible, as it was a stable target and development was essentially funded).
I’m not sure what your problem is; I said - as clearly as I can - that architectural changes to the engine were neccessary for proton.
I know this, for an absolute fact, because Proton was a topic when I worked on those games and it was not until Stadia (codename Yeti) was on the roadmap, and our rendering architect lost all his hair working on it - that Proton started to even function slightly.
I’m not shilling for Stadia - there’s nothing to shill for, it is dead.
Get over yourself, if you don’t like the truth then don’t start going in on me because my reality does not match your fantasy. Sometimes corporations do things accidentally that push other things forward unintentionally.
I just want to share my thanks to Stadia because I know for a concrete fucking fact that at least some of the AAA games I worked on would not function at all on Linux without Stadias commercial interference.
> I’m not sure what your problem is
All I'm saying is that "it’s actually Stadia that made linux gaming the most feasible" statement is at best contentious because in reality gaming on Linux was already made (more) feasible when Stadia had only just launched.
And Stadia used the same tech without ever giving back to Proton at all (atl least nothing I can quickly discover). So the absolute vast majority of work on Proton was done by Valve which you dismissed as "when Proton came around" (it came around before Stadia) and "quite a lot of heavy lifting had gone away" (Valve did most of the heavy lifting).
That's the extent of my "problem".
> at least some of the AAA games I worked on would not function at all on Linux without Stadias commercial interference.
So, not "actually Stadia that made gaming feasible on Linux" but "because Stadia used all the same tech, and there were possible commercial incentives early on until Google completely dropped the ball, bigger studios also invested in compatibility with the tech stack"
You’ve taken a weird position here.
Stadia did a lot to help by being a stable target and by being seen as commercially viable. Google also helped a lot to aid developers, not just financially.
That they didn’t contribute code to proton doesn’t factor at all, I just hate to see people not get their dues for their part in the prolification of Linux gaming- because I saw it first hand.
You are labouring under the delusion that I’ve implied Proton did nothing, no, they levied a lot of existing technology and put in a lot of polish. They were helped by Stadia, by Wine, by DXVK and others.
They didn’t do it alone, that doesn’t minimise their contribution, it contextualises them.
Also: Stadia ports of games were native, they did not use proton- it was architecture changes of the games themselves that made proton work better- not Google making proton itself function better.
That proton was running some games is a weird revisionist take, very few AAA games ran at all, those that did were super old and there was always some crazy weird bugs- proton got better but also AAA games coalesced into conforming to linux-y paradigms underneath better- so support got better much quicker than expected. You can even see this if you track the “gold” released games over years, some of the worst supported games for Proton are from 2015-16; before stadia but after game complexity started rocketing up with next game engines of the day.
Hope that helps, because honestly this conversation is like talking to a brick wall.
> They didn’t do it alone, that doesn’t minimise their contribution, it contextualises them.
Oh, you very much minimised their contribution. From "when Proton came" (again, Proton came before Stadia) to "Stadia made gaming feasible on Linux" (when Proton made it feasible before Stadia)
> Also: Stadia ports of games were native, they did not use proton- it was architecture changes of the games themselves that made proton work better- not Google making proton itself function better.
So, Stadie games were Linux ports. But as a result of this there are still literally no Linux ports. None of the tech behind Stadia ever made it back into software behind Proton. And "native stadia ports" are somehow responsible for more games that target Windows and DirectX to run better via Proton
> That proton was running some games is a weird revisionist take
Funny to hear this coming from a revisionist. I literally provided you with links you carefully ignored
--- start quote ---
A look over the ProtonDB reports for June 2019, over 5.5K games reported to work with Steam Play
https://www.gamingonlinux.com/2019/07/a-look-over-the-proton...
--- end quote ---
> You can even see this if you track the “gold” released games over years, some of the worst supported games for Proton are from 2015-16; before stadia but after game complexity started rocketing up with next game engines of the day.
Or because the actual heavy lifting that Valve did with Proton paid off, and not the nebulous "native ports" and code that never saw the light of day.
> because honestly this conversation is like talking to a brick wall.
Indeed it is.
Yet most games don't make use of Unreal targeting capabilities of GNU/Linux, rather Proton.
Unreal can target Linux, sure, but not all of the plugins you might use will, nor any of your own plugins.
Unreal is almost worse because their first party tools (UGS, Horde) will not work on Linux, so you have to treat linux as a console, and honestly the market share isn't there to justify it.
Which kind of validates the point of Valve's "success" in a Linux ecosystem.
Speaking from experience, Helldivers 2 and Monster Hunter Wilds both ran better on Linux from day one before any special fixes and still do - I'm not sure what "original design and battled testing" is worth or good for if the underlying Kernel and/or OS is a mess.
Stadia's impact on gaming in general is next to zero. And given that the vast majority of gaming on Linux is happening via Proton, its impact on gaming on Linux is similarly next to zero.
What games have you made to justify this statement?
I worked closely with productions using proprietary game engines, I feel qualified in stating that Stadia had an outsized impact on our development process in a way that helped proton succeed.
That you don’t see it as an end user, is exactly my point.
> What games have you made?
You don't have to be a chef to judge what's coming out of the kitchen.
What is the objective impact of Stadia which at its height had a whopping 307 titles [1]? At the time of writing ProtonDB lists 6806 titles as "platinum, works perfectly out of the box" and 4839 games as "gold, works perfectly after tweaks". Steam Deck alone has almost 5x the number of games with "verified" status [2].
What games are being made for Linux thanks to Stadia, and don't just target DirectX and run through Proton? How many Stadia games were ported to Linux thanks to Stadia?
Also, to put things into perspective. Proton was launched in 2018. Stadia was launched in 2019.
In 2019 there were already over 5000 games that worked on Proton. [3]
In 2022 there already were more games with verified status for Steam Deck than there were games for Stadia, and 8 times more games verified to work by users [4]. Stadia shutdown was announced half a year after the article at [4].
Stadia had zero impact on gaming in general and on gaming on Linux in particular as judged by the results and objective reality. Even the games you showed as examples don't support Linux, only target Windows, and are only playable on Linux through Proton [5]
> I feel qualified in stating that Stadia had an outsized impact on our development process in a way that helped proton succeed.
> That you don’t see it as an end user, is exactly my point.
It's strange to claim things like "when Proton came along" when Proton was there before Stadia and already had over 5k games working in the year when Stadia only just launched.
It's strange to claim outsized impact on development process when there are no outcomes targeting anything even remotely close to Linux development, with studios targeting Windows as they have always done.
It's strange to claim Stadia had outsized impact when none of the work translated into any games outside Stadia. When Stadia did not contribute any significant work to the tech that is running Proton. In 2022 they even started work on their own emulation layer that went nowhere and AFAIK never contributed to anything [6]
It's strange to claim that "it's actually Stadia that made Linux gaming feasible" when there's literally no visible or measurable impact anywhere for any claim you make. Beyond "just trust me".
[1] According to https://www.mobygames.com/platform/stadia/ According to wikipedia, at the time of shutting down it had 280 games, https://en.wikipedia.org/wiki/List_of_Stadia_games
[2] https://www.protondb.com/dashboard
[3] https://www.gamingonlinux.com/2019/07/a-look-over-the-proton...
[4] https://www.protondb.com/news/how-many-games-work-on-linux-a...
[5] https://news.ycombinator.com/item?id=43503018
[6] https://www.gamingonlinux.com/2022/03/google-talk-about-thei...
You are literally arguing that your ignorance is as valid as my experience. And you’re arguing that you didn't see the impact; which was kinda my entire point - there was impact beyond what was visible that propelled Proton forward.
You don’t know how the sausage is made just because you ate a hotdog.
Maybe you should consider things more carefully before making yourself look like an idiot on the internet and simultaneously raising my blood pressure.
Strange take. Proton is an acknowledgment that the windows apis are the de facto for gaming. Not sure why the runtime matters. Some games ever run better. Not sure why that’s not the ”real deal” but whatever I’m glad you’re happy with your spyware gaming OS.
Not really, I rather play on the platform they were designed for in first place.
Do you own a Playstation? :)
If you're playing the likes of Fromsoft/Resident Evil/Kojima games on a PC, be it Windows or Linux, you're not playing on the platform those games were designed for.
The problem with your reasoning is that Windows/PC doesn't need to emulate Orbit OS and LibGNM, Sony also supports Direct X and Win32 directly on their engines.
"supports" as in I see articles in the PC gaming press about technical problems with From/Kojima games a year after I've finished said games on console with zero issues.
Where is Windows Proton like for Playstation APIs?
"Technical issues" has many meanings.
The point is, "the platform those games were designed for" is the Playstation API for some titles. So you'll get the best experience on there.
Unless you play benchmarks instead of games, and care about 8k/1200 fps of course.
The point is, the game uses the platfrom APIs on the target OS, and doesn't need to emulate API from 3rd party platforms.
I'd rather use Linux and game with an imperfect translation layer, than put up with Windows.
Proton is a lesser implementation of Windows API, sure, but Windows itself is a lesser implementation of an operating system for power users.
It's not really a failure. Linux distribution and diverse ecosystem brings a level of complexity. The only way to support it long term is to either having your team continuously update and release builds of the game to cater for that which is an impossible task to ask for a lot of studios.
The initial approach of runtimes did help but it's still has its limitation.
If now a studio just need to test their game under a runtime+proton the same way they would test a version of Windows to ensure it's working under Linux it's a win/win situation. Proton becomes the abstraction of the complex and diverse ecosystem of Linux which is both its strength and weakness.
Another solution would have been everybody using the exact same distribution which would have been way worse in my opinion.
And who knows, maybe one day Proton/Wine would be the Windows userland reference and Windows would just be an implementation of it :D
So it is not really a failure when the solution is to adopt Windows and Direct X translation API?
I thought only Apple had a distortion field.
It's a complete failure across the board to create any compelling graphics APIs for desktop platforms (both Linux and Mac) beyond DirectX.
That’s not the goal though. The goal is to play games on Linux. If Valve’s goal was to end up with a Linux-specific graphics api for most games that run on Linux then they provably would have tried to do so.
Is it a failure when everyone writes javascript/html/css instead of doing native applications for non gaming?
Most of HN seems to think using a web browser as a translation layer is a good idea, yet they complain when games use a translation layer.
Is it a failure when everyone writes javascript/html/css instead of doing native applications for non gaming?
Yes?
Yes, definitely, that is why now we have ChromeOS developers instead of Web developers.
You better have made this comment via a native windows hacker news desktop application.
I would gladly have used one, if it existed without being a web widget wrapper.
I miss the days of native apps with Internet protocols, and USENET discussions.
Hacker news is a web site, not an application.
A web site makes for a crap application and the reverse.
When I was gaming on Linux, every game with a native version worked better using the Windows version in proton. I think the only exception was Factorio.
Gaming/WSL kept me on Windows for a lot of the last decade, however after Windows 10 became EOL'd and Windows started turning into ad/spyware I finally gave it up over a year ago after 25+ years on Windows Desktops.
Anyway Linux is liberating, Fedora Desktop is great, no ads in the OS, a Software Store/Installer I actually like to use, curated by usefulness instead of scam Apps. All my Windows Steam Games I frequently use just worked, I have to login to X11 for 1 title (MK11), but everything else runs in the default Wayland desktop. Although I'll still check protondb.com before purchasing new games to make sure there'll be no issues. Thanks to Docker, JetBrains IDEs and most Daily Apps I use are cross-platform Desktop Web Apps (e.g. VS Code, Discord, Obsidian, etc) I was able to run everything I wanted to.
The command-line is also super charged in Linux starting with a GPU-accelerated Gnome terminal/ptyxis and Ghostty running Oh My Zsh that's enhanced with productivity tools like fzf, eza, bat, zoxide and starship. There's also awesome tools like lazydocker, lazygit, btop and neovim pushing the limits of what's possible in a terminal UI and distrobox which lets me easily run Ubuntu VMs to install experimental software without impacting my Fedora Desktop.
Image editors is the one area still lacking in Linux. On Windows I used Affinity Designer/Photo and Paint.NET for quick edits. On macOS I use Affinity & Pixelmator. On Linux we have to chose between Pinta (Paint.NET port), Krita and GIMP which are weaker and less intuitive alternatives. But with the new major release of GIMP 3 and having just discovered photopea.com things are starting to look up.
I hardly find anything interesting about command-line, I grew up in a time where the command line was the only way to interact with home computers, it fails on me the interest on staying stuck in up to early 1980's computing model.
Xerox PARC is the future many of us want to be in, not PDP-11 clones.
Weird flex, most commands, utilities, server software and remote tools are going to be run are going to be from the command-line. All our System Administration of remote servers uses the command-line as well since exclusively deploying to Linux for 10+ years.
Sure you can happily avoid the command-line with a Linux Desktop and GUI Apps, although as a developer I don't see how I could avoid using the terminal. Even on Windows I was using WSL a lot, it's just uncanny valley and slow compared to a real Linux terminal.
> Weird flex, most commands, utilities, server software and remote tools are going to be run are going to be from the command-line.
It's not a weird flex. Weird flex is this: "The command-line is also super charged in Linux starting with a GPU-accelerated Gnome terminal/ptyxis and Ghostty running Oh My Zsh" and then listing a bunch of obscure personal preference tools that follow trends du jour.
That’s not a flex, it requires no skill to install software, they’re just some of the better tools you can install to boost productivity in Linux terminals. I doubt they’re obscure to any Linux CLI user who spent time on improving the default OOB UX of bash terminals.
And you just alias them, so you can keep using the core utility names to use them.
None of those tools are obscure, they might just seem like it from the perspective of mouse dependent vscode users.
Sadly, because many of the authors live stuck in UNIX cli model instead of Xerox PARC REPL approach.
It is like praising Ratatui for what Turbo Vision, Clipper and curses were doing in 1990's, if I wanted that I would kept using Xenix and MS-DOS.
There are huge interoperability advantages to CLI and TUI tools. Composing them, using script(1) on them, etc, are much simpler than the same for GUI tools. They are also much easier to rapidly iterate on.
GUIs are very useful but they are not clearly better (or worse) than CLIs.
REPL-ify your command line then? There's nothing that says you have to be stuck on bash for your command line needs. https://www.nushell.sh/
Already doing that a much as possible.
Gaming on windows is fine, but there's no reason to use windows for anything else. Dual boot to linux for a better desktop and none of the crud that Windows 11 has in it.
I haven't been gaming since when there was huge gap between graphical possibilities and actual design (that is beginning of 3d era) - so I do not miss that. However I can see the decline in macOS, like pushing for 'apple intelligence', more and more restricting gatekeeper, iOS-ification of desktop (ie.: mentioned system settings), constant connections to AWS, etc.
But since I'm not gaming I cannot imagine going back to Windows. On the other hand I'm quite enjoying Linux...
> So why pay more for a lesser experience
...however, with few exceptions, I haven't used mouse in decade... and I haven't found anything like MBP's touchpad yet. Maybe I just need to do better research.
> Metal isn't really on pair with Vulkan and DirectX in terms of relevance for graphics programming
As if Vulkan had relevance to graphics programming.
> and WSL is better integrated and easier to use than Virtualization Framework.
you don't need WSL on MacOS because, well, MacOS is already a *nix environment.
> As if Vulkan had relevance to graphics programming.
It surely has on 80% of a mobile platform, and on a small handset from this little japanese games company.
> troupo 3 hours ago | parent | context | flag | on: Apple needs a Snow Sequoia
> Metal isn't really on pair with Vulkan and DirectX in terms of relevance for graphics programming
As if Vulkan had relevance to graphics programming.
> you don't need WSL on MacOS because, well, MacOS is already a *nix environment.
Agree if everything one wants out of it is classical UNIX experience, that breaks down when having to work with containers and kubernetes locally.
> It surely has on 80% of a mobile platform,
And which platform brings in more money?
> and on a small handset from this little japanese games company.
And not on PS, not on XBox, not on PC (that is, no first-party support).
> you don't need WSL on MacOS because, well, MacOS is already a *nix environment.
Right up until you need Linux syscalls. If you're doing anything with containers it's an annoyance.
My ideal laptop would be the macbook trackpad, monitor and battery life stuck inside any thinkpad. Or just anything non MacOS, even Windows, in the macbook. I despise MacOS with every fiber of my being, but the hardware is damned good.
Windows is far worse.
I would highly recommend giving a virtualized arm Linux installation a go, using the built in Apple frameworks who are blazingly fast.
Have a look at this sample code: https://developer.apple.com/documentation/virtualization/cre...
Why not start supporting Asahi financially, if you aren't already?
i used to run debian on an intel macbook air. regular debian. was pretty nice.
Apple's software quality (either in terms of polish or just plain QA) has steadily decreased
I think the decline of software went hand-in-hand with the decline of the native indie Mac app. They still exist, but when I started with the Mac (2007), there was a very rich ecosystem of native Mac apps. Most stood head and shoulders above their Linux and Windows counterparts.
Apple has nearly destroyed that ecosystem with: race-to-the-bottom pricing incited by the App Store; general neglect of the Mac platform (especially between ~2016 and Apple Silicon); and a messy reactionary toolkit story with Catalyst, SwiftUI, etc. The new toolkits seem to imply that Apple says that it's the end of AppKit, but most SwiftUI applications are noticeably worse.
With their messy toolkit story and general neglect, developers have started using Electron more and more. Sure, part of the popularity is cost savings, since Electron apps can be used on multiple platforms. But part of it is also that a Catalyst or SwiftUI app is not going to provide much more over an Electron app. They will also feel weirdly out of place and you become dependent on Apple working out quirks in SwiftUI. E.g. 1Password tried SwiftUI for their Mac app, but decided in the end that it was an uphill battle and switched to Electron on Mac instead.
I recently bought a ThinkPad to use besides my MacBook. Switching is much easier than 10 or 15 years ago, since 80% of the apps that I use most frequently (Slack, Obsidian, 1Password, etc.) are Electron anyway. Even fingerprint unlocking works in 1Password. I was vehemently anti-electron and still don't like it a lot, but I am happy that it makes moving to a non-Apple platform much easier.
I think most of this is just downstream of the Mac being eclipsed by the iPhone in terms of Apple’s revenue. The Mac just isn’t critical to Apple’s business like it was in 2009 when Snow Leopard came out. They would have started development on SL in 2008, when the iPhone was still a fairly niche product and there wasn’t even an App Store.
Now, ios gets the executive attention and it will generally get the best developers assigned to it, and the Mac has to live with the scraps.
Yeah I think this is the one, in terms of number of users, revenue, etc. The iPhone is more than 50% of their revenue, Mac is only ~8. Lower volume and higher price, but it doesn't come anywhere near their phone. Same with tablets, although they share an app revenue income stream with the iphone which makes up for the difference in hardware sales.
By itself Apple's AirPods alone generates twice the revenue ($18B) vs Macs ($8B). So we can see where Apples priorities are.
> I recently bought a ThinkPad to use besides my MacBook.
I'm on the same boat here. Something is driving me away from my MacBook M1(Pro? Don't even know). I have a gut feeling that it's macOS but can't really put a finger on it yet.
Bought a heavily used ThinkPad T480s (from 2018) and replaced almost every replaceable part of it, including the screen. Being able to replace many parts easily is a nice touch since I am using MacBooks since 2007 exclusively. Guess that's why I somehow overdid it here. Slammed Pop!_OS 22.04 on it and I'm very pleased with the result. The first Linux desktop I actually enjoy since trying SuSE 5-something. Pain points are teams (running in browser), bad audio quality with AirPods when using the microphone and cpu speed and heat. I guess one has to stop using Apple silicon in laptops to realize how amazing these processors are.
and cpu speed and heat
Intel CPUs from that era were quite bad and everyone has upped their ante since then. I was thinking about getting a second hand from ~2021-2022, but my wife convinced me to get a new one, so I got a Gen 5 T14 AMD. It has a Ryzen 7 Pro 8840U and I rarely hear the fans, mostly only when Nix has to rebuild some large packages (running NixOS unstable-small).
> 1Password tried SwiftUI for their Mac app
1Password had a beautiful native Mac app that works to this day. Even assuming SwiftUI is actually bad, why did they have to migrate at all? What was wrong with the existing app?
I'm not disagreeing with the opinions on Apple software quality, but I think the 1Password case is more down to their taking of VC money and having to give (JS) devs some busywork to rebuild something that worked perfectly well.
1Password is also now subscription only and online only. Gone are the days of a forever license and fully offline encrypted database allowing for 3rd party syncing via iCloud or others. The death of their old app went hand in hand with their race to the bottom subscription payment VC backed ecosystem. It's only time until they suffer a breach like everyone else.
> Gone are the days of a forever license and fully offline encrypted database allowing for 3rd party syncing via iCloud or others.
While it's true for 1Password, there are other password managers. KeePass is great for local password database files if that's what you're after.
>What was wrong with the existing app?
It didn't work on Windows and Linux desktops.
Regarding Spotlight, one thing that started happening for me on Sequioa was that Finder and other apps started getting very slow to react to file changes. For example, I can save a new file to a directory, and the Finder window takes maybe 10-20 seconds before the file shows up in the list. If I navigate to a different folder and then back, the file is there. I notice the same delay in apps like IntelliJ.
I could be wrong, but apparently Spotlight is the service that drives this kind of file system watching. I think macOS has a lower-level inotify-style file system event API, which should be unaffected, but Finder and these other apps apparently use Spotlight. I really wish I had a fix, because it's just crazy having to constantly "refresh" things.
My favourite feature is when spotlight tells me that indexing is paused when I am searching for something.
You went through the effort to show some UI when something I am looking for may not be there because indexing is paused... but you didn't think to just unpause the indexing so that I can find it? I feel like I am being spit on, "Yeah, you not finding what you are looking for? I know, I'm not even trying"
I highly recommend using Alfred. I’ve been using it since before Spotlight came out, tried and then disabled Spotlight, and went back to Alfred. It’s extremely configurable but highly usable out of the box. Sort of like creating your own CLI shortcuts to open files, apps, copy things to the clipboard, etc.
https://www.alfredapp.com/
I still use Quicksilver[1], the open source app that long predates Alfred and was the inspiration for it. I tried Alfred a few years ago but didn't see anything compelling enough to switch. Am I missing anything?
[1] https://qsapp.com
I use Alfred and I used to use Quicksilver.
Probably not.
Alfred is nice. I use Raycast these days: https://www.raycast.com/.
This KILLS me. It's so frustrating. APFS is supposed to be great at deduping files and such, but in practice it seems like it really sucks. It's bad at both saving a file to the desktop and dumping a million npm files into a directory.
Same here. Spotlight used to be my everything, i.e. I never use the dock I would always use spotlight to launch applications or navigate to folders. Now it is littered with internet garbage, takes seconds to even return any results, and the results are always useless.
Who the hell thought integrating internet search is a good idea - because "aösldkfjalsdkfjalsdkfj" just as everything else is a valid search result in Spotlight now showing me "Search for aölsdkfjöalsdfjasdlfkj in Firefox"...
Spotlight was never useful, because of an absurd and glaring design defect: It doesn't show you WHERE it found stuff. There's no path shown with hits. Same blunder in Finder's search, and you can't even optionally add "path" as a column. WTF.
So... when the hits include six identically-named files, you can't eliminate ones that you know are wrong (on a backup volume or whatever). The level of stupidity here is just mind-boggling.
You hold down command to see the path.
And press command+return to open the location in Finder (and the item selected)
Where? And how is that option displayed to the user?
I also just tried it in Spotlight and Finder, and it did nothing. Which I consider a relief, because undiscoverable bullshit is worse than the feature not existing.
macOS and iPadOS are full of those undiscoverable "if you do this combination of buttons/swipes while at full moon, something happens". As a Mac user not by choice (work issued) I hate how impossible to discover these are.
As a Mac/iOS/iPadOS user it seems that it’s almost mandatory to watch each Keynote / product announcement video if you want to keep up with new features. Lots of cool features that I only knew about by watching those videos that are completely undiscoverable otherwise.
These kinds of shortcuts are part of Apple software as a whole, and apparently have been a thing since at least OSX. These behaviors were supposed to be covered in the documentation, but I don't know how true this is nowadays.
Special mention to all text input fields in macOS having Emacs-style shortcuts.
It goes back further than that. I remember being able to buy key-combo cheat cards for System 7, and I have no reason to think the shortcuts they covered wouldn't also have been present in System 6.
It's in the documentation for Spotlight:
https://support.apple.com/en-gb/guide/mac-help/mchlp1008/mac
I agree that discoverability could be better, but macOS has pretty consistently had hidden power user shortcuts and modifiers, to keep the basic workflow streamlined/simple for those who don't need it.
Seeing where stuff is in a search function is not a "power user" feature; it's the whole point of what you're doing.
And I don't buy the "keeping things simple" excuse for secret hotkeys in other areas. Falling back on gimmicks like undisplayed hotkeys and "long presses" and "gestures" is lazy abandonment of the design task.
I hate this "saving the user from complexity" lie. It's hypocritical: The "non-power" user isn't going to go looking for these options in the first place.
Finder search is a great example. A "non-power" user isn't going to right-click on the column headings in the results and try to add "path" as a column. So how does it help that user to deny everyone else the ability to add it?
Apple mocked IBM for needing a thick user manual back in the day. To suggest that anyone (especially anyone on this site) should have to read documentation to use perform a basic file search (in a GUI, no less) is apologism to the extreme.
> There's no path shown with hits
I guess you do know the path is shown at the bottom of the window if you select the filename in the list of results?
Yep, but that's totally unacceptable because you have to tediously select every entry, one at a time, and peer at the status bar.
It also doesn't allow you to sort results by location, as you could if it were a column.
In all fairness, you do need to hold down the command key to show the file location in Sequoia. It is an interesting default behavior to pretend the files location doesn't exist, mobile-centric.
No you don’t. In Finder search results, the path is always shown at the bottom. For regular Finder windows, you can optionally show the path with “View -> Show Path Bar”
Not a solution, because, again, you have to click on every single entry one at a time, and you can't sort by it.
In all fairness, secret hotkey BS may as well not exist. Are you supposed to mash every modifier key and every combination thereof on every screen and in every menu, looking for hidden goodies?
Absurd.
we have simplified the interface to just one home button and the screen interface, as well as the volume up/volume down key.
To select, just press on the item.
To hover, press and hold for at least 2 seconds.
To get a list of options, press and hold for at least 2.5 seconds, but not more than 3.5 seconds.
To delete, press and hold for 3.6 seconds, but not longer than 3.9 seconds.
To save, press and hold for 4.1 seconds. Pressing and holding for exactly 4.0 seconds activates the archive action. Pressing and holding for 4.2 or more seconds sends the item to the blocked list.
To retrieve the list of items in the blocked list, press and hold and simultaneously press the volume up and volume down key.
To delete all items in the block list, press and hold and simultaneously press the volume up key only.
To completely reset your device, press and hold and simultaneously press the volume down key only, whilst holding the device in a completely vertical plane, and rotating clock-wise and counter-clockwise, smoothly, at precise 2.35619 radians every 30 seconds.
To trigger the emergency call feature, drop the device at an acceleration of no less than 9.6m/s and no more than 9.7m/s
/s (kind of)
No: you are supposed to read the documentation to learn about power user features. Microsoft also doesn’t shove the advanced keyboard shortcuts in your face; you need to read the manual to learn stuff like this.
Showing WHERE things are found when you do a search is not a "power-user" feature. It's an essential aspect of what the user is trying to accomplish.
The whole point is that secret hotkeys are design dereliction.
Is it, though? Most people don’t really have a notion of the file system, or hierarchical file structures. They drop files onto their desktop, or keep them in the downloads folder. Just ask a parent or your next-door neighbour.
That’s a bit of a problem when discussing problems of normal users with power users, because they don’t even realise how what they’re doing is actually not what normies do.
I’m inclined to agree that hotkeys in MacOS are hard to discover, but cluttering the interface with stuff many users simply do not need cannot be the correct answer.
If it's cluttering the interface, the interface design was incompetent to begin with.
That’s just ridiculously broad. You cannot cram an infinite amount of information into an interface, there is a maximum density for your design goal.
Spotlight is unbelievable bad, especially on iOS. If I type a substring of the name of an installed app, it should find it effectively instantly (say, within 1-2 frames of the input showing up). Instead, it finds it sometimes. On occasion I need to hit backspace (removing a letter that should match) to get it to find it.
I struggle to imagine the software design that works so poorly.
I've yet to find a decent implementation of search-as-you-type anywhere, not just Spotlight. I have that same issue on Firefox, and with Windows Search, for example.
And it makes no sense whatsoever. If "foo" matches "foobar", so should "foob". I honestly don't know how the hell can they still f up such a simple piece of technology in 2025.
> I've yet to find a decent implementation of search-as-you-type anywhere
https://www.voidtools.com/en-uk/support/everything/
Windows 7 start menu search was always reliable and had predictable behavior from my experience. It can be done, just that modern software engineers' skills and career incentives no longer permit it.
Finder search is just as bad. You can be viewing a directory full of JPEGs, all with the jpg extension.
Then you do a search for .jpg, and get NOTHING. But only sometimes. Other times it'll work.
See this same search issue in everything these days for what was a solved problem a decade ago, what “best practice” is causing this
Wow, I feel like I almost could have written this except I prefer Plasma/KDE to GNOME. I use Linux + Mac laptops somewhat interchangeably since 2012, and have also seen the marked decline in quality. In fact, it seems like Linux has gotten better at almost the same pace (or maybe a bit faster) than macOS has gotten worse.
The things that most frustrate me about Macs is that they've violated the never spoken but always expected "it just works" in so many ways. Things like how Thunderbolt Displays containing a USB hub which are Apple-certified handle re-connection to a Macbook, should "just work", but require fiddling every time. That's just one of numerous examples I could come up with.
Apple historically was probably the best company in the world in understanding the full depth of what "User Experience" means, and it seems like they've really retreated from this position and are regressing to the mean.
I've been using Macs since Mac OS 9, and Snow Leopard was indeed very good. It remains my favorite version of Mac OS. I actually think it was Snow Leopard that started the rush of developers to Mac as _the_ platform to use.
Exactly.
People don't want animojis, and they don't want other trite new features that only seem to exist because Apple feels it needs to demo something new every year.
What they want is something that just works without annoyances, distractions, failures, or complications.
Give them that and they'll break down the doors trying to get their hands on it, because it's so far from how most tech works today.
Animojis really feel like peak corporate board asking, "What do the kids like these days?" and dumping that shit into the world. Honestly ... AVERAGE age of the Apple board is 68!! This is a company that's reached some sort of corporate red giant stage where it's influence is massive but it's ability to grow is over and it's only real purpose is to generate heavy metals and seed them throughout the rest of the universe after it's eventual explosive death.
To be fair, I'd wager the average of nearly all Fortune 500 companies boards hovers around the 65 mark
Something that just works and is stable is bad business for companies these days.
Why would it be bad business for Apple? Their business model is based on selling a holistic ecosystem. They don’t have any need to chase new features and there steady stream of high margin hardware revenue is at stake.
> Their business model is based on selling a holistic ecosystem
Yeah and they succeeded in that so now it's about selling subscriptions on top of that.
Spotlight straight up broke on both of my Macs after Sequoia. It can't even find exact matches in many directories marked for indexing and re-indexing did nothing. Just searching for apps under Applications doesn't seem to find all apps.
I’ve had so many issues with it as well! To the absurd level where I could not search for settings in the Settings app… People all over the net have had all kinds of issues and there’s never been any help other than „oh go and reindex”.
iOS has this problem as well. You search for a setting in the Settings app. It’ll say “doesn’t exist” (or whatever) while it’s looking for something extremely obvious (like “software update”) instead of just showing a processing icon.
Then when it does show the results, they’re usually in some terribly unhelpful order. It took me ages to try and go through the CUJ of “this app isn’t sending me notifications because I turned them off now I want them back on”
Just yesterday I was trying to find a file in Finder, using the search, and it could not find it even though I was just one directory up from the directory it was sitting in. It made no sense to me at all. Reading these stories, it’s clicking for me.
It’s a relief to hear this is common. I thought this was user error or a consequence of frequently filling up the internal SSD thus nuking the index.
Just adding a "me too" here, Spotlight used to be incredible. Now it's basically only good if you wait 5-10 seconds... sometimes.
I gave up on it because of this and installed Raycast which seems a lot more reliable. I used Spotlight effectively as my launcher for apps/settings, and have the Dock completely hidden and Spotlight set to hide everything else. But when it can't even do that consistently, I have no idea how!
The nice thing is that there are several apps which replace it and do a lot more at the same time. (Like LaunchBar, Raycast, Alfred)
I can't believe I'm saying it, but I agree with you about GNOME being my forever desktop. I used to really make fun of GNOME during the 2->3 transition, which seemed so profoundly misguided, but now I love it. I don't know if they've massively improved it or if my perspective has just changed with time.
Unified button that disguises as two different icons hiding other useful options
You can only cycle windows in one direction even if you try to do some remapping
Choosing keyboard languages hides a lot of options. Once you understand you need to click on English US to see more detailed options then you get them all, UK, Canadian... Then it's unclear which keyboard layout is currently selected and how to select one from the list you made.
I can't fathom how a DE whose is all about human machine interface guidelines whatever and supposed to be the epitome of UX can't figure out basic stuff about discoverability and clarity
Default keybindings have Shift+Super+Tab doing reverse window cycling in GNOME. Just tried it. Also, which unified button masquerades as two icons?
Keyboard layouts are a pain, but there are some solid extensions that clean the flow up and may be upstreamed into GNOME at some point.
It's all opinions, but boy, compared to the mess that is macOS and iOS regarding discoverability ... I'll take GNOME any. day.
True ! So why can I only remap cycling window in one direction and not the other ... ?
The volume and power icons on the top right is actually one button and hides other option like screen lightning volume and wifi etc. If at list they had made a three vertical dots/stacked bars and is the convention for hamburger menus...
From what I heard GNOME devs do not like change and it sucks to be a GNOME extension developer, a quick google search seems to confirm that so it casts some doubt about them up-streaming any of them but maybe you know better. Has it ever happened to other extensions ?
https://discourse.gnome.org/t/developing-gnome-shell-extensi... https://www.reddit.com/r/gnome/comments/pvvku5/why_do_extens...
Haven't really used MacOS or iOS more that five minutes so I can only trust you on that.
On the other hand for example, it is very easy to remap CapsLock to Escape on MacOs. Just go to Setting --> Keyboard and you easily find the option. GNOME ? No, not in settings. Wait I have to use an app called gnome-tweak ? Ok it's in "Advanced keyboard otions" --> Opens a big list of loosely classified options. Oh well it was in miscellaneous category.
I can believe that its easy to bounce off software because of a million paper cuts. But the problem with them trying to address every one of those proactively is that GNOME is a huge undertaking and they do their best to move at a fairly slow pace (now, after the 3 transition, which was akin to ripping a bandaid off ... go fast, piss the person off, but then the bandaid is gone).
I don't know if the CapsLock -> Escape switch is on a roadmap somewhere, but that is a little bananas. That said, my partner comfortably uses GNOME every day to browse the web and manage some files. Has she EVER wondered how to remap CapsLock? No. The people who do want to? Google can give you the answer pretty quickly. Not saying it's good UX, but GNOME balances a lot of use cases, and as this thread suggests, I think they've actually (with a LOT of complaining from engineers and power users) kept that balance pretty damn well to the point where I haven't been surprised by GNOME is a long time, and seems to slowly and progressively get better.
And yes, whoever jumps in here with their own papercut story, I know there is pain in not being the primary audience for software. But honestly, at least I'm in the same Venn diagram with my partner. The primary audience for macOS or iOS now appears to be ... I don't even know anymore. Used to be content creators, now it seems like even Apple doesn't actually know who uses their computers.
It's not just you, the early GNOME 3 releases sucked. It has seen a lot of gradual improvement over time. Of course there are reasonable alternatives also, such as Xfce, MATE or Cinnamon. (And these three 'alternative' desktops have also edged closer over time, sharing more and more of the underlying tech stack even as GNOME itself has continued to develop in a rather seperate direction).
It could be a third option, the bloat in other OSes has made a less bloated OS look very pleasant and useful.
Did you know, you can set your wallpapers to be continuously updating and make macs use terabytes of your network in hours or days depending on speed? https://discussions.apple.com/thread/255329956
I also wish I could preview the wallpapers without triggering a 100MB download. There's nothing in between the 320x240 thumbnail, and the 4k video.
And so many tiny thumbnails wedged into the too-narrow System Settings window.
My biggest annoyance with recent macOS versions that most QuickLook plugins stopped working. Apparently one could re-develop them with their new framework-of-the-day, but I have no doubt a lion's share of what I'm using will just become abandonware.
At one point a few years ago, Spotlight improved enough that I could use it instead of relying on Alfred. So I deleted Alfred, and whaddaya know...a few years later Spotlight got worse and worse, making me regret that move.
I have been using a Mac since the 128k came out. System 7.5.3 and Snow Leopard 10.6.8 are in my opinion the high water mark for both OS’s.
I still have some 10.6.8 install media for both server and client. Truly loved them both.
I worked at Apple Retail during the Snow Leopard launch. I think I still have a boxed disk somewhere, too. I remember it was not a product I had to sell to customers. People came in asking for it.
Another highlight of that job was selling a green iPod Nano to "John Locke" from LOST
Those were the days…
The most ridiculous thing that happend to me was in the early days of the Apple Store in SOHO I stopped in to see if I could just buy RAM.
The music was loud so it was like I was speaking loudly to be heard and asked for RAM and the they thought I was asking if I could buy a gram.
Interesting that you think that of 7.5.3 — it worked, sure, but it could be painfully slow. System 6 was preferable as an OS — MultiFinder was better than 7, at least in the first couple iterations — but much of the software I needed demanded 7. 7.6.x was the first bright spot since 7.1 fixed much of what went wrong in 7.0, & there was a ton of waiting after that. 9 just chugged along for me, for the most part, which was nice.
Loved Snow Leopard too, & was shocked by how bad Lion was in comparison. Glad they got back on track after that.
System 7 was better for me due to AppleTalk file sharing. System 6 was confined to LocalTalk or printer sharing.
You were right I forgot about 7.6.1. I think I had a an WiredInc mpeg video card server based on System 7.5.3 for a project. So it had a particular memory burn. I suppose I need up using System 9 since all life forms were supported by carbon.
> System 7.5.3 and Snow Leopard 10.6.8 are in my opinion the high water mark for both OS’s.
Wasn’t 7.5.3 the worst of the string of terrible releases between 7.5 and 7.6? In my memory 7.5.5 was much better, but I still preferred 8.1.
> and there is no longer any way to effectively prioritize the results I want (apps, not internet garbage)
OMG this one drives me bonkers. If anyone out there knows how to turn off internet results, please share!
Open Settings, scroll down to Spotlight. Unselect the things you don’t want.
You’re welcome :-)
Just noticed that I was sharing my entire Safari, Spotlight and Siri search history in that menu. Why is that setting in Spotlight settings and not under Privacy/Analytics?
Because spotlight indexing is local and not shared with Apple?
Like I pointed out elsewhere, it doesn’t stick for me.
I've found that kind of thing is often caused by a damaged preferences file. The easy way to check that is to make another user account, and see if it happens there too.
Thanks!
> As someone who's been a Mac user since System 6 and has been consistently using Macs alongside PCs _daily_ for over 20 years I can say that Apple's software quality (either in terms of polish or just plain QA) has steadily decreased.
Similar for me but started in system 7.
It’s lucky for Apple that Windows has got worse faster.
Spotlight seemed to go from great to unusable in 5 years
8 years mac user here, never use Spotlight, it's a trash.
Yeah I stopped using spotlight a few years ago. I didn't really notice that I stopped using it until recently. It just became useless. I reorganised my stuff carefully so I know where I put it. I think that turned out to be more powerful than hoping a search engine over the top would be able to sift through the nuances.
... I'm not sure I've ever found any GUI system search reliable enough that I've used it on-purpose (though accidentally, sometimes) on Windows, Linux, or Mac. I always just use "find" and "grep" (and ripgrep when I remember that exists and realize I just grepped a lot and will be waiting for like an hour if I don't re-run the command with rg instead). Or nothing on Windows, which is fine because I haven't used Windows for anything but launching video games in about two decades.
"everything" app on windows works well for me for file search. Incredibly fast.
^^^ this so hard. Voidtools Everything is how I find what I need 90% of the time now.
https://www.voidtools.com/
I've only started using gui search since using fedora. the tracker search in the activities view is fast and finds files in the home folder by name pretty well. The only shame is that the pdf-content search doesn't work in the main search interface but only when searching in the file manager.
Windows 11 LTSC one is quite good because it's so damn stupid. You can indeed hit start then just type what you want. Only does files though, by name, which is fine.
Windows 11 is beyond pale. It’s infuriatingly bad. But it could be a benefit if you do a bit of manual organizing and ignore most of its dumb features. Only use it for work, I will never use it at home.
Try the LTSC version. All the infuriating bits are not installed :)
Alfred
Spotlight was bad back in the day, so I installed Alfred and started using that. Then Spotlight suddenly improved a lot, enough that it was usable for me, and I deleted Alfred. Then about five years ago something happened internally at Apple to the Spotlight team and it just got worse and worse and more difficult to use, making me regret deleting Alfred.
I wish Apple would just fix Spotlight. They don't seem to think it's worth fixing.
I wonder if Apple has internal metrics that most people just stick everything in the dock and on desktop and don't use Spotlight
That is a good question. I like my dock uncluttered. I have it placed vertically on the left side, with only the apps I use every single day: Alacritty, Brave, Cursor, and Zoom. With Finder and Launchpad included, that's only six docked apps. Everything else I use Spotlight to open, so I feel the pain when the usability gets degraded or buggy.
> There are some factual "gaps" there about how good Snow Leopard was
Here are some data points I collected at the time:
https://blog.rongarret.info/2009/08/snow-leopard-is-disaster...
https://blog.rongarret.info/2009/09/esata-on-snow-leopard.ht...
In retrospect Snow Leopard deserves the love it eventually got, but at the time it was not entirely clear.
> Apple's software quality (either in terms of polish or just plain QA) has steadily decreased.
Amen to that.
> prioritize the results I want (apps, not internet garbage).
Settings -> Spotlight -> Websites, UNCHECK
Does not work. I happen to know a fair bit about mdutil and the like and confirmed that does exactly nothing for my particular issue. A full Spotlight index reset works temporarily, but after a while it just conks out again.
Also, I vaguely remember there being a way to _order_ results, not just disable them.
> Also, I vaguely remember there being a way to _order_ results, not just disable them.
Good memory! Apple removed this feature in El Capitan.
I'm ancient by today's AI standards :)
Yea this has been happening on all of my family’s MacBooks. Spotlight indexing just hammers the CPU, and also seems to be doing nothing at all.
It seems to churn the index. I have no other explanation for its behavior. And of course filing Feedback in Apple's tools doesn't help.
I have a vaguely related, kind of interesting story related to search indexes, but on windows instead of mac.
My C drive was was super full for some reason I couldn't understand, and Explorer couldn't tell me where the data was. There was about 100GB just unaccounted for.
I don't even use the search index.
I just wanted to say that I've been a keen reader of your blog for ... I guess, decades. I appreciate your work. Thank you.
Thanks!
I do love Gnome. If only we had hardware to run it :/.
I'm stuck on a MBP because it's the only laptop with a great screen, speakers, and battery life. Meanwhile my keyboard keys keep getting stuck after a year of usage, and OSX is garbage. Soon as there is similar hardware I can load Linux on, I'll be insta-switching.
AMD AI Max 395 (superb name) proved that x86 can get to Apple silicon perf. (even not that far in power efficiency), but there seems to be 0 devices from non-trash brands (I am not buying ASUS or HP).
I would love to finally get out of Apple ecosystem, I just don't have any decent alternatives right now. Hopefully next year.
> consider it to be my "forever desktop"
Feelings shared, if only Gnome would provide this column-based file navigation that I miss so much
I'm going to purchase a framework just because I value repairability. And honestly, before the m1 macbook I was using a t480s, and I'm okay with compromising on hardware, esp. having been burned with the 2016 butterfly macbook. Apart from the haptic touchpad I wouldn't miss much, other makers are finally ditching low resolution 16:9 screens and you can even find nice oleds. I'm mostly missing the polished software that's only available on macos (things like carbon copy cloner or pixelmator). But with my m1 having degraded battery and having to send it off for a week or two to the nearest service center just to get a new battery, the prospect of a repairable laptop like framework where I can just order a new battery and replace it myself is looking all the more enticing.
I personally think that it is reasonable to "want" an Apple notebook. They have great hardware, great battery life and an ecosystem where every device integrates. Only on macOS you can nicely develop software for iOS. Furthermore most vendors release software for macOS, while they don't for Linux (not only Adobe). BTW the apps I miss most on Linux is the Preview App and Apple Mail.
However I'm done with Apple. I think it's a decision - not "reasoning". That decision takes time and is painful. It's also a decision specifically against "the best" ecosystem available in favor of something "ok".
Not only they repeatedly disappointed my expectations - they just suck as a company (in my opinion). It's not about being less innovative for decreasing software quality, they have done so much for the market, that I think GNOME wouldn't even exist as it is without them... Its about sealing off every inch of their software and hardware they can. No repair without paying... Making RAM and SSD upgrades ridiculously expensive, you cannot even put default NVMe drives into a mac mini - everything is proprietary. Even their sensors have serial numbers to prevent hibernating if you change them out without "hacking" the firmware.
Hardware-wise I have high hopes for framework working with AMD - although they did not address the issues I'd suggest (speakers, lpcamm2), they're constantly improving without breaking their promises. This is hopefully not going to change when they get bigger.
OS-wise I'll stay on Linux. After a long journey going from Ubuntu to Debian to Fedora using GNOME, KDE and even NixOS with Hyprland for a short period, I gained enough knowledge required to really enjoy Linux. System76 is working on COSMIC, which could be pretty amazing, once it is released.
In case anyone would like to try my current Linux config, I'm constantly working on an "install everything" script (pretty early stage):
https://github.com/sandreas/zarch
HF ;)
Apple delivered on Steve Jobs' vision of an "appliance computer".
You might not want one though.
Yeah... probably. I forgot to mention that Apple computers are a pretty good deal if you are looking for an AI / LLM experimentation machine due to unified RAM which nearly translates 1:1 into VRAM.
"Apple is ripping you off on DRAM" vs. "Apple is a great deal for VRAM." ;-)
You don't get nearly as much compute as you would with 6 GPUs, but it also uses less power than a single GPU.
Oh wow. I am realizing I have just been living with these bugs as tiny frustrations all day long not understanding how pervasive they are!
This issue with spotlight is so bad. I use the switcher to pull up my Downloads or Documents directories and half the time it can’t even find them!
Of course Asahi isn’t an option. The hardware support is far from finished.
Indeed. I complained that Apple design gets a free pass while being haunted by Steve from beyond the grave for a decade. Your comments resemble my habits except rusted sway right into cosmic desktop alpha and done.
> no longer any way to effectively prioritize the results I want (apps, not internet garbage)
FWIW you can massively improve things by just disabling the internet results. It's easily done in the System Preferences
I would much prefer if you could change the order so that _local_ results come first, web results after — not possible (anymore). Sad.
Like I pointed out elsewhere, it doesn’t stick for me.
> if PC hardware can ever match Apple Silicon
IIRC some competitors are starting to offer a few laptops with ARM processors, I think Samsung has a few. How do you feel about those?
I generally agree it’s decreased steadily but I also remember macOS 9 and early X versions especially being pretty buggy and having awful performance.
It's wild how much of the original "it just works" ethos has eroded
> As someone who's been a Mac user since System 6 and has been consistently using Macs alongside PCs _daily_ for over 20 years I can say that Apple's software quality (either in terms of polish or just plain QA) has steadily decreased.
Similar for me but started in system 7.
It’s just lucky Windows has got worse faster.
Honestly yeah, Raycast is the replacement I'd recommend these days for spotlight.
And now spotlight defaults to the whole computer even when I start a search within a folder... for items in the folder... Turned to garbage sometime in the last ~18-24 months.
At least there's quicksilver
If only there was a good Linux version of Alfred.
Apple (at least current leadership) is programmatically degrading it's products so people would buy a new one. Who expects anything good from such a team.
That statement makes no sense, as the new products are worse than the old ones
Umm the new products are better than the old ones. Not sure if you are being nostalgic.
I keep being tempted to write same post but named "Does all software work like shit now?", because I swear, this is not just Apple. Software in general feels more bugged as a new norm.
Most websites have an element that won't load on the first try, or a button that sometimes needs to be clicked twice because the first click did nothing.
Amazon shopping app needs two clicks every now and then, because the first one didn't do what it was supposed to do. Since 3+ years ago at least.
Spotify randomly stops syncing play status with its TV app. Been true for at least a year.
HBO app has subtitles for one of my shows out of sync and it has been for more than a year.
Games including AAA titles need few months post-release fixing before they stabilize and stop having things jerk themselves into the sky or something.
My robot vacuum app just hangs up forever once in a while and needs to be killed to work again, takes 10+ seconds after start to begin responding to taps, and it has been like that for over 2 years of owning the device.
Safari has had a bug when opening a new tab and typing "search term" too quickly, it opens URL http://search%20term instead of doing a Google search. 8 years ago I've opened a bug for that which was closed as a duplicate, and just recently experienced this bug again.
It really seems that criteria for "ready for production" is way lower now. If my first job 13+ years ago any QA noticed any of that above, the next version wouldn't be out until it is fixed. Today, if "Refresh" button or restarting the app fixes it, approved, green light, release it.
Something I found annoying at a previous big-tech work, was how the focus on top-level metrics (read, revenue-linked metrics) meant we couldn't fix things.
There were a lot of smart people, very interested in fixing things— not only because engineers tend to like fixing things, but also because we, and everyone around us, were users too.
For example, many things related to text input were broken on the site. Korean was apparently quite unusable. I wanted to fix it. A Korean manager in a core web team wanted to fix it. But we couldn't because the incentive structures dictated we should focus on other things.
It was only after a couple years, and developing a metric that linked text-input work with top-level (read, revenue-linked) metrics, that we were able to work on fixing these issues.
I find a lot of value in the effort to make incentives objective, but at a company that was already worth half a trillion dollars at the time, I just always felt there could be more room for caring about users and the product beyond the effects on the bottom-line.
This is exactly the problem. Hyper efficient (or at least trying to be) businesses have no room for craftsmanship. If you take the time to make quality software, you’ll be left behind by someone who doesn’t. Unfortunately the market doesn’t care, and therefore efficient businesses don’t either.
The only solution I know of is to have a business that’s small enough and controlled by internal forces (e.g. a founder who cares) to pay attention to craftsmanship.
You're implying that buggy software has no impact on the bottom line. I'm not so sure. Users weigh the availability of features against the quality of features. Getting bugs fixed is not necessarly the highest priority for users either. It's a trade-off.
Our use of Microsoft 365 is a pretty good example of that. I moved our company to Microsoft 365 because it had some features we wanted. Then I moved the company off Microsoft 365 because it turned out to be too buggy to be useful.
I realise that the actual users of software are not necessarily the same people making the purchasing decisions. But if productivity suffers and support costs rise then the consequences of choosing low quality software eventually filters through to purchasing decisions.
Even if buggy software has an impact on the buttom line, managers can continue pretending it doesn't and not allocate any budget to fix them. They assume bug fixes somehow will be squeezed in between the work they really value - new features or better completely new projects. Because creating something new (asking debelopers to create) is the easiest way for a manager to get a promotion. It was many years ago when I last seen a manager (with the power to set priorties and not just translate them form above) who pays more than a lip service to quality and cares about maintenance.
> You're implying that buggy software has no impact on the bottom line. I'm not so sure. Users weigh the availability of features against the quality of features.
The problem is that managers / those that determine priorities don't get the numbers, they don't see a measurable impact of buggy software. There's only two signals for that, one is error reporters - which depend on an error being generated, that is, software bug - and the other is user reporting, but only a small fraction of users will actually bother to make reports.
I think this is a benefit of open source software, as developers are more likely to provide feedback. But even then you have some software packages that are so complex and convoluted that bugs emerge as combinations of many different factors (I'm thinking of VS Code with its plugins as an example) that the bug report itself is a huge effort.
>The problem is that managers / those that determine priorities don't get the numbers, they don't see a measurable impact of buggy software.
I don't believe that. IT departments have to support users. Users complain and request support. It costs money and it affects productivity and everybody knows it.
But that's not enough. You would also have to believe that there are significantly less buggy alternatives and that the difference justifies the cost of switching. For big companies that is an incredibly high bar.
But small companies do dump software providers like my company dumped Microsoft.
[Edit] Ah, I think I misunderstood. You're looking at it from the software provider's perspctive rather than the user organisation. Got it.
> You're implying that buggy software has no impact on the bottom line. I'm not so sure.
The problem is that very little competition exists for computer operating systems. Apple, Google, and Microsoft collectively control nearly all of the consumer OS market share on both desktop and mobile. Thus, macOS just needs to be "better than Windows", and iOS just needs to be "better than Android".
> Then I moved the company off Microsoft 365 because it turned out to be too buggy to be useful.
What did you move to?
In general, Microsoft 365 is extremely successful, despite any bugs. There doesn't appear to be any imminent danger of financial failure.
Software vendors also face tradeoffs, engineering hours spent on fixing bugs vs. writing new features. From a bean counter's perspective, they can often live with the bugs.
> In general, Microsoft 365 is extremely successful, despite any bugs.
That's because of some very hard monopolistic anti-consumer behavior from Microsoft in their ecosystem.
> You're implying that buggy software has no impact on the bottom line.
I'm not implying that, and I don't think my manager was implying that either. I think rather there were 2 things going on:
1. It's often hard to connect bug-fixing to metrics.
A specific feature change can easily be linked with an increase in sales, or an increase in usage. It's much harder to measure the impact of a bugfix. How can you measure how many people are _not_ churning thanks to a change you pushed? How can you claim an increase in sales is due to a bugfix?
In your case, I'm sure some team at Microsoft has a dashboard that was updated the minute you used one of these features you bought Microsoft 365 for. How could you build something similar for a bugfix?
Bugfixes don't tend make the line go up quickly. If they make the line go up it often is a slow increase of regained users that's hard to attribute to the bugfixes alone. Usually you're trying to measure not an increase, but a "not decrease", which if possible is tricky at best. The impact is intuitively clear to anyone who uses the software, but hard to measure in a graph.
2. A ruthless prioritization of the most clearly impactful work.
I wouldn't have minded working on something less-clearly measurable which I nonetheless thought was important. But my manager does care though because their performance is an aggregate of all those measurable things the team has worked on. And their manager cares, and so on and so forth.
So at the end of the day, in broad strokes, unless the very top (which tends to be much more disconnected from triage and edge-cases) "doesn't mind" spending time on less measurable things like bugfixing, said bugfixing will be incentivized against.
I think we all know this impacts the bottom-line. Everyone knows people prefer to use software that is not buggy. But a combination of "knowing is not enough, you have to show it" and "don't work on what you know, you have to prioritize work on what is shown", makes for active disincentivizing of bug-fixing work.
> first job 13+ years ago any QA...
such QA jobs no longer exists. Ever since the software dev world has moved to doing one's own QA during development, software has been consistently worse in quality. May be there's a correlation there!
The problem is Agile. Not the way it was intended at some point, but the way it has become through Agile consultants and SAFe. Also the fact that it's become the default for any project and that Waterfall has become a bad word.
Companies abuse Agile so they don't have to plan or think about stuff anymore. In the past decade, I haven't worked in (or seen) a single team that had had more than 2 weeks of work prepared and designed. This leads to something build 4 weeks ago needing a massive refactor, because we only just realized we would be building something conflicting.
That refactor never happens though, because it takes too much time, so we just find a way to slap the new feature on top of the old one. That then leads to a spaghetti mess and every small change introduces a ton of (un)expected issues.
Sometimes I wish we could just think about stuff for a couple of months with a team of designers before actually starting a multi-year project.
Of course, this way of working is great when you don't know what you'll be building, in an innovative start-up that might pivot 8 times before finding product-market fit. But that's not what many of us in big corp and gov are doing, yet we're using the same process.
I couldn’t agree more. I’ve had literal conversations with tech leads who say “no, we aren’t going to talk about database design, we’re agile”.
Not even architecture is being discussed properly under the guise of being agile, it’ll come by itself.
Absolute insanity.
This, 100%. Agile (properly done, for whatever value of “proper“ you choose) is fine for websites, apps, consumer facing stuff. For things that must work, in predictable fashion, for years, it’s often inappropriate.
OS work is somewhere in between, but definitely more towards the latter category.
The underlying cause of this is online software updates. Knowing you can fix bugs any time removes the release date as _the_ deadline for fixing all egregious bugs. And so the backlog of bugs keeps growing.
The backlog is down to management and priorities, not testing per se.
Depends where you look. There's been a QA process in all the (agile, some very forward-thinking) teams I've worked with for the last decade. That QA might be being done by other devs, but it's always been there.
[flagged]
You’re not wrong. I’ve assumed it’s a side effect of the way the industry deals with career advancement. If you’re an engineer or middle manager, you aren’t going to get a promotion or bonus if you say “we took feature X and made it more stable without introducing any new functionality”. The industry seems to favor adding new features regardless of quality so the teams that do it can stand out and make it look like they’re innovating. This isn’t how it has to be: if companies would recognize that better doesn’t necessarily mean “more stuff” or “change”, then people could get rewarded for improving quality of what already exists.
I think the financial cost of these bugs is pretty low and the cost to employ people to fix all of them is pretty high. Everywhere I’ve worked, there is a huge backlog of known issues that are agreed upon that we probably just won’t ever get to them. And we certainly aren’t going to hire new people to solve them. It’s probably because the systems we build are getting way overcomplex due to feature piling and promotion seeking complex projects to show off. If these bugs were trivial to solve, they wouldn’t exist. The fact is, these are pernicious bugs because of how complicated everything is.
I actually got penalized in my last performance review because something I shipped “wasn’t that technically complicated”. I was flabbergasted because I consider it my job to make things simpler, not harder to reason about. But you don’t get promotions for simple.
I remember software working really badly in the early 2000s, when Microsoft had an unassailable monopoly over everything. Then there were a bunch of changes: Windows started getting better with Windows 7, Firefox and then Chrome started being usable instead of IE, and Google and Apple products were generally a huge breath of fresh air.
Since then, Google and Apple products have become just as bad as Microsoft's. I think this is because the industry has moved towards an oligopoly where no one is really challenging the big players anymore, just like Microsoft in the late 1990s. The big companies compete with each other, but in oblique ways that go after revenue not users.
Few things manage to make me as angry as a link (even if shown in form of a button) which does not open in a new background tab when clicked with the MMB.
Preloading selected results in background tabs and then closing the main tab, so that I can iterate through the results of each clicked item per tab is simply so much more efficient than entering a page, hitting back, entering the next, hitting back, ...
Like the items in Twitter's Explore page.
>which does not open in a new background tab when clicked with the MMB.
Which you notice because your page scrolls up wildly as you move to click on what should be the new tab
It's true. One example I can give is how Gmail used to automatically recognise flights and hotel bookings and add them to calendar.
It was suddenly completely broken and stopped working a few years ago. I tried every setting to try to get it working but couldn't.
I feel like a stone age caveman having to manually type everything into my Google calendar.
There are a lot of people raising the same issue in Google forums, but it's not fixed yet.
Ironically they are adding new Gemini AI features into Gmail, which can't do this as well.
With regards to Google Flights, I seem to recall that there was some European Digital Markets Act occurrence. Google decided to comply with it in a malicious fashion.
Ironically Linux Desktop environments have never been so robust.
As much as I dislike systemd, if this is the reason, then I retract everything negative I ever said.
It's hard to argue that systemd isn't a part of modern Linux robustness! It's not the only way it could have been done, but the more declarative model is absolutely better than shell script exit codes. Daemons don't have to worry about double-fork. User-level services are incredibly valuable.
Seconded about the desktops: currently loving KDE Plasma over here. Less sure about systemd.
>Safari has had a bug when opening a new tab and typing "search term" too quickly, it opens URL http://search%20term instead of doing a Google search. 8 years ago I've opened a bug for that which was closed as a duplicate, and just recently experienced this bug again.
While webkit might have some much needed improvements in the past few years, it is still the behind Blink and Gecko. Safari, the browser itself. Has been awful for the past 10 years. At least on Desktop. And some of these are not issue with Webkit because other webkit browser does it better.
The Address bar is far the worst compared to Chrome ( OmniBar ) and Firefox ( I believe it used to be call Awesomebar ). I have experience the same bug you mentioned and I believe I filed it way earlier.
Opening Bookmarks with too many items continue to pause and jank for 11 years now.
Tab Overview continue to re-render all the tabs. Causing Paging and Kernel_Task CPU spike. My Kernel_Task is currently 80TB at 240 days uptime. That is 333GB of write per day. Simply killing the SSD.
And no Tab Sleeping.
Apple just doesn't give a fuck any more about their software.
My gripe is that iCloud Tabs haven’t worked right for years. Everything else that syncs in Safari works perfectly fine: tab groups, bookmarks, reading list. But iCloud Tabs, the feature that shows what you have open on other devices, is always either empty or showing things I had open literally months ago.
It works for me. But randomly not work. And I have seen that iCloud Tabs issue before. I think it was logging out and logging back in would fix it. But this will cause another issue but I cant remember what it was.
Basically the whole thing with Sync is very fickle.
On another note, Safari somehow doesn't work well when you have over 128 Tabs.
Every once in a while I think „There is no public bugtracker for closed source software — wouldn’t it be great to have something like Github issues, but for all the software that is developed behind closed doors?“
Like, at least we had a central place to vent about the exact same stuff you just listed, and who knows, in the best case, at least some companies might feel shamed into picking up issues with the most upvotes or see it as a chance to engage with their userbase more directly.
Or I‘m naïve and the most likely outcome is getting sued?
What do you think?
I think the risk is that unless people think that reporting a bug there might actually cause it to be fixed, few will bother to report bugs and you'll end up with mostly people just venting, thus perpetuating the cycle.
> "Does all software work like shit now?", because I swear, this is not just Apple. Software in general feels more bugged as a new norm.
I think this is just the result of an optimizing game placing profit above all else (including quality and user satisfaction) which is indeed the norm in this late stage of capitalism. You want to opt out of that? Good thing the GPL opened the way placing human freedoms front and center, and not-for-profit software stacks like KDE (for instance) keep getting better and better over time.
I use commercial OSes at work by obligation, and the turning point from which my experience as a user became better served by free software happened many years ago.
Don't get me started on Google Home. It was working good-ish for years. Lately it started to respond with "sorry, I didn't understand" no matter what I asked, happily doing it the 2nd time I asked. It became unreliable which is ironic because I can build this tool by myself now in a 24h hackathon using basic openai/anthropic apis..
Maybe we should introduce Mean Time Between Annoyance (MTBA).
Many of my appliances (dish washer, coffee maker, …) work just fine for weeks before an annoyance pops up („deep clean“, for example). Many of my applications do not. For most I could measure MBTA in minutes. Definitely with Spotlight.
https://www.palladiummag.com/2023/06/01/complex-systems-wont...
I mean, just to consider TV alone (thankfully I do not use one), it takes a while for it to start up, and we are talking about a modern, new TV. Old TVs started immediately. I told my grandma to press the button and wait a bit, before trying to press the button again.
Am I the only one only who is satisfied with Mac OS X? I use Windows from time to time and as far as I can tell it is much worse when it comes to random updates and UI quirkiness.
Mac OS X is fine, that would be snow leopard for example =)
macOS on the other hand, is getting worse, I can definitely concur that spotlight is getting more and more useless. Time Machine as well. It mostly doesn’t work for me, always breaking, hanging…
You can be happy until you're hitting a bug that severely impedes your workflow. And then you might feel annoyed when they refuse to fix it for years, and there's no recourse because it's closed software.
Generally I am pretty happy with macOS and I still believe it to be the best option for a desktop. Where I'm getting frustrated is the increase locked down nature of the OS. I get that it's for security, and that's fine for my dad, but it's starting to get in the way of me down my work.
So when you already start feeling like the operation system is preventing you from doing the things you need to do, then all the small cosmetic flaws seems more in your face.
I'm done with macOS, I've migrated to Linux for my general purpose computing. With every new release of macOS, Gatekeeper is becoming harder and harder to bypass, increasing Apple's control over what software can be run on macOS, forcing apps to be signed with an Apple Developer ID. While I'm happy they are taking security seriously, I'm seriously creeped out that macOS sends hashes of every executable I run to their cloud. It's starting to feel like a broader move away from the openness of personal computing and towards a more controlled, appliance-like software experience.
When Sequoia eliminated the ability to override Gatekeeper by control-clicking, it became clear to me that Apple is now employing a frog boiling strategy towards their ultimate goal -- more control of the software you can run on their hardware.
My group makes a custom executable to reflash a hardware device we produce. We build it for Linux and Darwin.
Trying to get the program to work with our Mac users has become harder and harder. These are all internal developers.
Enabling developer mode and allowing Terminal execution isn't enough. Disabling the quarantine bit works - sometimes - but now we're getting automated nastygrams from corporate IT threatening to kick the laptops off the network. I'm exhausted. The emergency workaround, which I tell nobody about, is way less secure than if they just let us run our own software on our own computer.
> emergency workaround
I once really urgently needed `nmap` to do some production debugging ASAP. Unfortunately, the security tools would flag this immediately on my machine, as I knew this from previous experiments. Solution - compile my own binary from sources, then quickly rename it. I assume that this "workaround" was totally fine for sec department. At least production got fixed and money kept flowing.
> At least production got fixed and money kept flowing.
You were denied the tools to get your job done. You've put yourself at risk by applying an unapproved workaround.
Never ever do this (unless you hold substantial shares). Let the company's bottom line take the fall. If that's the only thing they care about, that's your only way to make the problem visible.
Unfortunately the real world isn't black and white. Yes, according to the company policies, I should watch the world burn and do nothing, while looking at the company bleeding money due to customers SLA being broken. Of course, after submitting a ticket to get nmap approved, which takes days. Extra points if I'm on oncall, then racking that sweet incident money is great.
But the underlying SRE culture here is that, if you know what you are doing and have a functioning brain of a responsible person, you'd be forgiven a jump over the fence, if it means putting out a fire on the other side of it. We aren't kids.
There’s a middle ground. Get the appropriate stakeholders involved in the decision, including security. Let security be the ones to keep the system down, if it cones to that. Or, let the business operations folks make the decision to go over security’s head. Either way, this is not something an engineer tasked with fixing an outage should be making the decision on.
> this is not something an engineer tasked with fixing an outage should be making the decision on
I don’t get this at all.
I’d much prefer a team of highly empowered and highly responsible engineers than impotent engineers who need hand holding in case they make a mistake.
Well, good thing that wasn’t what I suggested.
Engineers _should_ have leeway in how they resolve issues. As I read, though, you have a company policy which explicitly disallows the action you needed to take to fix the problem (if I misread, my apologies). Getting the stakeholders involved is the responsible thing to do when policies need to be broken.
Ideally, the way this kind of situation gets handled should be documented as part of a break-glass policy, so there’s no ambiguity. If that’s not the case, though, the business should get to decide, alongside the policy maker (e.g.: security), whether that policy should be broken as part of an emergency fix, and how to remediate the policy drift after the crisis.
If you’re all tight enough that you’re allowed to make these kinds of decisions in the heat of the moment, that’s great, but it should be agreed upon, and documented, beforehand.
Well I found out the hard way that company culture or values can mean nothing if you don't CYA. Granted, the shop was small enough that our team was in charge of both the security policies and ops, but still, on one unfortunate occasion I've stepped outside my area of responsibility to "do what's right" and got punished. The next time I've been in a similar situation - well, I've walked away from the fire and grabbed the popcorn.
By the way, I'm still burnt out. This work is stressful. Don't let it take away what's already scarce for you.
xattr -cr <file> should clear the "download" extended attribute and make it as if the software was compiled on the machine itself, bypassing the ever-so-annoying Gatekeeper.
For binary patching: codesign --force --deep -s - <file> (no developer ID required, "ad-hoc signing" is just updating a few hashes here and there). Note that you should otherwise not use codesign as it is the job of the linker to do it.
Very aware of the attributes, unfortunately these machines are on a global corporate network so there are layers and layers of monitoring software to prevent internal and external attacks. Changing perm bits on an OSX executable is instantly noted and sent upwards as a possible security breach.
Last time we did this I had to spend a week explaining to management that Macs could actually run software other than PowerPoint and it was necessary for our job.
The local workaround that we use is to just spin up a Linux VM and program devices from there. The less legal workaround is using WebUSB and I'm afraid to even tell the necessary people how I did it, because it's sitting out on a public-facing server.
There's extensive documentation. Examples:
https://developer.apple.com/documentation/security/code-sign...
https://developer.apple.com/documentation/security/notarizin...
There are dedicated sections of the developer web forums:
https://developer.apple.com/forums/topics/code-signing-topic
https://developer.apple.com/forums/topics/code-signing-topic...
...and there's an apple developer support person, Quinn, who appears to be heavily if not solely dedicated to helping developers do binary signing/notarization/stapling correctly.
They have written a slew of Tech Notes about signing and notarization. Main TN is at https://developer.apple.com/documentation/technotes/tn3125-i...
Quinn also has their email address in their sig so people can just reach out via email without even needing an Apple account, or if they prefer more confidentiality.
I mean, come on.
As someone who actually signs, notorizes and distributes desktop apps for macOS, I can safely say their documentation is less than ideal.
Maybe because I'm using Electron framework which makes things more complicated, but I don't really understand why there's is a difference between different types of certificates (Developer ID, Apple distribution, macOS distribution) and I had to guess which one to use everytime I set it up.
Also why is notorization a completely different process from code signing, and requires completely different set of credentials from it. Seems odd to me.
> Also why is notorization a completely different process from code signing
Because they do completely different things. Signing is a proof that you were the one to write and package that software; notarisation is an online security check for malware. If I recall, you still sign but do not notarise when distributing to the Mac App Store.
Ok, so the certificate used to sign the package is generated by Apple, why can't I just use that to prove my identity for notarization?
Or maybe simpler, why can't Apple just do code sign and notarization with one single cli call, with one set of credentials?
Google Play does this under the hook, I don't even think about it. iOS is similar, Transponder app does everything in one go.
A lot of developers (including myself) don’t want to notarize/sign their binaries that they want to run on their own machine(s).
OMG, this. I was working on a tool to help integrate password managers on macOS and I got completely blocked by the notarizing requirements. Some things literally cannot be built for macOS as open source software, now.
I don't really think saying documentation exists says much when Apple is notorious for having documentation that's either borderline or downright useless. It's generally the norm that some random blog post from a decade ago is more useful than their documentation, and I say this from firsthand experience.
Can you sign and notarize your own software made for internal use with your own infrastructure? If so, then this is a valid response. If not, then this is an irrelevant response because the issue is going through Apple, not the process being difficult or undocumented. If I own the device, then I should be free to decide what the sources of authority over it are.
Edit: I haven't tested it yet, but it does seem that you can sign an executable with your own certificate (self-signed or internal CA-issued) however you can't notarize it. Right now, notarization is only required for certain kinds of Apple-issued developer certificates, but that may change in the future.
Anecdotally, I was not able to find any way to notarize software for internal use, without paying for a $99 developer account. Though I would have been willing to pay, I know that others who might want to build the software wouldn’t, so I abandoned my project. I suppose I could have maintained it as open source with the developer account required to build, but it seemed disingenuous to me at the time.
> I mean, come on. Is that really necessary? Obviously there are enough people who did not know about, or find helpful, the resources you’re referring to, that we have people complaining on Hacker News. This isn’t exactly a novice’s forum. Perhaps the problem lies with visibility and accessibility of the support resources, rather than all of the people who have seen notarization as a hurdle to getting real work done.
btw, for those who don’t want to search, Quinn’s signature states:
“ Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = "eskimo" + "1" + "@" + "apple.com"
I understand that you're doing it on principle, but for a software development team, 99$/year is a really minuscule price to pay to be able to build / notarise / distribute software.
Developers pay exorbitant amount of money for much lesser value, and the idea of putting your teammates at risk to stick it to apple is kind of sad bordering with negligence from a business POV.
The principle is what matters. The amount is not the issue. The issue is that there is a cost at all. "It's so cheap" is never an excuse for charging for something that should be free. In this case, running software you have no intent to charge for, on your computer. It's as if someone started charging $0.01/month for breathable air. "But $0.01 is trivial," would not excuse it.
It costs money, and isn't free, for a reason you're not acknowledging. I don't think it's a major profit center for Apple.
It's about setting a higher floor for malicious actors than "random botnet residential IP + a captcha solving service". It's about proving some semblance of identity through a card number and a transaction that goes through without a chargeback.
As the case upthread shows, there's plenty to dislike about a system that inhibits running code built for personal use. And it's obviously neither foolproof nor without collateral damage. Reasonable people can debate if it's worth it. But it still ought be acknowledged that the motivations are closer to the reason you have to identify yourself and pay a nominal fee to drive a vehicle on public roads.
I don't buy it. Or rather, I am willing to believe that some team at Apple has convinced itself that this makes sense, but they're wrong.
In particular, the security boundaries are nonsensical. The whole model of "notarization" is that the developer of some software has convinced Apple that the software as a whole (not a specific running instance) is worthy of doing a specific thing to the system as a whole.
But this is almost useless. Should Facebook be allowed to do various things that can violate privacy and steal data? What if the app has a valid reason to sometimes do those things?
Or, more egregiously, consider something like VSCode. I run it, and the fancy Apple sandbox helpfully asks me if I want to grant access to "Documents." The answer is really "no! -- I want to grant access to the specific folders that I want this workspace to access", but MacOS isn't even close to being able to understand that. So instead, one needs to grant permission, at which point, the user is completely pwned, as VSCode is wildly insecure.
So no, I really don't believe that MacOS's security model makes its users meaningfully more secure. At best, the code signing scheme has some value for attribution after an attack occurs, but most attacks seem to involve stolen credentials, and I bet a bunch just hijack validly-notarized-but-insecure software a la the VSCode example.
Notarization is not a trusted system on macOS - or rather, notarized binaries still have a "this was downloaded from the internet" prompt, and the user is meant to make a decision on whether it is trustworthy.
Notarization does some minimal checks, but is mostly about attaching a real identity so that maliciousness has at least some real-world consequences. The most obvious being that you lose the ability to get more apps notarized.
> But it still ought be acknowledged that the motivations are closer to the reason
Since this isn't true, no acknowledgement required, it doesn't need to be a "major" profit center to magically become a benevolent feature
Actually the cost is not the issue (you are paying for it one way or the other), the issue is the authorization to do such an action on your (supposedly) own hardware.
The commenter I replied to employed by a business, develops software and distributes it within a team of engineers with Macs.
For your personal needs, you do not need to pay anything for building and using apps locally.
Untrue. There are specific APIs which require notarization , regardless of usage. Credential autofill is one such API.
Adding signing as a requirement can easily make what was once a very simple distribution mechanism into something much more complex - now you need manage signing certificates and keys to be able to build your thing.
The cost is far far higher than the price.
But it doesn't in practice.
I develop and distribute few free apps for macOS, and building / notarising is never a problem.
In contrast to this point, as long as I use Xcode and do the same thing I've always done allowing it to manage provisioning and everything else, I don't have a problem. However, I want to use CI/CD. Have you seen what kind of access you have to give fastlane? It's pretty wild. And even after giving it the keys to the kingdom, it still didn't work. Integrating apple code signing with CI/CD is really hard, full of very strange error messages and incantations to make it "work".
I don't know about fastlane, since my CI/CD is just a shell script, and signing and notarising is as hard as (checking the script) running `codesign ...` followed by `notarytool submit ... --wait`
Yes, you need to put keys on the build server for the "Developer ID Application" (which is what you need to distribute apps outside of AppStore) signature to work.
You do not need to give any special access to anything else beyond that.
Anyway, it is indeed more difficult than cross-build for Darwin from linux and call it a day.
Do you distribute OSS software which requires notarizing? If so, have you found a way to let the community build the software without a paid developer account? I would be very interested in a solution which allows OSS development, relying on protected APIs without requiring that anyone who builds the app to have a paid developer account.
You seem to be comparing a single dev sending apps to the world vs a corporate team pushing to employees (if I get parent's case right).
In most cases, just involving account management makes the corporate case 10x more of a PITA. Doing things in a corporate environment is a different game altogether.
Code signing is absolutely disgusting practically and philosophically. It has very reasonable and good intent behind it, but the practical implementations cause great suffering and sadness both for developers (cert management, cost, tools) and end-users (freedom of computing).
It is ugly: https://hearsum.ca/posts/history-of-code-signing-at-mozilla/
[flagged]
I take it you feel the trade off for dev team inconvenience, vs end user security, is not worth it?
They're talking about internal software for internal users. It can be made insanely secure, but that surely isn't the primary concern in this case.
I'm just observing that the cost is a lot higher than $99/year.
I do this professionally, I maintain macOS CI workers for my employer. Apple doesn't make it easy.
The tool is built deep in our CI/CD chain. The whole thing is a house of cards built on a massive pile of tinder next to an open drum of kerosene. You want me to integrate XCode into that?
Last time I tried setting up an Apple developer license inside a large corporation, one that they paid for and not tied to me or my credit card, it was also a nightmare.
And yes, it's also on principle.
Who said anything about Xcode? The codesign tool is part of macOS, not Xcode. The CLI tool for notarization is bundled with Xcode, but you don't have to use it; they have an official REST API that you can use directly.
Do you have any notes on how to run it inside a Gitlab pipeline via a Linux Docker instance? I'd love to learn how to do this, then.
Sure it's trivial, but it is tacit acceptance that you need permission to make a program on their platform. Permission that needs to be renewed year over year. Permission to earn a living on this platform.
Permission that can be revoked for any reason, including being compelled by someone with more power than Apple.
It is permission to _distribute to others_, you can build and run on your own computer without a problem.
Once signed, binary will work forever, you only need active subscription when you need to re-sign / re-notarise.
They can revoke a signature too.
If your signature is compromised and you start signing malware then yes. That's the whole purpose of it.
Do you have any evidence that it happened in any different circumstances at least once?
$99/year for one of the basic uses of a computer isn't okay.
Distributing software is not what I would call one of the basic uses of a computer.
I migrated to Linux about a year ago too. Not the smoothest experience ever (looking at you, ath11k with device-specific quirks) but so far I am delighted. Finally, I don't have to fight my computer to do things I expect it to do.
Unfortunately, I still have to deal with macOS for work due to corporate policies.
The main problem I had with living in a Gnome desktop environment, is with the keyboard. I'm not willing to abandon my use of Emacs control+meta sequences for cursor and editing movements everywhere in the GUI. On macOS, this works because the command (super/Win on Linux/Windows) key is used for common shortcuts and the control key is free for editing shortcuts.
I spent a day or so hacking around with kanata[0], which is a kernel level keyboard remapping tool, that lets you define keyboard mapping layers in a similar way you might with QMK firmware. When I press the 'super/win/cmd' it activates a layer which maps certain sequences to their control equivalents, so I can create tabs, close windows, copy and paste (and many more) like my macOS muscle memory wants to do. Other super key sequences (like Super-L for lock desktop or Super-Tab for window cycling) are unchanged. Furthermore, when I hit the control or meta/alt/option key, it activates a layer where Emacs editing keys are emulated using the Gnome equivalents. For example, C-a and C-e are mapped to home/end, etc.
The only problem is, this is not the behavior I want in terminals or in GNU/Emacs itself. So I installed a Gnome shell extension[1] that exports information about the active window state to a DBUS endpoint. That let me write a small python daemon (managed by a systemd user service) which wakes up whenever the active window changes. Based on this info, I send a message to the TCP server that kanata (also managed by a systemd user service) provides for remote control to switch to the appropriate layer.
After doing this, and tweaking my Gnome setup for another day or so, I am just as comfortable on my Linux machine as I was on my Mac. My main applications are Emacs, Firefox, Mattermost, Slack, ChatGPT, Discord, Kitty, and Steam. My Linux box was previously my Windows gaming box (don't get me started about frog boiling on Windows) and I'm amazed that I can play all my favorite titles (Manor Lords, Hell Let Loose, Foundation, Arma Reforager) on Linux with Proton.
[0]: https://github.com/jtroo/kanata
[1]: https://github.com/hseliger/window-calls-extended
Love this, and I'm in the same boat. Is your configuration of kanata public at all?
I know it's mostly muscle memory, but macOS shortcuts just seem sane and consistent and that has been one of the biggest frustrations when trying to switch. I found toshy[0] which does something similar - did you try that? The goal is purely macOS key remappings in Linux, so a much smaller scope than kanata.
[0]: https://toshy.app
I didn't try toshy, I had a bad experience when I tried kinto.sh a couple of years back, and I had a pretty clear idea of how I could get what I wanted out of a fully featured keyboard remapping tool under Linux. I initially started with Kmonad, but once I found Kanata, and realized that it had a TCP interface for programmatically changing layers, I quickly switched.
I have a Kinesis 360 keyboard, and my config[0] probably won't work for other keyboards, but it can give you a starting point for your own config.
[0]: https://gitlab.com/spudlyo/dotfiles/-/blob/master/kanata/.co...
I'm convinced a DE that figures this shit out out of the box will explode in popularity. Super for the OS and DE shortcuts. Ctrl for the Terminal and readline cursor movements. It can't be impossible to bake these in as defaults.
The hashes are completely anonymized and not that intrusive. I'd rather they do it that way and have a global view of possible malware attacks than the complete free-for-all that other platforms "enjoy".
But here's my (unpopular) take as a GNOME user and using Fedora immutable distros + flatpaks -- I suspect Linux is going to go in a broadly similar direction. Maybe not soon (even flatpaks aren't universally acclaimed), but sometime.
It doesn't matter whether it is anonymized. Apple has no business collecting information about what executables I am running on my own computer, or even whether I'm running executables at all. I don't care what their stated purpose is. I don't care what they want a "global view" of. It's my computer, not theirs.
I don't even mind that they've introduced a level on the totem pole that's above root. But on my computer, -I- should be the one at that level, not Apple.
> it's my computer, not theirs.
the issue seems to be that you still believe this?
to downvoters: you can think its not fair that apple effectively holds control of your device, true, but the only way you can change things is to not buy the products. If you buy it, you accept how it is. vote with your wallets, not in some internet forum
I think it depends on what distro you're talking about. Corporate distros like RHEL and SLES are absolutely going that way. It takes a lot of effort to backport fixes, and the money's not there in desktop Linux to make it worth their while if containerization is a viable alternative. Red Hat's gotten rid of a bunch of graphical applications for RHEL 10 and stated that users can get them from Flathub as an alternative. I believe there was some consternation when CentOS Stream 10 launched without even a packaged web browser and the advice was to install Firefox from Flathub (there's a lot of use cases where that breaks stuff), but it appears they've walked that back and started providing Firefox as a traditional package.
However, less corporate distros that mostly just ship built upstream software as-is since they don't have to support it for long periods (think Arch, Fedora, Void, etc) don't have that problem, so I expect we'll continue seeing them use traditional packages.
> I believe there was some consternation when CentOS Stream 10 launched without even a packaged web browser and the advice was to install Firefox from Flathub
Ubuntu does the exact same thing with their snap repository, the Firefox apt package from Ubuntu is fake. At least Flatpak is a community-led project unlike snap.
My advice is to add Mozilla's apt repo for Firefox by following the handy guide on their website. It's pretty short and easy (copy and paste).
My advice for web browsers is to use Flatpak.
You can limit the file system permissions of the app, like giving only access to downloads, so that if/when there’s a sandbox leak you’re fine. You can also disable various things, like webcam or mic, this way.
In addition, you can get perpetual updates to the latest version of your browser even on old, stable distros like Debian.
Running a new browser on an old distro would be a strong reason for me (if I somehow couldn't update the distro - but I can and I do.) Regarding security, the added work and complication outweighs the added security for me. I can't really disagree with having a different preference. More security on this wild internet is better, right?
IMO it's not much added work. In KDE you can navigate to settings and edit flatpak permissions, and flatpaks are available to download via discover. I haven't noticed any weirdness for firefox or chrome.
"Immutable" distros? We used to live-boot those from optical media back in the day. Fedora is quite late to the game.
> I suspect Linux is going to go in a broadly similar direction.
Linux is pretty diverse, there are still distributions out there that haven't adopted systemd.
I understand and appreciate the sentiment, but I see the intent very differently. Apple is not employing a frog boiling strategy, but rather being responsive to an increasingly sophisticated adversary.
It’s like criticism of the quality of Google search dropping. It has absolutely tanked, but it’s not because the algorithm is worse, it’s because the internet has grown orders of magnitude and most of it uses the same hyper aggressive SEO optimisation, such that the signal to noise ratio is far worse than ever before.
It is because the algorithm is worse. So many garbage results are showing up which they continue to allow.
Kagi lets me completely block specific domains. If Google cared about quality they’d let you do the same.
You can also block specific subdomains, too. Useful when I want to be able to see finance.yahoo.com items in my search results, but nothing else from the yahoo.com domain.
> being responsive to an increasingly sophisticated adversary
"Those who refuse to give up essential Liberty to purchase temporary Safety deserve to have to deal with the GNOME desktop user experience."
I miss macOS sometimes.
That rationalization ignores a lot of confounding evidence, such as other search engines being able to deliver great results and adequately keep the SEO garbage out.
That’s kinda the SEO equivalent of security by obscurity though, right? SEO spam puts a lot less effort into optimizing for other search engines, whereas Google is dealing with being the primary target of every adversarial SEO spam site.
This is a great theory but it isn't the reason. Google management made a conscious decision about five years ago to prioritise profit over search quality.
We know this because the emails came out in discovery for one of the antitrust suits.
The biggest struggle is that the original Macintosh was so simple to manage. The original concept of system extensions to expand the capabilities and the file structure built on the hierarchy with the desktop as the top level was broken with the shift to Unix.
Suddenly the users file hierarchy started wherever the Home folder was located and it became an island of user controlled environment surrounded by complexity of computer operating systems.
The result I found overall well thought out but when the desktop became just a folder I felt the Mac moved from it’s simplicity embracing the complexity that was offered by windows.
Simplicity is fine for a hobby project. An operating system having zero concern for any kind of security is a non-starter today.
It's amazing the rose tinted glasses people have about the original Macintosh environment. It was insanely janky and (unless you were ruthlessly conservative) insanely unstable by today's standards. By version 10.5 (Leopard) the modern UNIX-based MacOS was unequivocally superior to Classic MacOS in every metric other than nostalgia.
I understand the trade offs and accept them. I was trying to point out where the split is and how it won't go back. I think the point of view expressed in your comment is s distorted as the ones your derding.
I also believe that the simplicity could have security as performant. The real advantage of the Unix layer is compatibility that the Macintosh was missing.
I sincerely tried to interpret what you meant here, but I failed. I understand the words, and the fragments of every sentence, but I wasn’t able to deduce the intent of your reply.
Are you trying to say that it’s possible for a system to be both simple and secure? Absolutely that’s the case, but with a trade-off — either it needs to restrict the user’s freedom, or be fully disconnected from the outside world.
I have pondering these ideas a long time and what is needed is an intense glossing over of all the details. The original Macintosh did exactly this and was called a toy and with 128k completely useless. Alternatively my unsophisticated Mom saw the Mac 128k demoed at the mall and went into a frenzy to get that tool. She wanted to publish documents.
The threats in the world are real and the internet doesn't help. I 100% agree that a network connection needs to be kept at a distance to make things simpler.
I think the power of language used to describe a system is where simplicity begins.
What I'm working on is creating a crisp line of delineation between "local" and "public" networks.
If by default after is on the "local" network auto-discovery is secure. If things are explicitly needed a user can publish them through physical manipulation to publish to the outside world.
The outside world can now be described using classic Users and Groups which is cultural easy to understand.
I'm trying to create an environment that focuses on making those 2 things plus a third element simple to understand and physically manipulatable.
The freedom I'm looking for is available on the "local" network. The "public" network is where our data is interchanged with the outside world based on our publishing. I don't expect people to interact with this layer much. I expect people to configure it for whatever institution/organization/government.
Most of the complexity I see in computing these days is market drive demand for eyeballs/clicks/...
> Apple
Actively depleting the good-will they accumulated over the years definitely makes it worse. It's that harder to give the benefit of the doubt to a company also showing the middle finger to their Devs.
> Google
Giving priority to AdSense sites, fucking around with content lengths (famously penalising short stay sites), killing advanced search options. That's just thinking about it for 10s, but to me most of it is totally of Google's making.
As someone who runs a decent sized site with AdSense, I wish.
Of course Google's algorithm is worse. Google prioritises showing you search results that make money for Google. Google has no incentive to show you anything else.
I can't believe I even have to say this out loud. Look up enshittification.
What used to be a powerful, user-respecting OS is increasingly starting to feel like an iOS cousin with training wheels
What OS are you talking about?
If Linux, you'll have to be more specific, because users don't use Linux. They use Android, Ubuntu, Gnome, pop!os, redhat etc.
They're clearly talking about macOS in that comment.
> Gatekeeper is becoming harder and harder to bypass
sudo spctl —-master-disable
Frustrating thing is the earlier versions worked well, it protected you from accidental things but the way to force it was clear and obvious. Now bypass is obtuse and requires enough work arounds people advise just disabling it which is also bad to normalize.
Don't disable SIP, clear the downloaded/quarantine extended attribute instead. This clears all extended attributes: xattr -cr <file> and bypasses the obnoxious GK.
Why remember all these little tricks Apple makes you do to use your own hardware?
Yes because everything in Linux is completely intuitive and you never have to know anything obscure to use it to your liking…
The difference: in Linux it is a usability issue to be fixed, whereas on macOS it is a feature and explicit design goal to make it that way. In general, I have found that things which are difficult on Linux are so because the problem is difficult, not because the people who make my computer have paternalistic attitudes about my usage of it.
> In general, I have found that things which are difficult on Linux are so because the problem is difficult [...].
Hard disagree. Audio mixing is not difficult[1]. The Linux kernel guys were right - it does not belong in the kernel. The userspace story however, has been a complete shitshow for decades. I think Pipewire mostly fixed that? Not sure, sometimes I still have to log out and back in to fix audio.
The funniest part? It's been working in the BSDs all along. I recommend reading the source of sndiod[1].
[1]: <https://cvsweb.openbsd.org/src/usr.bin/sndiod/>
What's even worse? Probably systemd. I try not to hold a strong opinion - I tolerate it, the way I tolerate traffic when taking a walk. The technical issue however is several orders of magnitude simpler - again, the BSDs come to mind, but you can also write a complete PID1 program in about 20 lines of straightforward C[2]. I don't mind the commands being unfamiliar (they're already all different in almost every OS family); it's that the entire package is dreadfully large in scope, opaque, and I find it more difficult to interact with than anything else in this landscape.
[2]: <https://ewontfix.com/14/>
I agree PulseAudio, Pipewire, ALSA, etc. are a pretty big shit show in Linux and have been for some time. From what I understand there are a few stories there with various levels of screw ups, but at no point was this situation the goal, and we are moving closer to an easy to use system that "just works" for these needs.
However, it's worth noting that audio experts doing high grade mixing in production are using these systems quite effectively and have been for a long time. It's similar to Blender in that regard with it always having the "guts" of doing great things, but only the experts that knew the correct spells to cast were able to use it effectively before the UI/UX was improved with 2.x and later I believe.
I work in live media production. I would never consider doing any mixing on Linux - just like I wouldn't consider putting Docker containers on a Mac to serve live HTTP traffic.
There are indeed always exceptions to generalizations, as you've pointed out. Though pulseaudio always trudged along fine for me (not like audio had always worked for me on other systems), and pipewire works perfectly.
So it’s purely ideological without any real world difference?
Are most people better off with Apple defaults?
And it’s not because the problem is “difficult”. It’s because for 20 years it has been claimed that this will be the “year of Linux on the Desktop” and it’s never been good enough for most people.
It’s perfectly fine. KDE and Gnome are both now more cohesive, more intuitive, and less buggy than either Windows or MacOS.
The problem with Linux is that, while it’s very good, it’s different.
Nobody actually cares how intuitive something is, at least not in absolute. People will still say Windows is intuitive. Pretty much nothing in Windows, from the registry to COM to IIS to setting/control panel/computer management, is intuitive. But they know how to use it and are used to that particular brand of buggy inconsistency.
Linux desktops have been high quality for a long time now. The reality is you, and others, measure quality as “how much is it like windows” or “how much of it is like macOS”. When that’s your metric, Linux will always come up short, just by definition.
If I pick up a Linux laptop right now, how well will it handle low latency audio? How well will it handle power management? My graphics hardware? Getting WiFi to work reliably is still an issue.
Can I get a high performance Linux laptop with good battery life, fast graphics, that runs cool and silent?
Yes, I just bought one a few months ago actually. A new lunar lake laptop. It gets 12 hours of battery life and has plenty performance for programming, plus 32 gigs of ram. It’s under 3 pounds and the screen is OLED.
And yes, everything works. On bleeding edge 2 month old hardware.
I even use thunderbolt 4 to connect my external displays and peripherals. Not only does it work, but it’s pleasant. KDE has a settings panel for thunderbolt. I can even change my monitor brightness in KDE settings. No OSD required!
But wait, there’s more! I’m running 2 1440p monitors at 240hz and the system never even hiccups.
But wait, there’s more more! The battery settings are really advanced so I can change the power profile, maximum charge, everything.
The only thing I’m unsure about in your comment is “low latency audio”. It seems low latency to me, but I’m not an audio engineer.
Yes, but be mindful of the hardware you're using
What's high performance for you?
I can certainly get a Framework (Fedora and Ubuntu officially supported), throw my prefered Bluefin-Framework image in and get working
Battery life around 7 hours is the average I see reported, Fast/Silent will depend on the model, but I don't see the issue really Upgradability and easeness of battery replacement are a plus
I just picked framework because they were first to come to mind, but I think Dell has a nice Linux story, Tuxedo also comes to mind
7 hours battery life is less than half of what I get on my MacBook Air. That wouldn’t last me on my ATL - HNL flight I took last year or my MCO - LHR 10 hour flight I’m taking this year.
These are the typical reviews I see around the Framework
https://community.frame.work/t/fw-16-review-the-good-the-bad...
Poor battery life, heavy, runs hot, poor build quality, bad speakers, and decent but not great graphics.
> Getting WiFi to work reliably is still an issue.
This should not be an issue. I have hardware that varies a lot and I literally buy random wifi dongles for $1, $4, $5, Amazon, AliExpress, etc. and they have all just worked on first plugin. I can easily take my phone and tether it to my PC using USB-C and it appears in my Gnome network list and just starts using it for Internet.
> how well will it handle low latency audio
Pretty well you can use OBS to verify this. There are plenty of settings if you want to tune that.
> My graphics hardware?
Just ignore Nvidia and move on. Sure they might figure it out one day, I gave up a decade ago and I use Intel integrated or AMD dedicated for GPUs. Nvidia does "work" for most purposes but it will cause you a headache eventually and those are not worth $400 to me.
> How well will it handle power management?
I enjoy the basic controls that Gnome provides that give me a simple way to basically say "go all out" or "save some battery" etc. There are finer grain controls available and I have used commands in the past to push crappy hardware to it's limits before I chucked it (old Intel iGPUs)
> Can I get a high performance Linux laptop with good battery life, fast graphics, that runs cool and silent?
You can get ones that are specifically marketed for this purpose. Tuxedo is one that specializes in this and obviously System76 also do. These have a higher price point than a regular Dell system, which IMO is the better option in some ways. Dell sells more systems and has more users and it will "just work". They sold Linux systems for years and still do I believe.
Regarding "running silent" this is a gripe I have, not that it runs loud but some laptops have custom RGB crap and sometimes in Linux I don't have access to the extra functionality to customize my lighting or manually set my fans to ramp up etc. There are projects that aim to do this, but I have not looked into them beyond the most basic `fancontrol` built in command.
> Are most people better off with Apple defaults?
I think once you expand the scope to "most people" it might become impossible to say what the correct answer for that large of a group is. In the past their value add might have been more compelling and their feature lock not as draconian. It appears some people think that has changed over time.
That isn't the denotation of my post; I was not characterizing Linux as a whole, but only responding to your specific (unsound) analogy. It works better for me, for a number of reasons including that above. Perhaps it will work better for you, as well. :)
The second part of your post is incoherent to me, I can't tell what you're trying to say.
I never mentioned Linux. I'm curious why people want to pay for this component of their product from Apple.
stockholm syndrome
I don’t know if it’s just me, but i want more Gatekeeper, not less - help me stay safer. Or is it a security theatre? Malware producers can sign things just fine?
An OS that won't let you do what you want to do is malware.
It is indeed starting to feel like that.
The more Gatekeeper, the more used people get to clicking OK without considering what it means. No amount of software can prevent the social engineering of an actual malware that tells the user to just click that OK button that they already have to do on a regular basis. Less is more here. It's why Windows tuned down their UAC after Vista.
It is not a consent prompt. You get a choice on whether to trash the binary or quit.
To run a non-motorized app requires you to open a separate app, navigate to the security section and select that you want to authorize the app to run.
Apple does not have any desire to make distribution of non-notarized binaries commercially viable.
And we've seen this change across all browsers. There no longer is a "continue" prompt for TLS issues. The result is, way fewer maintained sites go months with an expired certificate.
> clicking OK without considering what it means.
Predefined value on current macOS's Gatekeeper is "move to Bin" instead of OK. Other option is Done - which cancels opening action. If you want to bypass that, you need to go to system settings > privacy & security and manually allow particular app there.
Who know what later updated will bring.
Safe from what, exactly? Is this one of those state-actors-are-after-my-cat-photos delusions?
> I am not suggesting Apple has fallen behind Windows or Android. Changing a setting on Windows 11 can often involve a journey through three or four different interface designs, artifacts of half-implemented changes dating back to the last century. Whenever I find myself stuck outside of Appleland, I am eager to return “home,” flaws and all.
Hard agree with this. I sometimes have to boot up a windows laptop to play Minecraft with the kiddo, and it never stops reminding me how little I know about Windows now, how counter-intuitive everything is, how everything feels designed for a user whose mind I cannot comprehend.
To be fair, win11 is a nightmare in terms of usability. I can only assume a committee of eldritch beings and accountants designed it.
It blows my mind that when right-clicking on a file in file explorer, the 'delete' option is hidden in a sub-menu under 'more options'.
I was fully braced for Windows 11 being awful when I installed it recently but that hasn't been my experience at all. If anything it's just a slightly more polished Windows 10.
Probably helps that I installed the IoT LTSC version, but still, apart from the task bar being stupidly in the middle (thankfully there's an option to move it to the left), I've had zero issues.
I even added a network printer and it found it quickly, and added it quickly and successfully, which is a feat I don't think I've seen happen on any OS ever.
The context menu is a clear improvement on the old one (which you can still get to with one click).
Windows 11 can be usable if you run this debloat script [1]. Of course, with every update it's a constant game of cat and mouse.
1) https://github.com/Raphire/Win11Debloat
I just tested it. It's in the first row, last item. [Cut | Copy | Rename | Share | DELETE ]
Out of old habit I always use shift + DEL key and did not notice it's in the top row now.
As someone who stopped using windows about 7 years ago, and only recently used it last weekend, my eyes probably glossed over the fact that some buttons were laid out horizontally.
It also makes way more sense.
Are you sure about that? Look for the trashcan symbol on the upper-right of the context menu.
I agree that having "more options" to begin with was a jarring experience coming from windows 10 though.
Except for when the placement of the icon strip with the trashcan symbol changes to the bottom of the context menu because of the location of the context menu on the screen. Bonkers. No idea why the UI committee would’ve okayed that one.
Oh that's so weird :\
You're right. I just checked. That's odd.
It's deliberate. It's the good-bad-good-bad release cycle Microsoft insists on. Windows 12 will be decent, then 13 will be horrible again.
Ah, so it's like Star Trek movies.
Yeah the new context menu is horrible. Fortunately it can be set back to classic, I think with a registry edit
I lost Windows fluency around 7. I have little desire to get it back even though I use it every day as a secondary system.
How many "control panels"? How many places are there to adjust audio device properties?
Also, every time you run something in Windows (whether it's part of the OS or an App) it can be a trip down memory lane, UI-wise. Oooh, this dialog is 2015 vintage! This dialog is styled like Windows 8! This one is from the XP era! Ohh, and that rarified dialog has controls that have not been changed since Windows 95!
There's still UI stuff that hasn't changed since Windows 3.1 minus the UI kit updates.
If you want a super bad audio-related journey, try fixing external speakers connected to a Linux box. It's abysmal, and 99% of it can only be done via the CLI. Nothing wrong with that... but for something so normal I expected more ease-of-use.
Around four.
I disagree with that. As an occasional user of MacOS, the new Settings app is quite bewildering. There are just as many dials as in Windows and sometimes requires a trip to ChatGPT.
And for reasons I don't understand, why is the window itself not resizable?
I'm a Windows fan (I actually really like 11) so I'm a bit biased, but I just dove back into macOS since 2014 and the settings app is truly terrible. The built in search barely works and the layout is so damn confusing. God forbid I install some remote desktop software, now I have to go to accessibility settings 5 simes and approve some permission that is strategically buried for only what I can tell as a way to thwart "normies" from enabling something via obfuscation.
It would be fine if the settings available were actually useful or at least could bring me to some tool that does it better. I get no meaningful report of what's eating my batter and why every time I open my MacBook it's dead. And if I want to change the actual resolution of my display I'm given just a list of scaling options pretending to be resolutions. Oh, want to set a specific resolution or refresh rate? You have to do some stupid kinger king foo of option control something _before_ you click on this dialog. I get the criticism about the Windows settings app and legacy power tools (I think this has largely been solved anyway), at least they exist and allow me some iota of control over my computer
To be fair, I agree with you that the recent OS X control panel changes suck shit and are awful, and get worse with every update.
It is resizable vertically but not horizontally as it doesn't make sense to resize the window horizontally considering the content of settings details panel (the right part of the settings window), you would end up with a lot of empty space if you were able to resize it horizontally.
You could say the same thing about the Windows Settings app, but it resizes in every way and it's very much size adaptable. In other words, UI components resize or become visible/invisible depending on the width.
I use both Mac and windows extensively and I'm not sure what are you referring to.
You can access most settings by Windows + "yourquery".
Using search as a UI is admitting the UI sucks.
It is indicative of a failure, not a solution in and of itself.
Just like System Settings in macOS! Always have to use keyword search in that thing.
FWIW, search as a UI isn't a bad thing, Cmd + Space is the main way I launch apps on macOS (or Win + "type whatever").
Search feels to me like a good compromise between memorizing terminal commands (including the correct set of parameters to do what you want) and navigating through a UI to find what you're looking for.
Search is fine as a one-off thing, but if you repeatedly have to use search to find some common setting, that's a clear UX fail.
To be fair, it's hard to say whether the Settings app is more broken in Windows or macOS these days. I think I'd have to give the crown to macOS here on account of search itself being more broken.
Why is it a UI fail? Honestly search as the default way of going to settings is my favorite development in modern OS design, I no longer need to memorize 3-6 deep menu trees to find a trivial setting.
For example:
I prefer keeping my hands on the keyboard, and typing cmd+space followed by mouse is so much faster than finding the right pixels to click through in menu trees when I want to adjust my mouse sensitivity.
I didn't say that search itself is a UX fail. It's not; it's great!
The UI fail is if search is required to find the setting every time you need it, because categorization and/or navigation is broken otherwise.
As to keeping your hands on the keyboard, that's an argument for having proper keyboard support in any UI, complete with hotkeys and shortcuts. The big difference between these and search is that the former is (if properly done), consistent and predictable. So e.g. when the app adds new things in the future, your existing key sequences will still do the same thing they did before.
To take your specific examples, if I do Cmd+Space, "mouse", Enter on my system, it will bring up LinearMouse, not system mouse settings.
Disagree with this. I use the search for everything. It’s just so much quicker than even a well designed UI.
On my iphone, I have one page of apps, everything else in the app drawer, and use the search all the time. It often gets what I want in one or two chars.
Hard Disagree, search is great for anything that is common, but not common enough to justify a shortcut / other accomodations
It also has the benefit of being roughly bilingual (English + Installed language) and being there even in machines not setup for you
I can get my mom's computer, fully set in spanish, and I can win + "query", into settings, programs and tools to setup whatever she needs
then Mac fails as hard as windows. there’s a reason search exists in the settings app on both MacOS and iOS. and there are plenty of settings that require “default write …” or editing some plist file or worse
I'm not sure I agree.
I admit I honestly have no idea where the system settings are located as I haven't pressed the start button in ages, but the same applies to MacOS as I would use spotlight there as well.
Using search as a UI means you can only find things that you know exist, but there are plenty of important settings that I've only discovered by actually navigating through the UI.
Just type "settings" then and you go to the main menu.
The point of this comment thread is that important Windows settings are scattered throughout many different interfaces beyond just the Settings app, and you can never be sure where to find what you're looking for, which results in a poor user experience. Off the top of my head, you have the Settings app, Control Panel, Device Manager, System Configuration, and Network and Sharing Center.
This doesn’t work on Windows because there’s half a dozen “settings” applications, which is the original complaint.
I recently discovered that I can change audio settings on a mac by using the opt+volume shortcut and it takes me directly to the sound panel. Now if I could only make it stay on the built-in microphone instead of always switching to the worse sounding airpods one.
> You can access most settings by Windows + "yourquery".
The search doesn't even work all the time. Sometimes it won't do fuzzy search, sometimes typing "bluetooth settings" will do a Bing search, some other time it will open a PDF, and so on.
It's fine if you stay away from the consumer releases. Windows 11 LTSC (based on 24H2) feels like windows 7. Most of the stuff you had to futz with powertoys and GPOs back then. That hasn't changed. I quite like it. It has been utterly boring compared to my recent Apple experiences.
How come you have to use Windows to play Minecraft? Are you using Bedrock edition?
I... think so? Whichever one works with Microsoft Realms, which is the $2/month solution I settled on after somewhat-getting a self hosted server to run for a little bit on my desktop.
I figured that I make a six-figure salary as a software developer, I can afford $2/month so that I don't have to fucking become a sysadmin for a game server my child depends on.
Just FYI:
There are two editions, Java and Bedrock. Java is the original, available on PC and Mac, and supports programming-like technical play and mods. Bedrock is Microsoft’s reimplementation, available on all devices except Mac, and supports emotes and microtransactions. Other than that they’re largely the same game, and buying either gives you both versions. Realms supports both, but a server is one or the other, not both. There are also other managed hosting providers for Minecraft (both versions), but Realms is probably easier and cheaper for you. Java version has performance problems, but mostly because Microsoft’s code is inefficient, there are a few mods (also written in Java) that everybody uses to fix performance without affecting gameplay.
I believe both versions of the game support realms, although I haven't tried it.
Hey, if we're already complaining about Microsoft products, can someone explain why the Bedrock and Java versions of Minecraft have not been made cross-compatible in the TEN YEARS since the Mojang acquisition?
(... speaking as another dad just trying to play with my kid.)
What does cross compatible mean in this context? They are two different games written in two different languages. I mean, they look like they are the same game, but they are not. Making one compatible with the other is a Herculean task. If not impossible.
I'm talking about network compatibility, so that a Bedrock client can join a Java server and vice versa. It's clearly somewhat possible because GeyserMC[1] exists. It's just ridiculous that it's a third-party addon.
[1] https://geysermc.org/
The games state is handled completely different between bedrock and java
I’d imagine mostly due to a lack of incentive on microsoft’s part. Like minecraft is literally the biggest video game to ever exist with, making 2 entirely separate code bases work while keeping all the features the same and preserving compatibility with over a decades worth of mods just so the mostly separate java and bedrock communities can play with each other is just not worth the risk. So many people play minecraft in so many different ways means that making even minor changes in gameplay can be huge sources of controversy, let alone major infrastructure changes.
They still exist separately today because the modding scene is completely different for them. Minecraft Java is the original and has a huge modding community based on decompiling and patching the game. Those mods are all incompatible with Bedrock because Bedrock is a separate reimplementation of the game for performance or whatever.
You said no word about the god damn candy crush ads. As if we don't have enough sources for cancer and other terminal illnesses
Every article about some issue with Apple MUST also include an anecdote about how you couldn't use Windows one time and how it's still worse than Mac.
It's the rule lest someone think you made a bad decision and you're regretting it. Even though it's an OS targeted for your grandmother, you must not let them see weakness.
At this point it's a joke. Either critique Apple or admit you can't without also bringing up some other OS. It's weird.
[dead]
Agree. Apple needs to clean up shop - MacOS has been egregiously worsening year over year. Some features like Universal Control and Continuity Camera are legitimately awesome, but they do not make up for the INSANELY slow System Settings app that gets harder to navigate with each release and which has >2s wait times for the right pane to respond to a change in the left pane. Steve Jobs would have fired the person responsible for that overhaul three years ago, it's embarrassing. Messages too needs a ground-up rewrite. Getting more elaborate emoji tapbacks doesn't make up for fundamental instability and poor syncing behavior. C'mon!
Absolutely. I love the work they have been doing on the backend, like PQ3 [1], but it just doesn't work for me when the Stickers and Emojis extensions on Mac leak several GBs of RAM and I have to terminate it several times a day to free up memory.
Another thing I dislike is that it stores the whole message history on the device. It's nice to have at times, but I send a lot of photos, which adds up in storage over time. I pay for iCloud, and store my messages there. Why does my Mac need to hold every single photo I have ever sent?
[1] https://security.apple.com/blog/imessage-pq3/
Local iMessage storage is debilitating. I have over 90GB of iMessage history that I don't want deleted. The keep messages for x days removes it from iCloud and the Mac though. Why?
System Settings is awful. Whoever decided to hide tons of settings inside innocuous "(i)" non-buttons should be kept far away from UX design. It's the hamburger menu of macOS.
It's what they have available in the SwiftUI toolbox of "shitty widgets from mobile operating systems" though.
Thankfully, that is also somehow the future of UI frameworks on all of their platforms!
> Getting more elaborate emoji tapbacks doesn't make up for fundamental instability and poor syncing behavior. C'mon!
Oh but you forgot about the “catch up” button they added 2 releases ago that takes you to the last unread message! …
… but only if said last message is within the N most recent messages, in the messages which are already “fetched” from local storage. If it’s more unread messages than that, the button is nowhere to be found.
Like they said “ok we can implement a catch up button but it’ll be hard to solve due to how we do paging.” “Ok we just won’t put the button on screen if we have to page then. Save the hard problem for the next release.” Then they just forgot about it.
Apple used to obsess over details like these. Now it feels like they're hoping we won't notice.
One thing that has been slowly creeping in is a little bit of a Microsoft-like "you will use our feature", like launching apple music every time I hit headphone controls, or nagging me to turn on reactions every time I start a video call. In some ways that's more annoying than the outright bugs, as they could choose not to be that way and market themselves as not being that way.
I feel your pain. I hate pushy upsells and promos. Also the cluttered settings App "Remember to setup Apple Pay" promos. I do value user education. They need to consolidate all of the feature promo services into a revised Tips tool that allow users to engage with new features at their own pace.
As a former Apple employee that left in part due to declining software quality (back in 2015!), and the relentless focus on big flashy features for the next yearly release cycle, I could not agree more.
I recently had to do a full reinstall of macOS on my Mac Studio due to some intermittent networking issue that, for the life of me, I could not pin down. Post-reinstall, everything's fine.
Also as a former Apple engineer....
I've explained in another thread how this kind of thing happens. It may be the same at other large companies.
Bugs come in (via Radar) and are routed to the team responsible. Ever since Jobs came back (and Apple became valuable again) it has also become very much top-down with the engineers, for better or worse, not calling the shots.
Just an obvious example — there are of course no engineers in the decision to make a "Snow Leopard" release or not. That is a "marketing" decision (well, probably Federighi). But further, even for an engineering team, they're probably not going to be able to make that decision even for their own component(s) either. Again, marketing.
So meetings are held and as it gets close to time to think about the NMOS (next major OS) the team is told what features they will implement. Do you think fix bugs is a feature? How about pay down technical debt? Nope, never.
Fixing bugs is just expected, like breathing I guess. And technical debt ... do what you can given your workload and deliverables. Trust me, many engineers (perhaps especially the older ones) want to both fix bugs and refactor code to get rid of technical debt. But there is simply not the cycles to do so.
And then what is even more insipid, the day the OS ships, every single bug in Radar still assigned to a team, still in Analyze, becomes a much much harder sell for the next OS. Because, you know, you already shipped with it ... must not be that bad.
I'd love to see a bug-fix-only Mac OS release. But I suspect that every time the possibility has come up, something like, I don't know, LLMs burst on the scene and there's a scramble.
> Ever since Jobs came back (and Apple became valuable again) it has also become very much top-down with the engineers, for better or worse, not calling the shots. Just an obvious example — there are of course no engineers in the decision to make a "Snow Leopard" release or not.
It's unclear how much explanatory value this has, because the Snow Leopard that everyone is pining for was during the Jobs era. After all, an Apple that goes bankrupt and out of business isn't going to make any software updates.
I find a stark difference between the Jobs era and the Cook era. Under Jobs, the early Mac OS X updates (Puma and Jaguar) came fast and furious, but then the schedule slowed considerably. Panther was 14 months, Tiger 18, Leopard 30 (delayed due to iPhone), Snow Leopard 22 months, Lion 23. Mountain Lion was the first release after the death of Jobs and came only 12 months after Lion. Thereafter, every Mac OS update came yearly, give or take a few months. That's a drastic change in release schedule.
Yeah, I should be careful to not make it appear as though there were so clear a delineation when Jobs returned. His software engineering team got to work reshaping MacOS (as we know it now) but he seemed to this software engineer to be focused on hardware and "strategies" initially.
Aqua, the new UI, came down from above soon enough. Drawers, toolbars were new UI elements that arrived. In time Jobs' designers were going through the shipping apps with these new UI elements with changes for the engineers to implement.
Certainly by the time the iPhone had arrived the transition to marketing (and design) calling the shots was complete.
Apropos Drawers: The may have looked a little bit silly back then but today almost every Mac app main windows has a big grey sidebar, so that in Exposé view almost all windows look the same. Drawers got an unfair rap, I think.
It's crazy that marketing hasn't worked out that quality and reliability can be spun as a feature. In fact, I remember with OS X, that was the baseline word-of-mouth feature when the comparison was made with Windows at the time.
"It just works"
> Just an obvious example — there are of course no engineers in the decision to make a "Snow Leopard" release or not. That is a "marketing" decision.
I think it is more that the decision to SAY Snow Leopard was a bug fix-only release was a marketing one. The reality is that release also sported things like 64-bit Intel ports of all apps, added Grand Central Dispatch (e.g. an entirely new code concurrency system) and included a from-scratch Finder rewrite.
I always saw these releases (I bundle Mountain Lion in) were all about trying to rein in excessively long release cycles. Short release cycles tend to not have enough time to introduce new bugs, while extended release cycles create a sense of urgency to get code in under the wire.
Now, release cycles have moved to be staged across a fairly predictable annual calendar. If there's an issue where features are getting pushed out 6 months or a year earlier than they should, that is a management and incentives problem.
Yup. Well-said. I experienced exactly this type of thing during every NMOS planning/brainstorming session I was a part of.
>Because, you know, you already shipped with it ... must not be that bad.
This hits right in the feels of any engineer at any company.
I don't even know what these big flashy features are anymore. Every year I get asked by staff "Can I upgrade to <latest major Mac OS>" and every time I tell them they can, but they won't see anything different. There's not even big architectural changes under the hood to improve stability or performance.
Short of it being a requirement to use the latest version of Xcode (once they bump the minimum in the following Feburary), and security updates stopping, there's been very little reason to actually upgrade.
>As a former Apple employee that left in part due to declining software quality (back in 2015!), and the relentless focus on big flashy features for the next yearly release cycle, I could not agree more.
Oh Thank You so much. 2013 I was already questioning on some of the features it keeps adding that were useless. Yosemite with continuity was the only useful feature in the past 10 years.
Yes. relentless focus on big flashy features for the next yearly release cycle was exactly what I felt like it was. And that was the big reason why I dislike Craig Federighi.
Edit: Thinking about it more, former Apple employee that worked during 2005 - 2010 is probably a lot more prestige than post 2015.
They need a Snow-IOS too.
- Ever since I've updated to the latest iOS 18, my watch complications(weather doodad) stop working randomly because they just lose the location services permission. Then in settings, the location services permission list acts like the weather app isn't installed.
- The new Mail app now automatically classifies your email, but still gives you the "All Mail" option. But the unread count badge on the app only works off of what they classify as your "Priority" mail. There's a setting to change that, so that it shows you the unread count of ALL mail, not just priority mail, but when you change that setting nothing changes. This is my biggest problem with new iOS.
- Keyboard sometimes doesn't get out the way any more when it should.
These are just off the top of my head. It used to be such a nice, polished experience. Their competition was just outclassed. Now, when my phone dies I'm going to have a good look at all the other options.
> - Keyboard sometimes doesn't get out the way any more when it should.
Depends on where you were seeing this of course, but this could very well be an app problem instead of a system problem.
Native UIKit/SwiftUI do a little bit of keyboard management for “free”, but there are many circumstances where it falls on the developer’s shoulders to do this. For cross platform frameworks, some do keyboard management others don’t even try. For web apps it’s a coin toss and depends on which of the gazillion ways the dev built their app.
It’s not actually that hard, usually just a matter of making sure that your scrolling content either resizes to match the keyboard-shrunken viewport or adding bottom padding equivalent to the height of the keyboard and then and adjusting scroll position accordingly, but it’s not unusual to see this partially or fully absent, especially on poorly built cheapest-bidder-contracted apps.
In modern UIKit it's as simple as constraining to the keyboard layout guide. That gives you full animation support for free as well, no more need to listen for the notification and manually set up animations with the same timing and curve. On iPads the keyboard guide can even help you avoid the split keyboard, it's really nice.
Of course SwiftUI gives you almost none of this control, forcing you to hope the magic automatic support works how you expect.
But then neither help you with any of the other interactions, like any background dimming you may want, or tapping away from the keyboard to dismiss. That has to be done manually.
Permissions needs a complete rewrite. Layers and layers of permissions screens. To get anything done takes 4-5 forward and reverse UI stack traversals
Absolutely. And turning off Siri's "Learn from this app" should not require the user to navigate to every single app's menu, when Siri has a top level page in Settings.
The division of per-app vs app list in general is bad.
I think they should just throw in the towel and duplicate settings. Meaning, we can turn off Siri learning from an app or from the Siri page. Or we can turn off banners from the app or the notifications page.
The recent Photos app update was a major regression.
my iPhone gets into a state lately where a pane will suddenly lose the the ability to _scroll_. it can happen in any app, but I see it a lot in Safari. Like, what is even happening, this is a fundamental UI interaction. The only way to fix it is to close the tab or force-quit the app. Super weird.
I don't think that's quite right. Snow Leopard was a lot of changes to a lot of the OS code base and wasn't great out of the gate, taking multiple dot releases, like all large-scale software updates do, to stabilize and bugfix enough to be "good."
There is no silver bullet, just a lot of lead ones and the answer to Apple's quality problem is to begin baking QA back into the process in a meaningful way after letting it atrophy for the last decade or so.
Hire more humans and rely less on automation. Trust your developers, QA, and user support folks and the feedback they push up the chain of command. Fix bugs as the arise instead of assigning them to "future" or whatever. Don't release features until they're sufficient stable.
This is all basic stuff for a software company, stuff that Apple seems to have forgotten under the leadership of that glorified accountant, Cook.
> the answer to Apple's quality problem is to begin baking QA back into the process in a meaningful way after letting it atrophy for the last decade or so.
As a former Apple employee of 13 years: Apple knows about the bugs. QA isn’t the problem.
A lot of people complain that their radar for some obvious bug isn’t getting noticed, and conclude that Apple must not be QA’ing, or not dogfooding their own product. This isn’t the case at all. I guarantee the bugs you care about are well known, and QA has already spotted them.
The reality is, they just don’t care. The train leaves the station in September. You’re either on it or you’re not. If you spent the year rewriting some subsystem, and it’s July and you have this huge list of bugs, there’s a go/no-go decision, and the answer is nearly always “go” (because no-go would mean reverting a ton of other stuff too, and that carries its own regression risk, etc.)
So instead there’s just an amount of bugginess that’s deemed acceptable. And so the software is released, everybody slaps high-fives, and the remaining bugs are punted to next year, where they will sit forever, because once we do one release with a known bug, it couldn’t be that important, right? After all, we shipped with it! Future/P2, never to be seen again.
An attempt was made to remedy this by pushing deadlines earlier in the cycle, to make room for more QA time, but that just introduced more perverse incentives: people started landing big features in later dot-releases where there’s less scrutiny, and even more tolerance for bugs.
The honest answer is that Apple needs to start giving a damn about the quality of what they’re pushing. As Steve once said at a pretty famous internal meeting, “you should be mad at your teammates for letting each other down like this”. And heads need to roll. I can only hope that they’re realizing this now, but I don’t feel like the culture under Tim works this way. People’s feelings are way too important, and necessary changes don't get made.
I think some people would be surprised how effective reaching out to apple is for squashing bugs. Three times now I've been assigned an engineer to pin point the bug I was experiencing, after which it was fixed in the next dot release.
By all means people should complain on forums (why not?), but a forum post complaining about some years-old bug isn't going to be anywhere near as effective as contacting apple's support or filing a bug report.
I'm not a developer, I'm just a regular user - so if I can get all this special treatment, so can you.
Yes, I am very surprised to hear that you've had such success with reporting bugs to Apple. That is very unlike my experience. I've had exactly one macOS bug that I reported fixed, and that required going to a WWDC lab, talking to a person on the relevant team in person, and having them dig the bug report out of the backlog for a completely unrelated team that it was incorrectly assigned to.
They would be surprised because it's not true, those years-old bugs in the forums have been reported many times to the official bug tracker, with reference number sometimes posted in those very forums.
You must be the lucky one, because other people have had horrible experiences with Apple’s Feedback Assistant: https://news.ycombinator.com/item?id=38164735
Interesting. Apple podcasters frequently rant about what a black hole Apple's Radar bug system is. We're talking hours-long rants in some cases. Luck of the draw, maybe? I'm not doubting you, just surprised to read it.
(It feels similar to how those same podcasters absolutely blast Apple Intelligence, while non-tech users I've heard from seem to love it.)
Adding to this, a solution might be enabling continuous releases and leaning into release channels could help in terms of getting more out to users.
In practice it's a challenge because the OS bundles a lot of separate things into releases, namely Safari changes are tied to OS changes which are tied to Apple Pay features which are tied to so on and so on.
It would require a lot of feature flagging and extra complexity which may reduce complexity.
Another way is to start un-bundling releases and fundamentally re-thinking how the dependency graph is structured.
I think they’re painted into a corner with WWDC. Everything has to be a crowd pleasing brain busting wow drop each year. I’m certain there are teams that design their entire workflow around the yearly wwdc. It honestly feels like an executive leadership problem to solve.
If that is a significant part of the problem, then moving WWDC from an in-person keynote attended mostly by nerds and glanced at by the media to an overproduced movie geared at the media and ordinary consumers first probably didn't help. They could've gone back to a stage presentation after COVID, but some of that transition had already been happening prior to that (I recall an increase in how many jokes/bits they were doing in the late 2010's, although that could just be my perception).
Appreciate the sentiment, but in my humble opinion, seems like they should lean into creating even better automated testing, because adding all the new bugs to their suite of automated tests would be a more certain way to decrease their chance of happening again.
But, in a sense, this still incorporates your idea, because the devs and QA must be given the mandate of finding these bugs, and also towards making the automated tests cover the bug's related test cases (as well as charged with improving the test code itself, which is often in a mediocre state in most code bases I've seen at least).
Sure, more and better of everything, with engineering, including QA, calling the shots on what's sufficient to ensure great quality.
Why do any of that? What they're doing has made them infinitely rich, and that's all that matters. /s
Being infinitely rich might also be the cause of the problem.
Well, you can only win playing the stock market (Wall St. is Cook's only real customer) for so long while your products deteriorate. Financializing Apple and eliminating its technical prowess opens the door for the someone else with contemporary technical strength to take Apple's users.
Snow Leopard was macOS moving so slowly people thought Apple were abandoning the Mac.
Apple changed how they tied OS updates to hardware sales in this era and this left a lot of Macs on Snow Leopard for half a decade. So people remember that last point update – which was as close to a low-term-stability release as Apple has ever had.
But to get there, Snow Leopard received 15 updates over 2 years and it was really just point updates to Leopard so it was more like 29 updates over 4 years without a major user facing feature. And this was after Leopard itself took over 2 years to develop.
If Apple did nothing but polish features and reduce bugs for 6 years, people would proclaim them dead. And they might actually be dead since their entire sales model is tied to cycles of development, promotion and delivery. For those of us who remember Apple getting stuck on System 7 between 1990 and 1997 and how the company nearly collapsed in that era: it would be a delay almost on that scale.
It didn’t have anything to do with Sarbanes-Oxley (that was iPhone/iPod touch updates), Apple just charged for OS updates back then.
Snow Leopard was notably cheaper than Leopard ($30 vs $130), Lion was $30 on the App Store, Mountain Lion was $20, then Mavericks and everything after have been free.
Snow Leopard did have a long life though, it was the last OS that could run PowerPC apps, also the last to run on the original 32-bit Core Duo Intel Macs.
Snow Leopard introduced GCD, which was a HUGE new feature. It completely changed how we wrote async code. It just wasn't a huge user facing feature.
Snow Leopard also introduced the Mac App Store (in a point release), which was a user facing feature.
I think the "zero new features" mostly meant "no flashy user facing features". It had a lot of new features for developers.
This is an interesting idea, and I am actually curious what Apple is going to do going forward. A "Snow Leopard"-esque release would be nice, but I think what would be better is an LTS release. Historically, you get a new Mac and you usually only get 5-6 years before they drop your model from the latest release. This has always made some sense to me, as after 4-6 years, you do start to feel it.
I bought an M1 Max that is now almost 4 years old and it still feels new to me. I can't really imagine a change that would happen in the next 2 years that would make this thing feel slow where an M3 would feel sufficient, so I'm curious to see if Apple really does just go hardcore on forced obsolescence going forward. I have a few M series devies now, from M1 to M3, and I honestly cannot tell the difference other than export times for video.
I can imagine some kind of architecture change that might come with an M6 or something that would force an upgrade path, but I can't see any reason other than just forcing upgrades to drop support between M1-M5. Maybe if there is a really hard push next year into 8K video? Never even tried to edit 8K, so I don't know. I'm guessing an M1 might feel sluggish?
Trying to use Wan2.1 to generate AI video or other various LLM or StableDiffusion style stuff is slow compared to other other platforms. I don't know how much of that is because the code is not optimized for M1+ Max (Activity Monitor shows lots of GPU usage) or how much of it is it's just not up to the competition. Friends on 4070 Windows PC are getting results many X faster and 4070 perf iss not even close to 4090
You need to run it under MLX, and AFAICT ComfyUI and the like are not really optimized for it (or at least not as optimized as LLM inference).
I don't feel like they ever used forced obsolescence with Mac's. When they dropped support for the latest OS on your machine it was usually because it couldn't run it. I recently updated some older Mac's and even a couple of OS's before support was dropped things got really sluggish. I imagine with the Apple Silicon machines the OS support will stretch longer than it has on the Intel ones. Maybe the higher prices are a hint they expect people to keep the machines in use for longer than before.
Opencore legacy patcher would be to differ.
> I think what would be better is an LTS release. Historically, you get a new Mac and you usually only get 5-6 years before they drop your model from the latest release
In fairness, Apple to do tend to continue to release critical security patches for older versions.
I suspect that it will be AI features that push Apple into deprecating older hardware. But I also hope that the M series hardware will be supported a bit longer than the intel hardware was. Time will tell.
I don't have any Macs or iPhones that can even run the latest software anymore. My absolute newest Mac is stuck on Ventura 13.7. On the other hand, I can get the bleeding edge version of any Linux distribution out there and run it on decades-old hardware.
Unfortunately, “decades old hardware” doesn’t give me the combination of speed, quietness, battery life and the ability to use my laptop on my lap without so much heat that it puts me at risk for never having any little Scarfaces.
Using an x86 laptop in 2025 is like using a flip phone.
You can at least get 90% of the same experience with modern x86 laptops. Just exclude anything that has a dedicated GPU.
> I bought an M1 Max that is now almost 4 years old and it still feels new to me.
How are the keycaps doing? Mine looked awful after about 2 years of relatively light use, developing really obvious ugly shiny patches (particularly bad on the space bar), quite a letdown on an otherwise great machine.
(Realised that you can actually buy replacements and swap them yourself, via the self-service repair store, so have replaced them once, but am starting to notice shiny patches again on the new set)
Still better than the butterfly debacle of 2016-2019. I have one for work that spends 99.9% of its life docked to a real keyboard and it still has keys that only work sporadically. Some of these keys probably have < 10,000 actuations on them.
Not OP but have the same Mac. Every key is shiny. Doesn't really bother me though because I touch type. Also clearly I favor hitting space with my right hand because only the right side is shiny.
If you have AppleCare they will basically rebuild your MacBook for ~$200. I got MBP M1 Max usb ports and top case replaced and a bunch of other stuff I didn’t even ask for but they replaced with new stuff. Felt like a new machine when I got it back.
They need to somehow start marketing effectively to gamers, because the GPU in your M1 Max is shit. Sure, it’s fine for mostly-2D UIs and the occasional WebGL widget, but for AAA gaming it’s just dogshit.
'Gaming laptops' with more powerful GPUs are generally awful, though. Even ignoring the state of Win11.
Yes, they can theoretically perform better, but only when plugged into mains power, and creating so much heat and fan noise that the experience really isn't good.
Don't think there's anything out there that will outperform the GPU of an M-series Mac without consuming way more power and producing problematic levels of heat+noise.
Sure, but this is another avenue to onboard people to the upgrade train. Sure your display is great, your CPU is great, the speakers are great. But the AAA graphics scale up every year and there are often big performance cliffs for new features on old hardware.
M1 Max @ 32 GB. I can run Shadow of the Tomb Raider with max settings at native resolution (3024x1964 px) and get ~60 FPS.
What about M3 Max?
Interesting take. I'm mostly not affected by that because I use except from the OS itself nearly no Apple software to be not trapped in the Apple golden cage ever. No photos, no Apple mail, no Apple maps, no Notes etc etc. and/but I also use no iPhone. But system settings is awful, at least I can search there to not wrap my head around it.
I actually see progress in things that matter for me as software dev like virtualisation and Docker support. And with frameworks like MLX I can even run image generation tools like FLUX locally on my Mac (search for mflux). Amazing! And Apple Silicone is a screamer... still cannot believe I have the fastest single core PC on Earth in my laptop.
I only thing I use is the calendar to see my personal and work Google calendars aggregated at the same time.
So far I'm happy with macOS. If the whole graphics industry (Adobe etc) would support Linux more I would even switch away to Linux but because I'm dealing with photography, color correction and a little video too I will never switch to Linux (the graphics system quality in macOS is way too good). Windows is unfortunately no go too because of the built-in spyware and ads in the OS (like WTF).
I consider Apple Intelligence also as a sort of spyware. I don't want to activate it ever (but it gets auto activated after updates) and I don't want it to download its stuff and waste space. If people want to use it: fine, but if I personally opt out, I opt out fully Apple!
> system settings is awful, at least I can search there to not wrap my head around it
When it works. Last time I typed “keyboard” in the system settings app, the keyboard settings weren’t part of the results. Ditto “mouse” or “trackpad”. Settings search has been utterly broken on around half of the dot releases for me. If it works, it’s only temporary and then it’s back to not working on the next update (or even reboot.)
Dogfooding is a thing right? Right?!
Working for two companies I see how in the small one people manually test their changes, try to break them, even having in-code tests. At the big corpo - noone cares. Tests are green? Release to prod, close ticket and take another. Clients complain? There are 5-6 layers of people before such complain can come back to the team.
I wouldn't agree with "less glitchy" than Windows. Currently Win10 is the best one if it comes to stability, but Microsoft is already killing support for it. Windows 11 have problems even with typing into Start Menu search - basic functionality. Randomly takes input or not. So I think we are lowering the bar and the market agrees how low it should go.
Absolutely drives me nuts that I can't remove the music icon from the systray in Mac. And ditto on all the spotlight issues.
Also, why does it take 10 seconds for activity monitor to show information? The list goes on.
If only Mac hardware officially supported Linux, I would never touch that macOS again.
> Also, why does it take 10 seconds for activity monitor to show information? The list goes on.
That's not a bug, but a feature. Under View -> Update Frequency, you can change it.
Can't you just drag it off?
It absolutely does. There are so man quality of life issues that plague the platform that don't get addressed year after year. I'm sick of albums syncing to my phone losing their artwork. With Sequoia, I'm sick of running multiple network extensions (you know like Tailscale and Little Snitch) causing network issues.
I'm sick of the random Safari crashes.
When I started using OS X, one of the biggest draws for me was first-class native keyboard shortcuts support that was consistently followed and applied by all apps (first party and otherwise). So you could be sure that a shortcut for search across all contexts (global) would work just as well as the shortcut for a contextual search within any app. No one writes great third-party native apps anymore and even Apple's own apps completely disregard this part of their heritage. Just try searching across the AppStore, Apple Music, and the legacy Finder.
For newer Apple apps, sometimes the keyboard shortcuts simply don't exist. I believe part of the problem here is the deprecation of AppleScript, which means there's no incentive to spend time on consistency, and the other part has to do with organizational indifference towards all the wonderful UX innovations from the past.
What Apple has successfully accomplished, in collaboration with other 'big tech' companies is drastically reducing user expectations from their software. I wouldn't completely blame the AppStore's forced race to the bottom for this alone. There is still a huge market for tasteful apps that cost more (even sometimes with obnoxious subscriptions), but if even Apple isn't leading by example, why waste time on it if you could just build another simple note-taking app.
…and speaking of Snow versions, bring back those cool welcome videos when you first purchase a Mac!
I miss those. Unfortunately, since Apple doesn't do the whole space theme anymore, you'd probably get some really boring drone shots of California at best before a Setup Assistant faded into view from behind a Redwood or something.
Most Mac admins might disagree.
I had to hear those goddamn songs so many times, often all at the same time.
I'm weird though, and never stopped liking it.
Doo-do-doo-doo-do...
That assistant had better be Clippy. "It looks like you're trying to setup your Mac, Would you like help?"
World be cool if they commissioned artists to make music for them again. The fact that they secured Royksopp and Sofa Surfers is still impressive
With the hopes that Apple engineers are scanning this discussion:
- Using the iPhone to scan documents from Finder has recently stopped working on the second scan. I need to restart my phone to get it to work again.
- iPhone mirroring is terrible: laggy, UI glitches, drops click events, scrolling is a nightmare. This is when it actually even manages to connect.
- Often, with Airpods on, lowering the volume, shutting down the iPhone display and putting it in my pocket quickly enough will entirely turn off volume. If you happen to increase the volume instead, you'll get blasted with maximum volume in your ears.
- Use vertical tabs on Safari for one day. You'll see it actually crash a few times. Not to mention the UI glitches. - Open the App Store on macOS. It first opens empty, then the UI controls show up, then it flickers the entire UI. I am convinced it's a Web app.
- In System Settings, most of the sections you click have a delay in rendering. Nothing feels snappy in that app. I can actually click 3 sections quick enough for the second to never even be rendered.
- Sometimes dragging an application from the Dock popup menu into the Trash does nothing, even though it appears to have worked. I often find that it wasn't deleted at all, that I have to open Applications folder in Finder and hit Cmd-Backspace to delete it.
Good idea. I’ll add some that have annoyed me for years just in case:
- On iOS, the alarms app breaks down once you get to ~250 alarms. You can try to add/delete alarms and it’ll appear like they changed, but the change wont be saved. I can’t use the alarms app now and can’t fix it as I can’t delete alarms. By the way, would be nice to reuse alarms when creating at the same time as an existing alarm so you don’t end up with 250+ alarms in the first place.
- On iOS, the notes app breaks down in long documents (~10 pages of text with bullet points). When writing beyond that, some text will sometimes disappear only to reappear when you type some more. Other times, the cursor disappears. This only happens in long documents. All English text, mainly bullet points, often with some text pasted in.
It’s shocking to me that my iPhone 11 Pro can play gorgeous 3D video games, but can’t handle 250 alarms or 10 pages of text..
I still remember the Snow Leopard update wiped my drive clean. Talk about most solid software releases Apple ever put out. Period. Yeah right.
https://arstechnica.com/gadgets/2009/10/apple-owns-up-to-odd...
I feel like they're trying to build too many platforms most of which have become quite large. macOS, iOS, iPad OS, visionOS, watchOS, tvOS. The fact all of these systems are quite tightly linked in terms of features/syncing makes it difficult to navigate. If you want to ship every single year you need more developers, but that might make the collaboration between the systems more difficult. They need to move away from the one year cycle. It's a stupidly short period of time to ship a whole OS (or 6 whole OS's). If you want to keep them all in sync switch to two year cycles and decouple some of the apps from the core OS (e.g. Music, Safari, etc) so they can be updated as necessary outside of the cycle.
The platform teams at Apple don't really work that way. My (limited) understanding is that they share a fair amount of core code, but each went their own way for a while and have recently started getting nudged back together from a UI perspective -- unfortunately, the iOS style guide seems to have won, and many decades of desktop UX is being thrown out with the bath water.
Apple needs to make all of the accessory apps (photos, music, news, maps, mail, etc..) uninstallable and able to be added later if needed through their app store.
Every MacOS update brings along this bloatware that is not easily removed.
That's possible in EU (prob also in EEA). Of all you mention only Photos shows a dramatic sliding modal asking you if you want to remove it; data library will stay but features like hidden, recently deleted photos and "memories" won't be available.
Most mac os feature updates are just updates to all that bundled stuff. I use none of that. I use my laptop to run various OSS developer tools, browsers, etc. 99% of it is available on Linux. And I have moved my workflow to a Linux laptop a few years ago. I went back for performance reasons; not for feature reasons. I can do that again. There's nothing really stopping me. But I like the Apple hardware.
This is also the reason that I don't mind the current version of Mac OS. Yes everything you mentioned is a bit meh. Which is part of why I don't use any of those applications. So I don't care. I've disabled Siri. Never used Facetime. Maps, Numbers, and all the other of the dozens of things they bundle: I never touch any of it. I don't need that stuff and when I do, I use alternatives. I have an Android phone so all of the IOS integration stuff is redundant to me as well. They've not locked me into their ecosystem. And I like it like that. I don't allow myself to be locked in.
As a work horse for doing development MacOS is still a fine OS. It does the job. Most updates of the last 10 years or so have been minor window dressing that you barely notice, some under the hood changes, and misc tweaks that mostly fall into the "whatever" category for me. For me the annoying thing is just having to sit through these lengthy updates. I keep postponing them because it's never convenient to take an hours long break when it prompts me.
And I don't really get much out of these updates. To be honest, I can barely tell apart the different versions of their OS. The main notable visual change seems to be the desktop background. Which is usually hidden by applications. So I rarely look at it.
Not just their software, the hardware is beginning to be get pretty unwieldily complicated.
From an OS / software perspective:
Have a "core" macOS that has none of the apps / integrations are baked in at an OS level.
You install the things you want, how you want - eg iMessage, Mail, and then iCloud if you want to sync it, and Photos etc.
Have a slim, fast, stable OS that I can just turn on and get going with.
From the hardware perspective, I made this comment a little while ago but what I want to be able to choose is:
- Device: Watch, iPhone, iPad, MacBook, iMac, Mac
- Size: Mini, "Normal / Default" (Air), Max
- Processing Power: "Normal / Default", Pro, Ultra
- And maybe storage.
That way I can go and buy a MacBook Pro (13"?), or a MacBook Max Pro (15"), or a MacBook Mini (11"), or a normal iPad Mini Ultra, or an iMac Mini (21"?), or a Watch Pro, or a Mac Max Ultra etc.
Device + Size + Power.
It's kinda there, but not quite.
I agree. It seems ridiculous that an app like Messages is considered so much part of the OS given what it does. I don’t use it, I don’t care about it, but it seems like it could be a regular app that updates independently of the OS, along with Maps, Notes, and so on. So many macOS “upgrades” nowadays seem to be Apple tinkering with such apps rather than the actual OS experience.
I did said similar thing weeks ago regarding all shenanigans with W11: every software should include two paths of installation/OOBE: default "express" where vendor shoves you "the experience" and customized "expert" where you select features YOU want. And either way allows you to change system afterwards. We had that in Windows years ago but then it was removed; some Linux distributions do offer package selection beyond the default set.
There's no need for a separated core version - just give back control to the user. But honestly, I don't know what would need to happen so we could get it - it feels like it's a lost cause against corporations. There's of course Apple-EU situation where you can remove applications, set defaults, install additional app stores but this is still limited to that market and happen way too late and too slow.
Why do you want to destroy computing for everybody else just to make a very small group of hackers pleased? Can't you be satisfied with Linux or Windows or BSD? Let normal people have at least one platform that is usable for them.
How is letting users to disable bloat a bad feature?
For example, can you remove Chess from MacOS? Nope! Why? What I found on Reddit, it seems because it's integral part of MacOS somehow and I am a bad person for even asking, somehow.
"iMessage, Mail, and then iCloud if you want to sync it, and Photos"
Those are not bloat, those are core features of a computer for 99% of users who are not developers.
There already exists a platform which is unusable for normal people and great for developers, it's called Linux.
There already exists a platform which is great for corporate and hell for normal people, it's called Windows.
So why aren't we allowed to keep the only computing platform which is good for normal users?
Chess is 11Mb, you must be desperate for space if you want to remove that.
Bloat adds up.
Each part alone might not be large but together it starts to become an annoyance.
Also bloat is not just about disk space but also cognitive load and clean interface.
I'm still waiting for them to remove Launchpad (which seems like a half assed step towards unifying desktop and tablet systems), and I've yet to meet anyone that uses their weird new desktop management system, the thing with the windows on the left side. That just reminds me of the GUI experiments they did in the 2000s, with 3d environments and whatever Ubuntu (or gnome/kde/whichever) tried to do.
I'm hoping they're gathering usage analytics and will overhaul unused features.
Caveat, I'm probably not their average user, I do almost everything via Spotlight. I don't even use the bottom menu thing, it automatically hides and I only use it when I accidentally hid a window.
Launchpad has to be one of the worst ways to find and open applications on a Mac. That new window-managing system is honestly so unintuitive, so bad, and so bizarre that its mere existence feels like some sort of practical joke.
I wish that in the next version of macOS, they would strip away all those useless features and systems that they've shoehorned over the past two decades and have the OS look like how Panther or Tiger did, while taking up less than 10 GB of space on the puny SSDs that they ship their machines with.
I appreciate that they did UI experimentation and stuff but... not to the end user's expense. I wonder if anyone at Apple themselves actually use these features.
> In the 22 years since I became a “switcher”, this is the worst state I can remember Apple’s platforms being in.
Indeed, I remember three times when Apple went a bit overboard on the feature front, but dialed it back and made some of the most stable and useful OS versions:
OS 8.5/8.6 pushed a bunch of features and were the last big pushes pre-OSX, but then OS 9 fixed a TON of bugs, and added a few smaller quality of life improvements that made running 'Classic' Mac OS pretty good, for those who were stuck on it for the transitional years.
Mac OS X 10.0 rewrote _everything_, and especially 10.0 was _dog_ slow, with all the new Quartz graphics stuff in an era where GPU accelerated 3D display widgets wasn't quite prevalent. 10.1 patched in a bunch of missing features (like DVD Player—it was still a pretty useful tool back then), and fixed a couple of the most glaring problems... but 10.4 Tiger was the first OS X release that was 'fast' enough OS X was a joy to use in the same way OS 9 was at the time. At least on newer Macs.
And then of course Snow Leopard, which is the subject of the OP.
macOS 13/14/15 have progressively added more little bugs I track in my https://github.com/geerlingguy/mac-dev-playbook project; anything from little networking bugs to weird preferences that can't be automated, or don't even work at all when you try toggling them.
That's besides the absolute _disaster_ that is modern System Preferences. Until the 'great iOSification' a few years back, Apple's System Preferences and preference pane were actually a pleasure to use, and I could usually remember where to go visually, with a nice search assistant.
Now... it's hit or miss if I can even find a setting :(
Settings is not that bad. It's _awful_, yes, since it broke the panel design we had since the NeXT days, but for me the real annoyance is the way Apple progressively, inexorably broke desktop automation to a point where they now effectively painted themselves into a corner regarding having enough of a foundation to make Apple Intelligence useful (https://taoofmac.com/space/blog/2025/03/14/1830).
That said, I expect things to get worse as they manage to converge their multiple platforms in exactly the wrong way (by dumbing them down across the board even as people keep hoping they'll make iPad OS more useful, etc.).
But at least we still have Safari, Apple Silicon is pretty amazing and I can survive inside Terminal and vim. For now.
Minor nitpick about early OS X
There was no acceleration (even 2D!) until 10.2
Well, actually, there was. I was doing OpenGL stuff at the time on a Bondi iMac that barely ran early OS X and distinctly remember that.
Mac OS X versions before Jaguar supported GPU accelerated applications, but the windows were composited in software which caused severe performance problems. Jaguar introduced something called Quartz Extreme, where the windows are treated as OpenGL surfaces and the window contents are textures mapped onto the surfaces. This made OS X significantly smoother on computers with a fast enough GPU and enough VRAM to support it, as the CPU didn't have to spend a bunch of time copying all the window contents to the framebuffer.
I don't think it does. Long term Apple user here (since 2007). I'm typing this on a 5 year old pile of junk with Windows 11 LTSC on it. The (M4) Mac is sitting next to me acting as an SMB server until I can be bothered to get all my stuff out of it. It's just tiring using a Mac these days. It's difficult to explain but everything feels slightly frustrating. The nice things are really nice. The whole experience is quite nice. Until you hit a problem. Then it's a complete pit of pain and misery and there just aren't enough ways out of it.
Had a few issues with iCloud syncing and data loss as well and what with being based in the UK and the general problems with geopolitics and the cloud I figured I'd try and get as much stuff out of iCloud as possible. Well there's not much advantage now. Most of it is in the ecosystem tie in, not the hardware. And on top of that the provisioned services such as Apple Music are just pain for me on a daily basis. My entire music catalogue disappeared in a puff of smoke when I was offline for nearly a week. The one thing I wanted it for!
So back to the PC. I ran out of disk space on the (soldered in SSD) Mac. I can't delete anything and macOS has leaked out about 20gb suddenly. I don't know what this is other than about 5 gig of it is Apple Intelligence despite telling it to fuck off. So it's late Friday afternoon and I need to get something done so I can have a clear weekend. I dig in the junk cupboard and find a couple of hard disks but no way of connecting them to the USB-C only Mac. Amazon solutions aren't available for delivery until Sunday. There upon I discovered the kids' "covid work PC" for when they were home studying. Despite the acceptable 16Gb of RAM it only had a meagre 256Gb disk in it. No worries. Opened it up and there's a hole for an SSD in it. It now has +500Gb SSD. Brilliant. On goes windows 11 LTSC. I'm back up running R in under an hour and have transferred all the data over.
I never went back. It feels better here. This thing is a swiss army knife. And extension of me. Not the other way round like on the Mac. The Mac feels like it feeds off me: both cash and energy. Apple need to fix that.
> Long term Apple user here (since 2007)
That would be medium-term user. Long-term would be people like me that have been using it since 1984.
I have a collection of Macs going all the way back to 1984. Even the newest one hasn't been turned on in three years.
My daily driver is Windows Server 2016. But it has VMware Workstation so there are lots of virtual machines for my work, including Linux.
I am so tempted by the new M4s. Amazing piece of technology. So sad about the operating system though. Every year I say I'll wait for a quality Linux port.
> macOS has leaked out about 20gb suddenly
time machine?
The “Snow Leopard effect” is more about the transition to Intel from PowerPC than the OS itself.
And maybe I’m a minority but the latest macOS is not worse than previous editions, for instance I use Sequoia on a M1 Mac but also 10.4 Tiger and OS 9.2.2 on a PowerMac G4 (MDD, 2x 1.2Ghz with 2Go of RAM) and the stability is not worse on Sequoia than Tiger or 9.2.2, in fact I have encountered more crashes in 9.2.2 and Tiger than Sequoia and all macOS 11+ (except Big Sur who has rough edge on beginning on M1 device)
See my blog post "The myth and reality of Mac OS X Snow Leopard": https://lapcatsoftware.com/articles/2023/11/5.html
TL;DR What people remember fondly is not Mac OS X 10.6.0, which was in fact very buggy, and buggier than 10.5.8, but rather later versions of Snow Leopard after almost 2 full years of bug fixes.
See also "Snow Leopard bug deletes all user data": https://www.reuters.com/article/lifestyle/snow-leopard-bug-d...
The yearly release cycle is the problem. Apple needs "another Snow Leopard" only in the sense that I mentioned above, "almost 2 full years of bug fixes", although at this point, Apple has more than 2 years of technical debt.
Thank you, the nostalgia for a 15-year-old OS release, which absolutely was not great out of gate, is strange.
My recommendation for people who don't absolutely need the latest features: Upgrade to the previous version of macOS when the new version is released. Sequoia is incredibly reliable 7 (soon to be 8) updates in.
> Sequoia is incredibly reliable 7 (soon to be 8) updates in.
I disagree with that part. ;-)
We wouldn't even be having this discussion right now if today's updates were incredibly reliable.
> later versions of Snow Leopard after almost 2 full years of bug fixes.
This is what’s being asked for in the article.
I disagree. From the article: "The same year Apple launched the iPhone, it unveiled a massive upgrade to Mac OS X known as Leopard, sporting “300 New Features.” Two years later, it did something almost unheard of: it released Snow Leopard, an upgrade all about how little it added and how much it took away. Apple needs to make it snow again. Snow Leopard did what it was made to do. It was one of the most solid software releases Apple ever put out." This gives the impression that it was solid out of the gate, which it was not. And the next paragraph specifically mentions "2009’s Snow Leopard". But later Snow Leopard releases were in 2011.
Grandempire is right on my overall sense in the piece, though perhaps I should have made its ore explicit. I actually faired quite well with 10.6.0. But, the lack of push for a yearly set of headlining features did allow the OS to age quite well in the years after, too. It's the drumbeat of what 10 stunning new features will be unveiled each WWDC for each platform that means past features rarely get the continued polishing they need to shine.
the myth has indeed become everyone's reality
In my own experience, I have noticed that Apple's software 'breaks' more on older hardware, be that Mac's, iPhones or iPads. For all the credit apple gets for supporting older devices, those devices are definitely not treated as first class citizens. For example, the touch keyboard on my (work) iPhone 12 Pro works decidedly worse than on my (private) iPhone 16 Pro. The error rate is much worse, and I believe it's due to the amount of useless features that get added with each new installment of iOS.
Whether that's intentional or not (I believe it is), Apple should focus more on delivering a stable experience, on both new and old devices.
I echo the sentiment a lot of people have already expressed. That is, using Apple products is like being a junkie. You need to use their products because there is no real alternative, but you feel kind of dirty because of their practices.To me, that sounds like it should be a huge red flag for Apple execs.
Adding my comment as reply as well as it is relevant:
---
I've been holding over and running 10.5 on my iMac 2019, but then in the beginning of the year had to upgrade to Sequoia (due to software dependencies).
Of course this is just a correlation, not necessary a causation, but within a month the iMac's internal SSD was corrupted to the point that it was unrecoverable, and my 40GB RAM corrupted.
So, yeah, at the very least not sure how much testing went into Sequoia for non Mac Silicon macs.
Quite disappointing considering how long a normal Mac's lifetime used to be, which also justified its high initial hardware price.
The RAM got corrupted?
This is not only applied to Apple's software. The entire software and hardware market including iPhone, Samsung Galaxy, Windows, etc. is pressured to release new products with more and more features every year, advertising those new features to facilitate sales. The result is, what was once a simple and cool product has become heavily bloated with unneccessary features.
The Nero Burning ROM/ACDSee disease is how I like to call that. These were simple once too but quickly degraded on quality, got bloated with stuff nobody ask in the first place
It's curious how the author mentions iMessage. No Apple user among my friends in Europe uses that. Zero. I guess in the US this is very much not so.
It's all WhatsApp, Telegram and Signal here, nowadays. I.e. they wouldn't know about bugs in iMessage as they never open it.
I'd be curious to hear about other regions of the world. Do people there use iMessage?
I think it's different from country to country, here in Sweden I think iMessage is reasonably popular, and people here generally go to Facebook Messenger rather than WhatsApp when it comes to cross-platform communication.
iMessage is a very limited and glitchy app when it comes down to it.
Just from the top of my head: no E2EE by default. Gifies are restricted (and censored). Reactions are clumsy (there are two rows of different kinds of emojis to choose from now). Adding photos or sharing location is complicated compared to Signal or Whatsapp.
Search is ... well, I hope you don't really need to find anything. Delayed notifications on macOS for no apparent reason, and in 2025 you can still end up with multiple entries for the same contact...
If we look at OSX Releases [1] ; from OS X 10.10 Yosemite in 2014. The only useful feature for me was Universal Clipboard. That is 10 years of macOS and that was about the only user features.
While the 10 years have some security, performance, drivers, file system, refactoring going on. Most of the user features were useless.
And I spend 90% of my time inside Safari, and yet Desktop Safari is still shit after all these years.
I am not excited about 99% of new macOS user features. Most of them are features for features sake. Just continue the macOS engineering work, and for once pour more resources into Safari and allow Safari support on older Mac system.
[1] https://en.wikipedia.org/wiki/MacOS_version_history#Releases
This alone says a lot about Apple's software "prowess", i.e perennial customer hostility combined with clear incompetence, (in which their "core" customer base has by now becomes participants in some kind of Stockholm syndrome scenario), that an attempted de-shittification of their OS is being hailed as (nostalgia tinted?) greatness :)
> Apple’s iMessage and SMS tool is an essential app for communication for me and, I suspect, the vast majority of Apple users.
For the majority American Apple users, sure. But I myself hardly ever remember that this app exists.
The thing that drove me nuts in particular in Sonoma though, is their "improved" text fields. Where it would show the stupid little popup with the active keyboard layout icon next to the cursor. Clearly made by someone who doesn't actually need to use multiple keyboard layouts (gosh do I envy those people). But at least I could disable it with a defaults write command.
Oh and Mail, yes, it would sometimes stubbornly refuse to load new messages, or delay them by minutes. It worked fine the previous 10 years. It would've been free to just leave it alone.
> Oh and Mail, yes, it would sometimes stubbornly refuse to load new messages, or delay them by minutes. It worked fine the previous 10 years. It would've been free to just leave it alone.
Oh man, Mail is almost comically bad, to the point that I occasionally miss messages from people since they're drowned in crap. A native version of Google Inbox that is not Google-owned would be enough for me. (or whatever version/implementation that integrates nicely with my devices)
> But I myself hardly ever remember that this app exists.
As a counterpoint, I myself use it everyday. I’m not American and most people I know don’t have iMessage. I still prefer it to using SMS from the phone. And yes, I do agree with the author that the app is buggy.
Well, I use an Android phone :D
But I also never chat over SMS with actual people. It's just not done any more by anyone I know. The last time I sent an SMS was probably several years ago. It's 99.9% various confirmation codes and other notifications for me.
Apple needs a Snow Apple. Fire Tim, bring in some real change.
Careful what you wish for.
IDK. I'm to the point I kinda would rather they fail and I migrate over to android and continue to use Linux than they keep stagnating.
If the new wave ushers in some real innovation and vision, then it was worth the gamble.
Last week, I switched to a Mac for the first time in my life after using Windows and Linux for around 30 years. Naturally, I hate a lot of things due to old habits, and the shortcuts constantly confuse me. But what really surprises me is the number of obvious bugs in common workflows. At least five times a day, I catch myself thinking, "There's no way this is actually broken." I didn't expect macOS to be even buggier than Windows.
That said, the hardware and the absence of Windows' user-hostile nonsense bring me endless joy. I don't think I'll go back to a PC (the Mac feels like a different class of quality) but to be honest, I expected more.
Off the top of my head I'm mainly thinking about `cut and paste` in Finder, that's a very common one people complain about, but other than that I'm curious what you're referring to if it's happening five times a day with new things, any chance you could outline some examples?
Examples just from today: Window snapping (or whatever it's called on mac) stops working until restart, keyboard type detection gets broken because it thinks my mouse is a keyboard so suddenly " and > are replaced, title bar disappears then the apple logo is halfway off screen when it randomly comes back.
I took a Mac at my current job since I really don't like Windows and I figured I would probably be able to hack it. I use Linux for all my personal stuff, all I need is bash and a browser, yeah?
Pfft. Nothing works, and a patronizing, laggy OS that actively tries to fight me at every step because it knows better than me.
What a joy. I'm sticking with Ubuntu/Fedora and having to figure out a driver issue every once in a while.
It is literally insane that when I search for Photos on iOS I can't zoom in to make the thumbnails bigger. As an approaching mid-40s person this is untenable, even worse that it DOES let you zoom in prior to search.
Photos definitely regressed in last release. I like change and new things (AI searching/tagging photos is extremely useful) but when they changed it I realized how important my muscle memory was for that app and features like pulldown iCloud sync/status seems to be gone and other small things changed in annoying ways.
The lack of filters in things like Photos or the iCloud version baffles me. Tools that would be effective and far more useful than half of what they add instead.
I find it incredibly developer hostile as an OS now. I don’t want to have to type a password in to use a debugger. I want to be able to download software and run it as I want, whoever wrote it, without them having to sign it. All that does is push people away from supporting Macs, particularly if they’re learning and don’t want to shell out £99 for a developer license. And you can see that because the Mac ecosystem has become dramatically less varied and stagnant.
For me, I hiched my wagon to the Apple team, years ago, and have held on, through some truly disastrous times.
I can't predict whether or not they will get past this, but I'll keep hanging on, anyway.
The code quality (the bits they let us see), however, seems to be going downhill, as is the quality of the documentation. These are things that always held up, in the past.
It's fairly discouraging. I suspect the quality of their hires has been going down. I'm not sure what it is, they want, but it doesn't seem to be quality.
Hard to disagree. You would think for a company obsessed with performance per watt and battery life that every release would be as fast if not faster that its predecessor and more efficient to boot.
What Apple needs, is to fix that weird bug where my mouse cursor stops responding to what it's hovering over. How does something so fundamentally broken make its way into an OS?
I like how my cursor randomly enlarges itself for like it's moving between a high-DPI and low-DPI mode despite being entirely on one display and nowhere near the edge between my internal and external. It gets big for maybe 1 second and goes back to normal.
> It gets big for maybe 1 second and goes back to normal.
That sounds like the "shake to locate" behavior and I would be totally lost(heh) without it since I have 2 4K monitors plus the onboard display, and that black cursor gets lost very easily. Shake the mouse, get big cursor, find cursor, be happy
It appears that one can disable it if it bothers you enough to comment about it: https://support.apple.com/guide/mac-help/make-the-pointer-ea...
And while there, I learned that I can change the pointer color, so hopefully everyone has learned something valuable today :-D
Oh, neat. I had no idea it was intentional but I can see how that would be a useful feature. It felt like a bug to me because it felt disconnected from any intentional cursor movement on my part.
I believe it’s also optional, so you can turn it off.
this drives me insane, I only ever encounter it in Safari, but I spend most of my workday in Safari as a web dev, so it might be the reason why.
I don't think it's just an Apple thing. I think it's just a big company thing. For example, the YouTube app has so many errors in the very common path, such as opening comments on channels and so on. I think after a while big companies simply become hollow from the inside and self-combust. Just like large animals have a cancer protection gene, I think there is a max size companies can get before they sell combust and they do not have a cancer preventing gene.
The article is spot on and articulated my feelings exactly. I too became a loyal Apple supporter nearly 20 years ago because "it just works". Sadly, I no longer feel this way. The operating systems on my Macbook, iphone and iPad have consistently gotten worse with each update over the last 5-10 years. Apple is losing the magic on the software front.
Apple got hooked on money. Behead everyone at the helm, let some fresh air in.
> I am not suggesting Apple has fallen behind Windows or Android. Changing a setting on Windows 11 can often involve a journey through three
Love how you can't find a critique of Apple without the person feeling the need to throw shade at Windows. They need to constantly reassure themselves and other fanboys it was the right decision.
And for an OS that's geared to your grandmother it sure does seem to shit the bed often.
You could argue that macOS development is too slow, not too fast and in need of a maintenance year.
Basic OS features have fallen way behind in term of UX - and of vision. Managing files and searching for information have become a chore compared to most internet- or llm-based services. Even a bug-free Finder or faster Spotlight would not bridge that gap.
All apps listed in the article feel similarly lost behind - Mail, Messages, Photos. The only exception is System Settings that does definitely need a snow version.
This is obviously true for other platforms as well.
We are possibly lacking a leap forward. Not faster horses, electric cars.
An obvious root cause of this is the lack of newcomers to the OS again. It's an oligopole that has no interest making things much improved.
> In an era when people still paid money for operating system upgrades every few years (anyone else remember standing in line for Windows 95?)
No, and I would have been too young to purchase it.
But I'd be surprised at the idea of massive demand for an upgrade to Windows 95. What we did was buy a new computer that had Windows 95 on it. Computers used to go out of date very quickly.
We kept our older computer that ran DOS. (It had Windows 3.1 installed, but the only reason you'd start that was if you wanted to play Solitaire.) It continued to run DOS just fine.
Believe it or not, it was a huge deal. I went to two launch parties -- one hosted by CompUSA and one by a local place -- the night it came out. This wasn't in Silicon Valley, but in the U.S. Midwest (St. Louis, MO). Hundreds of people stood in line at midnight to get the first copies at the two places I went to and the same thing happened all over the world. (Of course, CompUSA also had a whole display of upgrades to get your computer running better if it wasn't ready for Windows 95.)
The same sort of late night excitement existed around each early Mac OS X release, incidentally.
The value just wasn't there in giving your old computer a software update.
Most notably, that computer with Windows 95 on it also had a CD drive.
Well, computers were expensive enough many of us just kept upgrading parts to make something like Windows 95 work. My 486 slowly morphed into a Pentium system with largely the same parts over the course of three years during that time. But Windows 95 worked great on my 486 -- it felt like a great upgrade at the time.
I know it doesn’t affect a lot of people, but pasting in hex mode in Calculator broke in Sequoia. Previously, any number pasted in hex mode was treated as hex (as expected) even if the number consisted only of decimal characters (say 20, which would be decimal 32). Now numbers pasted with only decimal characters are treated as decimal (pasting “20” turns into 0x14) and numbers with with at least one alpha hex char is treated as hex. The workaround is to prefix the number with “0x”, but that’s not always practical.
I think they might have fixed this in the latest beta.
Snow leopard was my favorite operating system ever. I used it on my first real computer, a horrible Asus eeepc netbook and it worked flawlessly. Best hackintosh I've ever used. Of course I used it on official hardware as well but it brings back fond memories.
https://www.svenbit.com/2011/02/install-hackintosh-on-eeepc-...
It's not even that Snow Leopard was so great. It's that what came immediately after was so poor. Lion was noticeably janky. Things seemed to improve again with Mavericks becoming quite stable after numerous point releases. Then there was a glimmer of hope around High Sierra/Mojave/Catalina, but since then it's been steadily downhill.
Huh, I was actually on this page a few years ago, but iOS and MacOS quality has been super solid for me this past year. Anyone else feel this way? Judging by the nodding comments maybe I’m just the outlier?
I've been using OSX / macOS since 2002. I've not really had many issues if any that I can remember or found noticeable (or knew they were attributable to the OS). I can't really use Windows or Linux because I've been quite accustomed to the incredibly useful accessibility tools that come with OSX / macOS which are first class, and probably worth a whole lot more than I paid for the hardware.
Wow, that 2013 WWDC video is so incredibly impactful. I had no idea I was going to experience what I did when I hit play. It resonates with me so strongly, I honestly wasn't ready for it.
You're right. When I first watched it, I was under no doubt they lived and breathed that philosophy. It matched my perception of their output 100%. Watching it again now, I'm reminded of how I used to feel and how much things have changed.
Yes. I remember it strongly hitting me back then, but rewatching added even more punch. I still agree with the philosophy, but this time I was also wistful for when I could say Apple agreed with it, too.
I have two Apple TV 4Ks and both started dropping Bluetooth connections every minute or so after a recent upgrade.
My headphones will cut out and when I go to pause the video I’ll be clicking frantically because the remote isn’t working either. Or I’ll be in the menu and the remote will pin to the left or right and scroll to the end of some massive YouTube list.
Reboots, resets, nothing fixes it.
My Apple Watch regularly has a glitched Home Screen too.
I defended Apple’s quality recently, right before everything started breaking for me.
Oh! And the web version of the Podcast app hasn't been display or toggling "Saved" state correctly for awhile now.
It's wild how these kinds of glitches used to be outliers, and now they're starting to feel... normal?
I thought it was because of my MacBook pro still being an Intel One and thought that nobody at Apple cares about that anymore. But it seems also the M family suffers from it.
Mine doesn't really sleep. It's always warmish despite all my best efforts to make it actually sleep. It's always plugged in, so no biggie, but it's annoying as hell.
Reddit wisdom says it's because of my usb peripherals, but it's just a webcam, mouse, keyboard, and a yubikey.
Open Activity Monitor, go to the Energy tab and look for any apps that are marked as preventing sleep.
No apps preventing sleep (I checked that already many many times), that's why I believe it's OS related in some way.
Absolutely agree. This week, I can't get Chrome to connect to local servers.
ERR_ADDRESS_UNREACHABLE it says.
Yes, I said Yes to the new permission. Yes the check mark is on in Privacy, I mean all 20 of them that say "Google Chrome". Yes I toggled it off and on. Yes I rebooted. Still have to use a different browser to access my own local server because there is a new privacy feature that... doesn't work.
> Yes, they work and are still smoother and less glitchy than Windows 11, but they feel like software developed by people who don’t actually use that software
I would have to agree here (and Apple also don't seem to assess feedback for their GUI changes), but unfortunately this thread is already on a software quality meta tangent rather than listing individual annoyances so here's my short list in the hope actual bugs can be discussed:
- window focus management broken: when you minimize or close a window, another random window of that app you're closing the window of is put into front even when that window is minimized; or other completely unrelated apps get focus
- index/Spotlight not showing file locations (full paths) after searching; the fsck?
- gestures being introduced that do stuff that you hit inadvertently and leave you in a state where you don't know how to undo its effects such as the super-annoying "fullscreen" mode when dragging windows around or pressing Command-F since Sequoia. Requires you to fscking research how to leave fullscreen mode (while not as cringe as Windows help "communities", the level of talking past another is getting there, options being discussed that don't exist in Sequoia's Dock/Desktop settings)
- update or feature nagging (I don't care I could use my iPhone as a webcam right now, go away)
- sometimes difficult to find mouse pointer on large screens
- older problem but I know at least one person on the verge of leaving Mac OS because of it after 20+ years of loyalty (or outright fanboyism tbh): in a German locale, you can't switch off PC gender-neutral language which is not only pushy and annoying but also space-inefficient as fsck
For the mouse pointer thing: you can shake your pointer left and right rapidly, and it will increase in size. It's quite handy to find the pointer.
If this is the thread where we're adding gripes:
- How tf does it take upwards of 5 seconds to take a screenshot with modern hardware on a fully updated OS. How.
- And why do screen recordings sometimes randomly disappear into the bowels of the OS, where even Support struggle to find them?
> window focus management broken
Although I do not mind the way that window management works on macos, recently I had a mildly infuriating situation. I was doing Cmd+Z to undo something, not sure which app and it didn't work so I pressed it a couple of times instinctively. But although my target app was visible and on top, it was Finder that was actually in focus - accidentally I triggered undo in Finder. I think I managed to undelete a file and something else, but I'm not sure. Not sure if there is a way to find a log of actions. That's something I would love to see in all desktop systems, a history of user actions. Also having undo/redo shortcuts in Finder is potentially destructive, what if I move some files from an SD card, reformat it in camera, and then accidentally hit undo in Finder?
> moving the Mac to a new processor architecture (for the second of three times)
Four times kinda — maybe five if you want to count PPC32 and PPC64 separately but I usually don't since the Intel transition happened so soon afterward that there is really no PPC64 lineage to speak of.
I definitely count 32-bit and 64-bit Intel separately though due to the number of years taken to transition, all of the annoying early-Intel-Mac 32-bit EFI issues, and the need to manually opt in to the 64-bit kernel on many machines. In fact Snow Leopard was the first OS to let you do so! The “no new features” tagline was snappy but it's really not true at all :p
https://apple.stackexchange.com/questions/261749/in-which-ve... sez —
“Mac OS X Snow Leopard and above could only be installed on Macs with Intel processors, and introduced a number of fully 64-bit Cocoa applications (e.g. QuickTime X), and most applications were recreated to use the 64-bit x86-64 architecture (although iTunes was a notable exception to this!) This meant these applications could run in 32-bit mode on machines with 32-bit processors, and in 64-bit mode on machines with 64-bit processors. And yes, the kernel was updated so that it could run in 64-bit mode on some limited hardware, and only by default on Mac Pros, while other newer Macs were capable of running the kernel bit did not do so by default.”
Relevant articles:
- “Mac OS X v10.6: Macs that use the 64-bit kernel” https://web.archive.org/web/20121024223751/https://support.a...
- “OS X: Starting up with the 32-bit or 64-bit kernel” https://web.archive.org/web/20121024194635/http://support.ap...
- https://macperformanceguide.com/SnowLeopard-64bit.html
The surprising thing to me, is that I have been a Mac user since forever, I think Leopard was my first OS. Things have barely changed since then, there have been some subtle redesigns since then, but the desktop has remained largely static.
I don't understand why macOS has so many issues. I still encounter memory leaks and have to kill Finder or Dock every few days lest it eat all my memory.
No. But yes: Get rid of Siri, Make Preference pane alphabetically ordered, Get rid of Spotlight and buy Alfred, Disable notifications by default.
If anything, please let Apple buy Raycast instead of Alfred.
Am I the only one who is perfectly fine with the current macOs/IOS landscape? I encounter no bugs on daily basis, if at all.
No you're not. This thread feels like a nostalgic crowd who idealizes the past and falls into the “it was better in the old days” trap, letting go of the myriad of things that improved since then.
Look snow leaopard actually added the hugest feature ever, grand central dispatch. That’s what billy always dreamed about (concurrent windows) and that is what rust craze is about, adding the same to Linux/windows. Like Apple said, snow leopard was under the hood changes. So don’t you worry you will get your snow sequoia, the ai reorganisation is exactly that.
How does Apple Music not have an equivalent to Spotify Connect?! Renders Apple Music unusable, and no, we're not talking about Airplay, and no, we're not downloading iTunes Remote (can't believe it still even has 'iTunes' in its name!).
> I could walk item by item through System Settings and point out many equally inexplicable decisions. Did anyone at Apple really believe a Mac user’s life would be better if common features were buried deep in menus?
I have to agree with this, System Settings seems very inconsistent (design) and has terrible information architecture / organization.
Use it every day and still no idea where anything really is in there. What a shitshow.
Speaking of, I just bought a brand new M4 Air. The thing is amazing, except I swear that Command-Tab does not work consistently sometimes, it just does nothing and I have to press it again. It's baffling, has anyone had this before? Never had this issue on any computer in the past 20 years, it's strange.
I noticed that too, seems that since Sequoia a slight delay was introduced, so you need to wait a bit after pressing alt before pressing tab.
No idea if it’s related to the new double tapping of alt to open the siri text input.
Been looking for a windows replacement and probably will just stick to some linux distro. I had hoped Apple was better…
Linux Mint is quite good, I've got it on a few machines. So far it's the most stable, easily-updatable and "gets out of the way" Linux distro I've used in years.
Another example was High Sierra. They completely swapped out the file system on that release, focusing primarily on under-the-hood changes, and imo was also one of the most stable macOS releases to date.
I was thinking of that the other day. I think I stayed on High Sierra as long as I could.
It should be noted that Snow Leopard was pretty buggy until several versions in.
Our memory is a lot rosier than the reality.
Everyone keeps making this comment but the article’s about the idea of a maintenance release more than about Snow Leopard. It was a good idea that’s stuck around in the dev community, something we’ve been talking about forever, so it’s basically a meme at this point to say “Snow Leopard release”
I understand, but my point is that it took a lot of time for Snow Leopard to reach a level where it lived up to that.
Starting to really look like building different enough to be incompatible OSes for each of your products was a bad idea.
Do we really need MacOS, iOS, iPadOS, WatchOS, TVOS, VisionOS, does maintaining all this make the product better kind term.
Open Launchpad and then try to click the spotlight icon in the menubar after. Even after returning to the desktop the system won’t allow me to click its own menubar. Reproducible on all machines, drives me insane.
I don’t use pretty much any of the features he listed.
But Sequoia has made some M1 Pros run poorly in my environment. It’s unacceptable the amount of resources it takes to do basic stuff that we got right of 30 years ago.
Snow Leopard was released while Steve Jobs was on medical leave. It was driven (as far as I can recall) by Bertrand Serlet. Rumour has it that Steve was furious about the "no new features" marketing when he returned from his medical leave.
Is Bertrand still kickin? Can Apple poach him for a few years to clean house? I miss the days when he was running the software division at Apple.
This might be the main issue with software at Apple getting so much worse. Bertrand knew what he was doing. Apple's (and NeXT's) OS used to be an OS, not a collection of toy apps.
Sigh. I don't get the sentiment and the whole debate here. The author is clearly nitpicking (he is the first person who uses messages after all). But honestly, complaints about "arrange" screens button?
Nevertheless, he is probably right. Only the people who went through working on Windows, Linux both on cheap and expensive machines while dealing with all the "baggage" these environments bring can tolerate MacOS with leniency. I will never come back to anything else until I see a competitive offer from just anyone because what Apple offers is:
* Fast, silent, extremely energy efficient devices with excellent screens and audio.
* The font rendering. I honestly can't believe people who professionally work with text all their lives never mention it here. MacOS had and continues to have the best fonts and font rendering that is.
* Solid build that lasts (I own MacBook Pro and MS Surface Book 2 both from 2019 so I see how they age).
* A device that is ready to work when you open a lid or touch a keyboard button without any "waking up from sleep/hibernation" or freezing due to buggy video drivers and inability to work with GPU in hybrid mode OUT-OF-THE-BOX in 2025.
The above-mentioned is more than enough for me to tolerate any MacOS issues and the ones mentioned in the article are just laughable.
Apple offers you the full package that allows cross-device integration while Win/Linux users still rely on the Google stack or other third party "workarounds". Yes, no surprises here -- owning the hardware and software stack is a massive advantage.
> Solid build
Yeah, until the flex cable breaks (a 5 USD part) and you're forced to replace the entire screen (for 1000 USD).
I'm 100% supportive of Framework laptops, especially if it can be an open standard with aftermarket parts.
Apple products have good quality, but I'd prefer to upgrade just the CPU and keep my old display. Hopefully Framework will work towards that.
The Apple model is wasteful and profit-engineered.
> * The font rendering. I honestly can't believe people who professionally work with text all their lives never mention it here. MacOS had and continues to have the best fonts and font rendering that is.
Linux has significantly better font rendering than macOS these days if you're on a 140 or less PPI screen. Linux still does subpixel AA and text looks razor sharp, while Apple pretends very large monitors like my 140ppi 57" don't exist.
Power Mac G5 systems sold in 2006 were abandoned by Snow Leopard in 2009.
Apple could conceivably abandon intel Mac Pro systems sold in 2023 by releasing an Apple Silicon-only macOS in 2026, but three years still seems a bit aggressive.
Great post.
BTW, there is an (earlier) example of Snow Leopard in the Microsoft ecosystem -- that would be Windows XP, which similarly avoided major new subsystems and new applications built-into-the-OS, but was remarkably fast and stable for its time.
XP was a massive change in that it was the merger of the Windows NT and Windows 9x lines.
It was perceived as bloated because it struggled on the hardware of the time.
Then it needed a near total rewrite with SP2 because it was riddled with security issues.
I think you mean XP SP2?
I never tried the pre-SP XP, but even SP1 wasn't too terrible compared to what it was competing against (there was a lot of perfectly-usable 95/98/ME still around with their lack of privilege separation, and 2K was mostly better only if comparing at the same amount of RAM but XP machines were newer and often had more).
For me, service packs were largely a question of "do we want to tie up the phone line for however many hours just because Microsoft wants to rearrange a UI layout?"
I guess it wasn't very obvious, but SP2 had a ton of security enhancements.
Apple needs to restore primacy to the UI. MacOS and iOS used to feel non-blocking with a UI that would always respond regardless of how long a remote or long-running background task required.
Now iOS and MacOS feel sluggish and slothlike, waiting on IO, typically from a remote call. The webdevs have taken over.
Yes they need to remove cruft, and also re-hire the ruthless UI Nazis who would enforce 120hz responsiveness at all cost.
Couldn't agree more. Moving a file from one folder to another has a huge delay. Dragging files for spring loaded folders doesn't work well anymore.
As a user since System 7 it's so sad to see.
i respect a fellow UI autist
I've said this many times, snow leopard is still my favourite OS today. If you could add iMessages to it, although not necessary, it would be perfect.
Of course today it would be insecure, missing security patches etc. SSL...
Can a publicly traded company be sued if they allocate more resources to QA? Could an activist investor argue that cleaning up Mac OSX is a waste of time because people will buy the computer anyway?
Bug fixes and stability don't sell. That's why what we will receive this year is iOS redesign instead of bugfixes.
Ahh Apple Vision Pro.
I entirely forgot it existed! They still sell that?
With appointment .
Not sure.
They apparently do sell it in my country with a price of a good used car. Nope.
I long for a modern NeXTStep-like OS. A polished, consistent, solid operating system that is lean, clean and simply focused on getting things done. It should be predictable in every way and never get in your way. None of this SwiftUI bullshit, Animoji, AI or blurry UI. sigh
Curious what's bullshit about SwiftUI?
From a features perspective, they should acqui-hire either Alfred or Raycast and build that functionality into spotlight.
What's frustrating is that Apple still has the resources and talent to ship rock-solid software
How much better would life be if KDE ran on Mac?
Maybe they just need to stop doing a major OS release every year.
Apple needs to admit, a product is either new or improved. It is never new and improved.
I'm enjoying Sorbet Leopard on my 20 year old Dual Core PowerPC tower. Mostly just messing around with old versions of Max making weird sounds.. but when I do interact with the OS it feels great and responsive and a joy to use. Modern MacOS can feel that way if you turn off a lot of crap. I don't even sync my accounts to the OS anymore.
The first thing I do when I reinstall macOS is to disable most of the ”features”, services, and apps Apple added over the last decade. I can’t imagine how cluttered my digital life would be if I’d depend on all those useless toys Apple stuffed into the OS and abandons a few years later (looking at you, Dashboard).
My initial wish for Apple was to make macOS as bulletproof, lightweight, and bug-free as possible. But now I just want to use Linux on my M1 MacBook because of all the bullshit that’s going on in the US right now. It’s only a matter of time until the Trump administration will start to dismantle the American technology sector, beginning with the softening of encryption and the death of Advanced Data Protection I currently rely on on iCloud. Mark my words.
Like I’ve said in a couple of comments before in other threads, I’d love to switch to Asahi but without native disk encryption I just can’t. If my laptop gets stolen, all my files would be visible to the thief, and that’s a risk I’m not willing to make.
Apple should consider a LTS MacOS version, like RHEL.
The myth of Snow Leopard is strong (while in reality a lot of fundamental things people still complaint about weren't fixed), so Apple can just as well do nothing better and hope a new myth will emerge sticking to some other current name...
what are some good alternatives to mac os? there some features like image/text copy-paste being cross device that are insanely useful that make it hard to switch
>what are some good alternatives to mac os?
There are three main choices and they are all compromised in their own way. You just need to figure out what is important to you and what isn't.
What you shouldn't do is take too much notice of posts like these, I've read through the whole thing and haven't had any of the issues mentioned. I've also not seen a mention of the issues I do have. HN has a negative tone, it seems we like to whinge.
Apple is becoming like Google, everything is slightly broken and nobody cares, because fixing stuff doesn't get you promoted...
I think we're at the point where it'd be better if OS's were just a thin platform, and the updates to user facing features came piecemeal to different apps instead. IE, update Finder or Safari but leave the core functionality alone outside of bug fixes or very rare major upgrades. I'm so sick of having to update my OS every year.
Apple is annoyingly doing an AI sequoia.
They neither have the financial and time capacity left required for high quality, nor do they have the engineers and management to enable it anymore.
„Just get it out somehow.“
„Fixing bugs is not a KPI for our promotions and salary increases.“
Old stuff is practically abandoned. No one knows how to fix it anymore and it’s replaced instead, at best. Disdain for legacy. The only thing management gets excited about is the next shiny thing, currently tacking AI onto everything.
Can you name big companies where this did not happen?
iPad OS is one of the worst operating systems in widespread use.
I clicked this because I was confused why someone felt so strongly about Apple needing a winterized SUV.
No AI please for the love of the spaghetti monster. I'm so sick of having this shit shoveled into anything I'm trying to do these days. Disabling Siri all these years was bad enough.
So far Apple has kept it as a toggle in the settings, but it's easy bloat for it to keep spreading. Does anyone need AI in a text editor? No.
While I appreciate the sentiment, I think the single best use case for LLMs today is drafting text, so a text editor sounds like home for an AI assistant.
Where I fall on this is. "What is the tool for?" I still default to nano in the terminal for basic editing. Him and eMacs are entire ecosystems when all I need is a chisel.
In the general sense, notepad and TextEdit should just be less nerdy nano's. They always have been and that's what they were meant for.
If you need something to write reports, a book, etc then you use MS Docs, Google Docs, or whatever Apple provides. Those are the tools where adding AI might be useful as a feature, like the ribbon in Office.
Let purpose made tools just be that.
Yawn, there is some variant of this story after every os release.
The articles specific gripes with macOS are Mail, Messages, and System settings. Fixing those does not require a ‘no new features’ (which was always BS) major release.
I would love to hope that articles like this one could move the needle at Apple, but I'm not holding my breath.
I've been saying this for YEARS. In fact, I just published a blog post saying this very same thing SMH. https://blog.webb.page/2025-03-27-spitball-with-claude.txt
Ok, but can we please call it "macOS Permafrost"?
I am totally invested in the Apple ecosystem, which on principle I'm against (closed systems never sat right with me), but at the time (beginning ~2015) the products and services were so well integrated and genuinely improved my life it was hard to see how things could ever get this bad. I'd still never (ever) go back to Windows, and Linux doesn't have the same feel or ease of setup as macOS, but I am genuinely, deeply concerned about this trajectory for Apple. Albeit super opinionated, but I feel that macOS was the saviour of modern aesthetic computing especially when Windows started its rapid decline post 7. I’m fine trading some frustration—like extra steps for untrusted software—if it keeps macOS secure and fast, free of Windows-level adware or telemetry. But right now, macOS has never been in a worse state.
I recently emailed Tim expressing the same concerns as the article and regarding specific issues with Messages and Mail resource usage and was surprised to get a response from Craig requesting more information and sysdiagnose files, but this is where feedback ended unfortunately.
The current state of the macOS UI is atrocious, devices don't all need the same button shape or menu UX flow across all devices as they are inherently interacted with differently. A Mac isn’t an iPad — why force the same rounded buttons and simplified menus on both? They’re interacted with differently: keyboard and mouse versus touch. I have no idea why this is so difficult for execs to understand or important for them to change. Software teams at Apple are so lucky to have the Apple Silicon innovation on the hardware side, Intel Macs would catch fire on boot-up running any of the latest releases given how atrocious the resource usage is.
While I'm here whinging, the iOS swipe keyboard is garbage (almost totally unusable now) where before it was perfect with the innovative predictive hit-box expansion pioneered by Ken Kocienda. I think that's now been replaced with AI prediction which in 2025 I don't understand why it can be so embarrassingly bad. I had to upgrade to the iPhone Max recently to hit the letters properly. Also Apple I never want to tell someone to "duck off".
Initially I was understanding, but quite frankly now I'm just pissed that it has gotten to this stage, and there is no indication of resolution from execs about these issues.
I’m starting to worry that Apple could go off the deep-end - the way of Microsoft - coasting on hardware sales while letting software quality slide (albeit seeming intentional from Microsoft's side of the fence). I get it — software isn’t where the money is, hardware drives the business - but the two are inseparable BY DESIGN. When macOS struggles with basic functionality, it undermines the value of the Mac itself.
Author calls out some truly irritating defects, and Messages is rife with them. But there are bigger ones in that application on both Mac OS and iOS.
Topping the shitlist has to be the inexplicable splitting of group threads for random people in the group, even when everyone is using an iPhone. Suddenly someone in the group gets the messages by him or herself and can't reply to the group. And this occasionally also happens in one-on-one threads: I've had years-old (maybe decades-old) threads suddenly split off into a new one with a friend of mine for no apparent reason.
There's some fundamental incompetence in Message's design, and I'm sure that the addition of RCS has made it worse because it was slapped onto a rotten core.
Oh yeah, then there's the way Messages (or, to be fair, iOS) loses all of your contacts' names if you travel outside the country. This is another brain-dead defect: Just because you're in a new country code, your iPhone suddenly can't associate U.S. numbers with your contacts. How the hell does this go unfixed for one major iOS revision, let alone 15+ years?
Oh yeah, then there's the way Calendar "helpfully" changes the times on your appointments when you travel... meaning that you'll miss all of them if you travel east, because your phone will move them hours later. I mean... who lives like that? I you're going to London on business and the next day you have a meeting at 10 a.m., your iPhone will "helpfully" change that meeting to, say, 5 p.m.
So when the author muses about whether Apple developers ever actually use this stuff in the real world, the only logical answer is no. Or they just take so little interest in the functional quality of their product that they just check in some grossly defective trash and call it a day... and refuse to fix it year after year.
Or... they're not given time and resources to fix it. I'm pretty gentle when filing bugs about Xcode, because I'm sure they are understaffed. But at this point, the neglect has (or should have) exhausted every developer's patience.
Which brings us to a bit of hypocrisy in the post: "Apple is clearly behind on the AI arms race"
NO. Apple's sad capitulation to armchair "industry observers" and "analysts" has contributed greatly to the very defects the guy complains about. Apple should not have jumped on the "AI" hype in the first place. It does not serve Apple's product line or market. They are not a search company or gatekeeper to huge swaths of the Internet. If they wanted to quietly improve Siri and get it RIGHT, fine. But now they're embarrassed, and resources that should have been spent on QA have been squandered on bullshit "AI" that failed.
Snow leopard was, as you said, necessary in anticipation of the architecture change.
Now there's no such change, but instead AI, this weird new cross-cutting but fuzzy function touching everything that no one has ever used reliably at the scale of Apple devices. AI is impossible to reliably test, and all-too-easy to get embarrassing results. I'm glad Apple recently tamped expectations.
The relatively loose concurrency model in Apple's ARM has made it rival the network in introducing new failure modes Many quality issues cited have their root causes in those two sources of indeterminacy.
Amplifying these are the organizational boundaries driving software flaws. Siri as a separate organization with its own network-dependent stack is just not viable for scattering AI. Boosting revenue with iCloud services makes all roads run through the servers in Rome, amplifying network and backend reliability issues. I also suspect outsourcing quality and the maintenance of legacy software has reduced the internal quality signal and cemented technical boundaries, as the delegates protect their work streams and play quality theater. The yearly on-schedule cadence makes things worse because they can always play for time and wait for the next train.
And frankly (to borrow a concept from Java land), Apple might be reaching peak complexity. With hundreds of apps sporting tens of settings, there is simply no way to have a fast-path to the few things different people need. Deep linking is a workaround, but it's up to the app or user to figure that out. (And it makes me livid: I can't count how many important calls I've missed by failing to turn off "Silence unknown callers", with the Phone app settings buried 3 layers deep ON MY PHONE)
A short-term solution I think is not a rewrite but concierge UI setup: come to the store, tell the "geniuses" exactly what you need, and make shortcuts + myUI or whatever is necessary to enable them to make it happen. Then automate that process with AI.
That's something they can deliver continuously. Their geniuses can drive feature-development, and it can be rolled out to stores weekly and -- heavens! -- rolled back just as quickly. Customers and employees get the excitement of seeing their feature in action.
The model of sensitive form-factor designers working in quiet respectful collaboration to produce new curves every year is just wrong for today's needs. All those people standing around at Apple stores should instead be spending an hour or more with each existing customer designing new features, and they should be rewarded for features that take, and especially for features that AI can incorporate.
On the development side, any one should be able to contribute to any new feature, and be rewarded for it. At least for this work, there would be no more silos, and no massive work streams creating moral hazards.
The goal is to make software and a software development process that scales and adapts. It may start at 5% of new UI features, but I hope it infects and challenges the entire organization and stack.
Granted, it will take a famously hub organization and turn it into a web of hubs, but that in itself may be necessary for Apple to build the next generation of managers.
Look for how today's challenges can help you build tomorrow's organizations.
[dead]
[dead]
[dead]
[dead]
[dead]
[dead]
[flagged]
a) might just be a quick cartoon
b) it might well be that the author really wishes they could draw a cartoon but can't, and resorted to AI to convey their aesthetic choices
I think your pessimism is unwarranted--AI illustrations do have their use.
Apple has gone from Company I loved to the one I hate! They are the new Microsoft! They have hired a bunch of idiots in their security team who are driving their user base insane! They can completely lock you out of all your devices with no recourse! I am starting to move away from this pathetic company”s products!