What OS for my New Desktop?

[Written July 12, 2023; updated August 4, 2023]

My current PC was built in 2012, and will turn 11 years old next month.  This is by far my best run of hardware; it’s still about 50% original parts.  Most of the core is untouched; it has the original motherboard, CPU, RAM, CD burner, and case.  The HDD was upgraded to SSD (more below), I added more RAM in the second channel, and I put in front USB 3.0 ports in the 3.5” bay.

I would like to get another 10 years out of my next setup, whether it’s a pre-built replacement (like the 2012 system) or a custom parts-built system.  There are two simple, intertwined reasons for this: anti-consumerism and environmentalism.  Every time I click “Buy,” I’m asking the Destroy Earth machine to extract resources, produce a new thing, and process the e-waste of the old thing (and even “recycling” extracts more resources to do that; there is a reason “reduce, reuse, recycle” puts it last on the list).  Every time I click “Buy,” I’m bound to my job for the requisite number of hours to pay for it.  In some sense, the more often I buy, the less freedom/runway I have in case of emergency.

The nightmare scenario is that desktops become like phones or fast fashion, where people think they’re “old” after a couple of years, and waste even more.

But before we consider what it would take to get 10+ years from a system, let’s look at where I came from.

The Dark Past

Previously, I had a machine that went through so many transformations, it was a Ship of Theseus🌎.  It started as a 600 MHz AMD Duron in October 2000.  By the time I decided on a full replacement, just in case electrostatic damage was affecting it, the only original part was the floppy drive.  CD burners got faster, hard disks got bigger, RAM clock speeds kept climbing, and SATA arrived.

But what really pained me was thinking of all the upgrades I had done, and how rarely the performance actually felt improved.  Even if I could measure improvements in compile times on Gentoo, the only upgrade with real impact at the user level was the Barton-core Athlon XP, which made the later shift to dual-core AMD64 underwhelming.  I totaled all the parts that went into it; I had spent at least $3,000 between the original system (thanks, Dad) and my “upgrades”.

And replacements.  Near the end of its life, it wouldn’t power on.  The final incarnation was based on a Phenom II X4 (Propus), just to keep it alive until Ivy Bridge.

The Ivy Bridge Era

I had eagerly waited for Ivy Bridge (for its power-efficiency improvements over Sandy Bridge) before ordering a midrange CyberPowerPC desktop from NewEgg, then I made it quieter: Noctua fans, Nexus hard-disk isolation mounts, a Zalman CPU cooler, and the Seasonic PSU transplanted from the old system.

(The PSU may have been a mistake.  Three video cards from two vendors have died in this system; it runs on integrated graphics now.  Maybe the cards were just fragile, or maybe 330W isn’t enough to feed everything with a 75W card plugged into the PCIe slot.)

Software-wise, I wanted to do a lot of ROM hacking, and many of the tools for that only ran on Windows.  I thought I might need After Effects to post-process gameplay videos.  If I wanted Linux, there was VirtualBox, so Windows 7 was the obvious choice of OS.

Windows 10 started doing so much “compatibility telemetry” that it made the system unusable for a full 5 minutes, or maybe 10, after logging in.  I finally caved three years later, and got the SSD just to paper over the fault in Windows.  What is it even doing, anyway?

USB 3.0 (aka USB 3.1 Gen 1, aka USB 3.2 Gen 1) went from novelty to common enough that I wanted front ports, so I filled the 3.5” bay with them.  I could have put in a card reader instead, but those have a tendency to break over time (not only in my own PC), so I went with an external reader and all-purpose ports.

Hardware pricing dropped a lot between 2000 and 2012.  I paid much less for the system, and I put fewer upgrades and replacements into it.  The additions total just $437, so although I don’t have the original invoice or the pricing for the video cards, I think the all-in price is around $1,400.

Next…?

Windows 10 has been rather obnoxious with advertising.  “Finish setting up your PC”?  I already did!  Please stop asking me to switch to your inferior alternatives; I like Firefox, DuckDuckGo, and LibreOffice just fine, thanks.

Windows 11 raised the hardware requirements to “CPUs born yesterday.” Supposedly, it was impossible for Microsoft to support chips that did not have that level of features… unless they happened to be in the Surface devices that Microsoft was selling.  So, not impossible at all; Microsoft just wanted to transparently lie to us.

The deeper problem is that I don’t know if a CPU purchased today will be supported in 10 years by “whatever the next Windows is.”  Before Windows 11, the hardware support at release was sufficiently generous that I would never have worried about it.  (Although the first clouds appeared with the Atom systems that were dropped from support on Windows 10; at that point, keeping them on 8.1 would have given a longer supported life than upgrading!)

What about Linux?

I ran Linux (and FreeBSD) as my primary desktop on the Ship of Theseus, from Red Hat 7.0 through FreeBSD 4 and Gentoo to Ubuntu (2007–2012).  I’ve continued to use Linux on the server for work and home (this site!) since 2012.  So I have a lot of perspective when I say…

Open Source is always almost there, and when something finally gets working pretty well, someone comes along and churns everything.

Your video driver will randomly be dropped.  The “old” one won’t support newer cards, the “new” one won’t support older cards, you might end up with a gap in the middle where your card has no support at all, and the “new” one won’t even support new cards well for a few years.  (Some of this happens because of legitimate architecture shifts in cards, to be sure, like the rise of AtomBIOS.  But with Windows, every card has a supported driver from Day 1.)

Your sound system will be replaced without warning.  With experience from sound daemons and OSS, they will build ALSA, and it will support your card, but you won’t know it because you didn’t unmute the channel.  Ha ha!  ALSA isn’t sufficiently low-latency, so they’ll write JACK, because adding a component is obviously the way to make an existing component faster?  Then PulseAudio.  Then PipeWire.

Your filesystem is bad, and you should feel bad.  Btrfs is the future!  It’s production ready!  Its website says it’s experimental and never to use it in production!  This will go on for years.  Eventually, the website won’t say it’s experimental anymore, long after the ceaseless chase for shiny things has worn you out.

Your graphics stack is hopelessly outdated!  This other one is the future.  And it’s definitely better than that other other one.  We forgot about accessibility, oops, it’ll take us 7+ more years to finish that.  At least it’s secure.  But your graphics stack is still outdated!  Just switch today, when things barely work!

Gnome will choose bad defaults, people will file an issue, and the maintainer will respond, “But I don’t see why this thing I don’t use should be improved?”  The issue will then languish as Open in the tracker for years.

Apple?

On the surface, this would seem like a good choice.  Unless there’s an architecture transition, the OS is supported for a good number of years, then semi-supported; even after that point, OpenCore Legacy Patcher (or an equivalent) could keep it going, and failing that, it’s likely that Linux could run on the hardware.

The interface itself isn’t too awful to use (given a full decade, and enough copying by Linux, the old dog can learn to at least live with the Cmd+Tab/Cmd+Grave split).  Some parts of it, like pinning apps to spaces, are unparalleled.  Keychain is notably more secure than the Windows or Linux equivalents🌎, because it tracks access per-app, rather than globally.  Only macOS is interested in stopping malware from stealing the Web browser’s secrets.

On the other hand, the removal of various support stings.  I used to connect a second monitor (a TV) over HDMI, until Apple decided to always use overscan with it, removing the option to have it display correctly.  That hid the menu bar and made zoomed/fullscreen windows much less usable.  I had to buy a Mini-DP to VGA converter to fix it, making the original cable useless for no reason after just two years.  And I put in a lot of effort to downgrade macOS🌎 to try Target Display Mode (it didn’t work; wrong cable) because that feature had been removed in later versions.

There is also the problem of the 30% cut.  I am still deeply offended by Apple arguing that an app with a “Buy” button means the publisher owes Apple a share of any purchase made there.  Even though Apple may have nothing to do with the relationship, hosting the store, processing the payment, or fulfilling the order… and they even get paid for the app to be written!  It’s like leasing a space in the mall, but the cost is 30% of your revenue, and instead of using a pre-made space, you had to add your own shop onto the mall.  It’s completely absurd.

Logically, I understand that our post-1980 corporate philosophy means companies must strive for power and rent-seeking.  If they don’t, another one will.  But fundamentally, it’s still brazen thievery.

So really, what’s next?

Since I don’t have a clear picture yet, it’s time to rationalize!

Most security?  There are three tiers: macOS, OpenBSD, and everything else.  I don’t know if OpenBSD can fix all weaknesses single-handedly, but at least they are trying on some level, like introducing pledge and getentropy.
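
To illustrate what those give you (this is my own minimal sketch, not anything from the post): getentropy fills a buffer with kernel randomness without opening a device file, and pledge lets a program promise up front which classes of system calls it will ever make, so an exploited process has far less to work with.

    /* Minimal OpenBSD sketch: gather randomness, then drop to "stdio only". */
    #include <err.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        unsigned char buf[16];

        /* Kernel-provided randomness; no /dev/urandom descriptor needed. */
        if (getentropy(buf, sizeof(buf)) == -1)
            err(1, "getentropy");

        /* From here on, only basic stdio is allowed; opening files,
         * exec'ing programs, or touching the network would kill the process. */
        if (pledge("stdio", NULL) == -1)
            err(1, "pledge");

        printf("first random byte: %d\n", buf[0]);
        return 0;
    }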

Most longevity?  Linux and macOS are the top contenders.  With Windows, it’s a toss-up: if the next version keeps the same CPU requirements (even if it takes out the Surface exceptions), it’ll be fine, but if not?  This angle seems mostly a gamble either way: any hardware I buy will likely be able to run its “standard” OS, and then Linux by the time that support window has ended.  I don’t know if AMD64 is going anywhere, but I also don’t know what ARM’s future prospects as a desktop platform are.

Most tolerable?  Linux is free of the greed driving poor decisions at Microsoft and Apple… but personally, I find Apple’s UI the best.  Unlike when the first M1 launched, there are clear virtualization options🌎 now, which means “a different CPU architecture” is probably not a critical problem.

Smartphone integration?  It’s Apple’s explicit strategy to maximize feature integration between iOS and macOS… which would be irrelevant if they didn’t also make the world’s only phones that aren’t subject to third-party meddling.  Letting carriers interfere with Android is a crucial weakness in the platform, however necessary it was to gain acceptance.

Power efficiency?  Apple happily shares a 7W idle figure for the M2/M2 Pro, while Intel and AMD are not forthcoming.  Searching the internet suggests the competition fails to meet (Intel) or to exceed (AMD) that level.  At the top end, the M2 is said to draw 50W max and the M2 Pro 100W.  AMD has the Ryzen 7600/7700/7900 options at 65W TDP / 88W max, and uhh.  Yeah.  Intel has thrown watts at their problem: their “65W” i7-12700 maxes out at 180W and the i7-13700 at 219W (PL2 values).  So this seems to rule out Intel, while not offering a definitive AMD-vs-M2 answer.
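
Idle power is what matters most over a 10-year horizon, since a desktop spends most of its life near idle.  As a rough back-of-the-envelope illustration (my own sketch; the 7W figure is Apple’s published number, while the x86 idle figures and the usage pattern are assumptions, not measurements):

    /* Rough lifetime idle-energy comparison under assumed usage. */
    #include <stdio.h>

    int main(void) {
        const double hours_per_day = 8.0;   /* assumed time at or near idle */
        const double years = 10.0;

        /* Idle draw in watts: Apple's published 7W vs. two hypothetical
         * x86 desktops; the latter numbers are illustrative guesses. */
        const double watts[] = { 7.0, 20.0, 40.0 };
        const char  *label[] = { "M2 (published)",
                                 "x86 desktop, 20W idle (assumed)",
                                 "x86 desktop, 40W idle (assumed)" };

        for (int i = 0; i < 3; i++) {
            double kwh = watts[i] * hours_per_day * 365.0 * years / 1000.0;
            printf("%-34s %6.0f kWh over %.0f years\n", label[i], kwh, years);
        }
        return 0;
    }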

In conclusion, Apple hardware seems to be at the top overall.  This is not what I had hoped for.

Revision History