
A Lot of Statements about Software Speed and Quality are Just Vibes

written by alys on

It's common to hear that software today is in a dire state. I'm not going to disagree—when I came back to my computer to add to this post, Firefox had stopped accepting keyboard input, with keystrokes appearing on screen sometimes 30 seconds or more later and sometimes not at all. However, I do think it's hard to nail down whether it's gotten worse.

One semi-famous rant in this space is Casey Muratori's complaint about Visual Studio slowing down over time, particularly in its watch window, which apparently no longer updates fast enough to let you rapidly step through code. (You can see the actual demo here.) While I don't doubt that it used to be faster, I'm not sure how representative Visual Studio is.

Often the most rigorous measurements, e.g., Nicholas Nethercote's "How to Speed Up the Rust Compiler" posts (here's the latest), show things getting faster. Of course, this is partly because the people paying the closest attention to performance are often the ones optimizing it, and partly because if your optimization efforts fail, you're probably not going to publish your measurements. No company or open source maintainer or solo dev or hobbyist is going to publish "My Software Got Really Slow between 2020 and 2024 Because I Was Asleep at the Switch." Still, it is kind of funny how divergent vibes and data are here.

Some of the most rigorous and broad measurements showing things have gotten slower that I'm aware of are of webpage sizes. Obviously size ≠ speed, but the size increase is drastic enough to offset a lot of improvements in other areas. The HTTP Archive showed a nearly 600 percent increase in size from June 2012 to June 2022 for mobile pages and a 221 percent increase for desktop pages. Their 2024 report shows the average weight of pages continuing to increase. This is genuinely a problem!
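As an aside, "percent increase" figures like these are easy to misread: a 600 percent *increase* means the new value is 7× the old one, not 6×. A minimal sketch of the arithmetic, using made-up page weights for illustration (not actual HTTP Archive data):

```python
# Percent increase vs. multiplier: a "600 percent increase" means the
# new value is 7x the old one, not 6x the old one.
# The page weights below are hypothetical, chosen only to illustrate
# the arithmetic -- they are not HTTP Archive figures.

def percent_increase(old, new):
    """Return the percent increase going from old to new."""
    return (new - old) / old * 100

old_kb = 400    # hypothetical 2012 mobile page weight, in KB
new_kb = 2800   # hypothetical 2022 mobile page weight, in KB

print(percent_increase(old_kb, new_kb))  # 600.0 percent increase...
print(new_kb / old_kb)                   # ...which is a 7.0x multiplier
```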

Another set of pretty rigorous measurements is Dan Luu's measurements of latency, such as his 2017 post on "input lag." He's cautiously optimistic:

On the bright side, we’re arguably emerging from the latency dark ages and it’s now possible to assemble a computer or buy a tablet with latency that’s in the same range as you could get off-the-shelf in the 70s and 80s.

So there are some areas with genuine problems, but the overall picture is more mixed. I think one could fairly ask why performance is dipping at all, given the enormously faster computers we have. Even if the end result is that software is, say, only 5 percent slower overall, why do we have to tolerate even that, when hardware is so much faster? Still, "software is much slower than it could be" is quite different from "software is much slower than before."

Quality is even harder to nail down.

Having done some retrocomputing lately and a lot of reading about computers and software of the 80s and 90s, it is a bit funny to think of software from that era as being better than today's. MS-DOS was very prone to crashing, and while Windows 95 and 98 made improvements, they weren't known for their stability either. I know little about Apple computers of that era, but my impression is that the later incarnations of the Classic Mac OS operating system were showing their age and also pretty buggy. Offhand, the only 90s operating system with a reputation for stability that I can think of is Windows 2000. It made it barely under the wire to count as "90s," being released December 15, 1999[1], but NT 4 probably qualifies as well.

Realistically, I'm sure there were other solid OSes then. I haven't heard good things about desktop Linux in the 90s[2], with the exception of Neal Stephenson's "In the Beginning Was...The Command Line," which, via analogy, dubiously claims that Linux computers never crash. While I find that very unlikely, that was when the LAMP stack started to gain popularity, and Apache was popular not long after its 1995 release, so I'd guess Linux servers at least were reasonably stable. Apparently Red Hat had its IPO and reached a market cap of $5 billion in 1999. Presumably commercial Unix OSes and IBM's mainframe offerings were competitive with Windows NT and Linux then, but I don't know much about them.

I don't know a lot about BeOS, OS/2, or NeXTSTEP (the last of which evolved into a very well regarded operating system), so it's possible I'm short-changing those. I've heard good things about BeOS and the Mac OS of the time, but those were for their user friendliness. I'm kinda glossing over usability here, but it is important. I mean, a completely bug-free implementation of ed that runs at lightning speed wouldn't be very useful to most users today. I think you could make a strong argument that flat design and the ability to A/B test and constantly tweak things have been really bad for usability, and that software usability has genuinely gone downhill.

At risk of short-changing some operating systems, it seems that, in the 90s, your choices for stable operating systems were one you had to pay Microsoft for (which wasn't even as good for playing games) or a few obscure OSes you had to pay IBM and a couple of others a lot for, which were mostly meant for servers and mainframes. The other operating systems had their own merits, to be clear, but I'm talking about operating systems with some of the stability features we take for granted nowadays.[3]

Still, maybe software peaked in the 00s instead of the 90s and has since stagnated generally, or has actually gotten significantly worse. The 00s were an era where everyone had the benefit of protected memory, meaning programs couldn't fuck each other up (or fuck up the OS itself) anymore (well, mostly), and other problems, like malware and buggy drivers, had mostly been reined in. And perhaps that's early enough that the pernicious trends that Ruined Everything hadn't really taken hold.

Well, maybe! Like I said, it's tough to nail all this down. Webpages were certainly smaller then, and the trend of putting tons of tracking scripts on pages hadn't really emerged yet. So "webapps were better in the late 00s/early 10s" seems much more grounded and plausible to me than "software in general was better in the late 00s/early 10s."

Mobile apps are kinda interesting in light of this. I'm genuinely pretty content with the quality of most of my phone apps on iOS. I'm not sure they crash less than desktop apps, but they start really quickly and tend to be better at picking up where they left off. Of course, I have the benefit of good 5G coverage where I live, so I'm not as affected by cases where the programmers assumed people would have reliable internet 24/7. And while I don't have the latest Apple phone (and never have had the latest smartphone), I do have a new-ish iPhone (albeit an SE), so I'm definitely not as strapped for processing power as many people.

The other thing to think about is that people have been complaining about this for a long time. Niklaus Wirth complained about this in 1995. I also found people discussing a similar idea in 2009, although note the number of skeptical comments.

While I'm not out to prevent anyone from venting, I do think if we actually want to make things better, we should be more grounded and step away from the nostalgia and hyperbole. There are valuable techniques and tools from today, there are valuable techniques and tools from the past we could resurrect, and there are valuable techniques and tools still to be invented. We should try to draw from all three categories if we want to make software better for our users.


  1. In retrospect, releasing a new operating system 16 days before Y2K is a bold move. Apparently, at one point they were recommending "buy Windows 2000" as part of their Y2K strategy, which to be honest sounds like something Microsoft would do today.

  2. Although I'm sure people had a lot of fun tinkering.

  3. Similarly, you could say that in the early 90s your only choices for multitasking on an IBM compatible were Windows 3.1 and OS/2, or that in the 80s your only choice for a solid GUI was a Mac. Obviously, "supports a GUI" and "can multitask" are completely trivial nowadays: things you expect out of a Raspberry Pi.

