Programming in the 20th Century
Programming in the 21st Century is a blog by "recovering programmer" James Hague covering a variety of topics, including functional programming, functional game programming, and how programming differs in, well, the 21st century. I revisited it lately because I was doing DOScember, and Hague frequently writes about that era, so I thought it would be good to get some perspective before diving into retrocomputing.
One thing I really appreciate is that the author strikes a good balance: he draws on the perspective of someone who was around for the 1980s era of computing, when there were tight limits on computer resources, while still appreciating what's available now and what it lets you accomplish. I have plenty of frustrations with modern software, including its performance, but I'm nonetheless frustrated by simplistic takes dunking on, say, using a garbage-collected language[1] or "today's programmers." Whinging is understandable, but these specific tropes are counterproductive.
Worse, some of these discussions devolve into fetishizing efficiency or low-level programming. There's nothing wrong with an intense focus on optimization in contexts where that's necessary or even enjoying it as a challenge or learning exercise—what I object to is acting like an obsession with performance or low-level programming is necessary to be a good programmer or necessary for good programs.
I've actually been playing around with Turbo Pascal specifically, so posts like Things Turbo Pascal is Smaller Than hit home. Reading the comments on lobste.rs, it seems like people interpreted the post as criticizing the items on the list. Frankly, I don't read it that way. I see it more as a statement of what could be done in that amount of space and the constraints that made it necessary. In that light, I definitely appreciate what Turbo Pascal, a ~40 KB binary, can do. But the idea that larger programs are some hugely superfluous waste of space misses the overall point.
In actually using Turbo Pascal, I also notice what it can't do and the tradeoffs it made. Turbo Pascal 3.5 shipped the text of its error messages as a separate file so that programmers could delete it and save 1.5 KB of disk space and memory. It was good that they put in the work to make that possible, but thank god we don't need to do that anymore. Turbo Pascal 5.5, which I've been using, embeds the messages directly, but they're still pretty sparse. I guess it's not completely out of the question that you could cram the kinds of detailed messages present in Rust (or that are increasingly being added to Python), along with the accompanying logic, into a binary roughly the size of Turbo Pascal 5.5's. I suspect the design choices that make Pascal quick to parse would give such a project a fighting chance. Still, it would be tough.
You can also see the limits of this era, even with clever optimizations and plenty of implementation drudgery, in posts like A Spell Checker Used to be a Major Feat of Engineering. Hague points out that /usr/share/dict/words is 2 MB; some systems of that era didn't even have that much total memory. It's not enough to just drop the uncommon words, either. Thus, rather than spell checking being something you just drop in or get for free from your browser or OS, it was the major feat of engineering from the title, and it took time away from other features.
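To get a feel for why this was hard, one classic space-saving trick for dictionary lookups is a Bloom filter: a fixed-size bit array that answers "is this word in the set?" with occasional false positives but never false negatives. This is a toy sketch to illustrate the tradeoff, not Hague's design or any specific historical spell checker; the word list, sizes, and hashing scheme are all made up for demonstration.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: set membership within a fixed bit budget.

    False positives are possible (a misspelling may occasionally slip
    through), but false negatives never are (a correct word is never
    flagged as wrong) -- an acceptable tradeoff for a spell checker.
    """

    def __init__(self, num_bits: int, num_hashes: int):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8 + 1)

    def _positions(self, word: str):
        # Derive k bit positions per word from salted SHA-256 digests.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{word}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.num_bits

    def add(self, word: str) -> None:
        for pos in self._positions(word):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, word: str) -> bool:
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(word))

# Toy dictionary and deliberately tiny parameters.
words = ["program", "pascal", "compiler", "spell"]
bf = BloomFilter(num_bits=200, num_hashes=4)
for w in words:
    bf.add(w)

assert all(w in bf for w in words)  # no false negatives, ever

# The appeal is the arithmetic: at roughly 10 bits per word, a
# 250,000-word dictionary needs about 305 KB of bits, versus the
# ~2 MB of raw text in /usr/share/dict/words.
print(250_000 * 10 / 8 / 1024)
```

Even with a trick like this, fitting a usable dictionary into a machine with well under a megabyte of memory took real engineering, which is exactly the point of Hague's post.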
There's actually a page on the Free Pascal wiki, Size Matters, that suggests to me this fetishization can become truly unsparing. Free Pascal seems to produce binaries that are consistently larger than Turbo Pascal's[2], but they're still a reasonable size: hundreds of kilobytes or a handful of megabytes. To pick an example I'm pretty well-versed in: compiling Clojure ahead of time into a static binary frequently creates larger binaries still, in the tens of MBs. Looking at the binaries installed on my computer, that's on the large side, but not extraordinary. If Free Pascal users are complaining about much smaller binaries than that, as the wiki article suggests, they're likely either working in specialized situations or not being realistic.
As I've been working on this post and dropping in the titles, I realized how clear and appealing they are. Programming in the 21st Century may sound like a throwaway title until you realize that when Hague started programming, the 21st century was legitimately far away. With the pace of technological development, we are in a different era. (You could probably make the case for several eras passing between then and now, in fact.)
Overall, there's a focus on the end product. While I enjoy retrocomputing, programming language theory, and a bunch of other technology rabbitholes, I do want to make things that are creative, useful, or even both. I see Programming in the 21st Century as a useful reminder to do just that.
The blog finished up in 2017, with Hague saying he had said all he meant to. While I'm sad to see it wrap up, it ended on a high note. I wish Hague success in whatever games or programs or non-tech projects he's doing now.
[1] I'm not talking about specific claims like "garbage collection causes too much latency for demanding games," even if I don't fully agree with them. I'm talking about claims that garbage collection is generally a sign of laziness or the reason programs today are so slow.
[2] I'm not just making this comparison because I've been using Turbo Pascal; it actually comes up in the wiki.