It’s yer man Dave Karpf again, with a great illustration of an otherwise hard-to-explain historical vibe-shift:
I think one reason why people in my age bracket have such strong, implicit faith in Moore’s Law is that it was part of our shared reality for such a long time. Consumer tech really was getting significantly better and significantly cheaper, at a pace that you could not help but notice.
I saved up all summer in 1998 to buy a nice stereo. (It played tapes AND cds!) Four years later, my friend Becca was showing off the clickwheel on her new iPod.
In 2021, the keyboard on my laptop started having trouble. The “e” key stopped working. I checked, and found the computer wasn’t under warranty anymore. Turns out I had bought it way back in 2012. It still worked fine, except for the damn “e” key.
When I started college in 1997, a nine year old computer (from 1988!) would, for all practical purposes, not be a computer at all.
To be fair, that 1988-vintage computer would probably work just fine if you powered it up in 1997; Karpf’s point is that what you could actually do with it would have seemed incredibly crude.
(I started inheriting my old man’s obsolete PCs circa 1990, so I’m very familiar with the machines of that era and their limitations… and likewise very familiar with the sense of hyperfast generational turn-over in computer hardware through the Nineties, due to working with the forerunners of what we now think of as DAWs.)
One might push back on Karpf here and say that a 2012 PC would likewise seem archaic in its capabilities, but I—and, I assume, Karpf—would counter that, unless you were doing something that required a lot of graphics juice (e.g. video editing, top-line games), a 2012 machine would actually be eminently capable of doing pretty much everything you needed, in terms of essential function. Why, then, do our computers need to be so much more powerful and capacious?
Firstly, because operating systems (particularly, but not exclusively, Windows) have become bloated to the point of being morbidly obese, consuming a huge slice of the available resources on unnecessary gimmicks.
Secondly, you’re most likely to notice slowness in your web browser, which is partly because web browsers have in effect become an OS inside your OS (yo dawg, &c &c), partly because modern websites do a whole bunch of stuff client-side (because why pay for more capacity in your hosting when you can let the visitor’s machine take the strain?), and partly because the rendering of ads and trackers and all that jazz (and/or the running of countermeasures to exclude ads and trackers and all that jazz) has also escalated.
What you want a computer to do hasn’t changed. What’s changed is the amount of treacle that OS monopolies and website delivery frameworks pour into your computer’s gearbox.
I mostly avoid being This Person, because why open yourself to the risk of being forced into a stereotype and pilloried for it?
But just this once:
In this I would like to respectfully disagree with Karl Schroeder. He’s not wrong that a *nix OS will sometimes fail in ways that are incredibly hard to deal with, even if you’re (as both Karl and I surely are) reasonably tech-literate. But honestly, given the incredible and very regular difficulties that the average works-in-an-office not-tech-literate user can have with a fully upgraded version of Windows running on a machine more than two years old, I think we should stop using that excuse1.
Encourage more Linux installs! Enable the creation of tech-support jobs which are disconnected from the sham of Microsoft’s cash-cow certification programmes! Enable a massive increase in the viable lifespan of computer hardware! Kick one of the big monopolies of the early C21st firmly in the bollocks!
Moore’s Law was always a hand-wave, but as Karpf notes, there was a feeling there for a certain period which meant that people smart enough to know better were willing to ignore the plot-holes; it was a good story, an exciting story, a story that justified a whole lot of things, and so finding a way to extend the franchise was easy. And sure, we got faster and more powerful computers—which we’re now using to serve/squelch ads on bloated websites that only a developer could love, and to crowbar the direct descendants of Clippy into software that would benefit far more from having features removed than added.
The supposed exponential of technological acceleration was always really just a succession of S-curves, in which the almost-exponential middle phase of the next new thing becomes the new metric just as the asymptotic end phase of the last new thing starts crapping out.
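(If you want to see how that fake-out works in numbers, here’s a throwaway sketch in Python—every figure plucked from thin air purely for illustration, not real data—of staggered logistic S-curves whose running total passes for an exponential right up until the next curve fails to arrive.)

```python
import math

def logistic(t, midpoint, steepness=1.0, ceiling=1.0):
    # A single S-curve: slow start, near-exponential middle, flat asymptote.
    return ceiling / (1.0 + math.exp(-steepness * (t - midpoint)))

def total_capability(t, generations=5):
    # Each new "generation" arrives 10 time-units after the last, with a
    # ceiling ten times higher -- arbitrary toy numbers for illustration only.
    return sum(logistic(t, midpoint=10 * k, ceiling=10 ** k)
               for k in range(generations))

for t in range(0, 70, 10):
    print(f"t={t:2d}  capability={total_capability(t):10.1f}")
```

While there’s still a next curve in the pipeline, the total climbs roughly tenfold per generation and looks exponential; once the last one saturates, the number just sits there.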
Moore’s Law is thus a synecdoche of that whole transhumanoid/Singularitarian notion of accelerating technological change. It is, to use a term from science fiction, a fix-up: a set of smaller stories stapled together crudely in order to look like a full novel. Readers who are hungry enough for a long-form work with the generic vibes they already love will happily overlook the shoddy welds and bodged joins.
Or, if you want a much cruder metaphor: people will do almost anything to convince themselves the party is still fun until the cocaine runs out.
Well, the cocaine has run out.
“The Future” is dead, and we all know it. But because we’ve been successfully convinced that we have to wait for professionals to provide us with The New Future, rather than making our own, we’re stuck in something like an endless generative loop based on Weekend at Bernie’s.
1. If you’re sat there feeling smug about having opted for Apple’s ecosystem instead of Microsoft’s, well, enjoy it while it lasts, because it won’t. ↩︎