It’s not quite the same as the days when he’d routinely link to images of extreme body-mod people doing their thing, but Warren has once again managed to show me a thing I’ll never be able to unsee by linking to the Mr Beast production team manual, a deeply depressing document which is perhaps my new exemplar case for why we can’t have nice things.
Worth noting also, I think, that it pretty neatly dispels the philanthropic argument in favour of that guy’s work: as you can see from that document, if kicking puppies and shitting in baby-strollers was demonstrated to get a good rate of click-through and they thought they could get away with it, they’d just as happily do that. So fuck that guy and all who sail in him, frankly.
(That said, I did spend my time on the rowing machine this morning toying with the idea of some Riddley Walker-esque short story in which this memo has ended up as the sacred text of some ruin-dwelling remnant of humanity… but while I disagree strongly with the “too many dystopias” people, this gag would be too obvious and frankly unfunny to justify the effort, let alone the attempt to sell it. So perhaps I’ll just spend the weekend scouring the second-hand stores for a used dartboard on which to pin a print-out of his extraordinarily punchable face; it’ll scratch the same itch.)
And while I’m yelling at clouds, I might as well recruit Henry Farrell, whose most recent observations regarding the way in which large models gravitate toward the already popular, and thus amplify cultural conformity and banality, seem to me to be a very obvious statement of why we should be intensely distrustful of a technology which is, in essence, taking Mr Shithead’s business model and making it an all but autonomous turnkey service:
The plausible destination that LLMs conduct us towards is not entropy, or at least not entropy any time soon, but a cluster of cultural strong attractors that increase conformity, and make it much harder to find new directions and get them to stick. The more that we rely on AI in its current form, the more that human culture (including scientific culture) will converge on what is central.
Still, as much as the banality and conformity depress me, I am always brought back to the question of telos—of what the point of this generative crap is meant to be. Again, Mr Nihil’s memo holds the answer: it all boils down to “number go up”. Nothing else matters.
And now it seems the FAANGs—oop, sorry, my mistake; apparently we call them “hyperscalers” now?—are getting a boner for nuclear power as a way of keeping this stupidity running cheaply enough that their own number can also keep going up forever¹, and I think it’s time for me to go outside and take a walk in the sun, because I’m starting to entertain the possibility that the decline of our species to a language-less tree-dwelling remnant, as per the final section of Stephen Baxter’s Evolution, is a far better fate than we deserve.
1. The headline here is a classic example of Betteridge’s Law in the wild.