Category Archives: Technology


(not) giving it the progressive legitimacy it would lack otherwise

One of the joys of having unplugged from the birdsite again is being able to largely ignore the whole crypto/Web3/NFT circus, at least in its most immediate expression. Of course, various people are writing about it more slowly, and it’s probably a function of my pre-existing biases that the vast majority of what I’ve read tends to cash out as academic or practitioner-accented versions of NOPE NOPE NOPE. Without any shame for the hipsterness of the statement, I’ll note that I was skeptical of this stuff when it was still new (and I have the receipts to prove it).

But when people I respect contradict or challenge me, well, I do my best to listen. Here’s yer man Matt Colquhoun:

The world is changing, both on- and offline, but our imaginations are slow to catch up. Without an insistence upon it proceeding otherwise, Web3 will be (and is being) used to replicate the pre-existing cultural hegemony of Funko-Populist finance bros.

Let’s just stop to do a full-on gatsby.gif at that lovely coining in the last line, there. Chapeau, sir.

Now, Matt seems to me to be worried that by NOPEing out of this space entirely, we’re giving up the chance to seize the potentially good bits of this assemblage. On that point, I agree. But my instinct—and I will gladly concede that it is very much an instinct, one nurtured by the intense disillusionment of the Noughties blogging goldrush (in which I arguably did moderately well, albeit in a very drawn-out and roundabout sort of way), but also by, ah, let’s just say an earlier stage of life during which I was exposed to an awful lot of hucksterism and hustle of an even more naked sort—my instinct, leavened with a bit of research (though not so much as an advocate would insist was a precondition of having an opinion), says to me that there’s nothing there to fight for; or, if there is, the triumph of the very worst potentials thereof—already very much in the ascendant—is effectively baked in due to its unfolding within the inducement structure of capitalism more broadly.

None other than Evgeny Morozov sees it as being worse still: by looking for the bright side of this mess, we end up giving it a veneer of progressive respectability:

How does one criticize a flawed, unrealistic, and extremely partial narrative that is, nonetheless, being rapidly turned into reality? This is not a problem that one can solve by adopting a more pragmatic, solutions-oriented attitude that many of the proponents of Web3 demand from their critics. The goal here cannot just be to find a more progressive use for DAOs or tokens or NFTs. I’m sure they exist – and many more of them can be found in due time. But what is the point of such search expeditions, when, in the end, such efforts are only likely to help in the left-washing of the Web3 brand, giving it the progressive legitimacy it would lack otherwise?

As he puts it, “there is no ‘there’ there”; the self-referentiality of the whole edifice means anything you do to fight it just gets hoovered up by the rhetorical cyclone.

But back to Matt:

But there are a number of alternative visions out there — the latest issue of Spike Art magazine contains advocates for a bunch of them, who are both optimistic and pessimistic about the current state of things. The worry I have, and that many others have, is that it may already be too late. What depresses me isn’t so much how NFTs are being used by the internet’s most naïve denizens, but that their idiocy atrophies the political imagination of the rest of us.

In that sense, the responsibility for our unabating digital dystopia lies as much with the mindless naysayers as it does with the mindless enthusiasts. The narcosis of an old digital radicalism is developing necrosis. Something has got to give, but we need to realize that this needn’t be the communities we hold dear in themselves. There is space for them to well and truly thrive, if we demand and carve out that space, just as we did when the internet first became available to us.

Now, I have a lot of time for Matt’s negation-of-the-negation argument, to the point that I have one phrasing of it blu-tacked to the wall above my desk. Maybe it’s just a function of me being An Old nowadays, but I think the reason for the necrosis of digital radicalism is the acceleration of the capture process with each new iteration of the digital frontier… plus, perhaps, a dawning realisation that perpetually turning to the next frontier is a foundational plank of the thing we’re trying to fight against.

To reiterate a point from a few days back, this ain’t me going all primitivist and suggesting “we can do without technology”; far from it. But I do perhaps feel that we need to get away from this attitude in which the technological is often or always the site—a non-spatial site, which is perhaps another root of the problem—of the next potential victory. I try not to cite ol’ Grandpa Karl too often, as I don’t think I’ve read enough of him, but I’m pretty sure that his basic argument was that while technology might serve to enable a more socialist world, it could only do so once the political economy in which it operated had been reconfigured. Seize the means of production first, right? Then reorganise the uses it’s put to. So wading into the Web3 shitstorm feels to me like trying to fine-tune (post-)Fordism for socialist ends: totally well-intended, but ultimately of use only to the factory owners.

Matt doesn’t want “the communities we hold dear” to be sacrificed to the necessity of change, and yeah, I hear that. I guess I’m just not so convinced as I once was—and those who’ve known me long enough will know that I was super convinced, a bona fide Web2.0 evangelist—that a change of medium to the next new thing is going to keep those communities vital. To be honest, I think making better, slower use of the superseded media might be a better place to start. The Arab Spring didn’t fail because social media wasn’t sufficiently advanced or decentralised; it failed because the systems of power it was arrayed against were too deeply entrenched, and those media were in turn embedded into those structures from the get-go.

Eh, I dunno—like I say, I’m An Old now, and increasingly identifying with the (historical, rather than vernacular) label of Luddite. Sure, the Web3 powerloom might revolutionise many of the things I do for a living… but even if the nice guys work out a way to do that, is it going to compete with the monkey-jpeg people and Andreessen Horowitz? Not bloody likely, mate. I only have so much fight left in me, and I’m not wasting it in a space where the noise-to-signal ratio (not to mention the VC bankroll) is that high.

Still, good luck to anyone who wants to brave it. Because I agree with Matt’s parting line, as well:

It is our complacency, not Web3, that will be the death of us.

And yeah, maybe I’m just NOPEing out of the definitional struggle of our times… but I can’t see what work there is to be done there, let alone how to start doing it. Perhaps I just don’t have enough of a stake in it? Perhaps the (veeeerrrry relative) security of early career academia has seduced me away from the vanguard? Quite possible.

But I very clearly remember believing that having my own website and socnet handles would lift me out of the neoliberal precariat, and I remember seeing that—even as it did so for a very lucky few of us—it made things even worse for those who missed the bus. My sense that Web3 &c. will be an even crueller and faster clusterfuck goldrush is, as I say above, predominantly instinctual—which is perhaps to say imaginative.

I can’t imagine a metaverse in which things are better for most people. But I can imagine a world in which we’ve decided that chasing our emancipation down the fibre-optic backbones and into the data centres will look, in hindsight, like a very weird thing people once believed, like the indulgences that came off the early printing presses. Progress is the greatest lie ever told, and Web3 looks like the very shiniest empty box it has ever been put in.

Good luck in there, but count me out.

dispensable and scarce

Just a quick subtweetish sort of blog post, here, to note that the more times people start an essay or article or academic paper or blog post with a phrase along the lines of “[d]igital platforms and the online services that they provide have become an indispensable and ubiquitous part of modern lifestyles, mediating our jobs, hobbies, patterns of consumption and forms of communication”, the more reified becomes the supposed indispensability and ubiquity that is supposedly being critiqued.

Just fucking stop it, OK? Farcebork is not indispensable; Scamazon is not (yet) ubiquitous. But conceding in your opening line that “well, they probably will be, so maybe some gestures toward regulation (in a system where regulatory capture is a significant part of the problem) would be good, please, sir?” is to have thrown in the towel before you even step into the ring.

Every time I hear someone talk about the indispensability of Farcebook in particular, I think of junkies queuing for their methadone: “I wanna quit, I really do, but it’s too hard, all my friends are still using it”. Well, if that’s really the case, stop sitting around and fantasising about regulating your dealer; this idle chatter of ressentiment while you wait for Your Man is a big part of his hold over you.

(For the avoidance of doubt, this is not to make the equally fallacious argument that “we can do without technology!”—though if anyone in this audience is still making that sort of argument, I’m not gonna waste my time explaining why it’s both stupid and hypocritical.)

(For the further avoidance of doubt, I’m generally sympathetic to the work of that paper’s lead author, and indeed to most of what the paper actually argues. But that doesn’t change my fury at the implicit capitulation of that opening line. Language shapes social reality; the more you describe the worst parts of our social reality as inescapable, the harder it becomes to escape them.)

the subject has been usurped

Lots of chewy stuff in this M L Sauter joint, jumping off from the seeming climb-down of G**gle’s Sidewalk Labs project in Toronto—which, as Sauter notes, was less of a stoppage than a sort of metastasis, with the ideological cancer scattering away from the site of the obvious tumour—in order to talk about surveillance and image recognition through Sontag’s theories of photography.

It’s all good stuff, though mostly too far outside of my own wheelhouse for me to comment on beyond commending it as good. But I wanted to clip this paragraph because, as someone who developed a hard-to-articulate but very real loathing of photography at a young age, and in particular of being photographed, it really expresses something that I’ve never managed to explain adequately to myself or others:

A photograph constitutes a violation, and capture, of its subject, “by seeing them as they never see themselves, by having knowledge of them they can never have; it turns people into objects that can be symbolically possessed.” The accuracy of the knowledge photography provides in this case is irrelevant. In Sontag’s analysis, the subject of the photograph has been usurped from their creative place at the center of their own story. Their own subjectivity, their ownership of themselves, has been dented by the satisfied knowledge-seeking of the photographer and the photograph. The photograph is not simply a document. It contains the psychological impact of surveillance itself. Data collection and production, through cameras and other sensors, similarly is not simply an information bit. It is an artifact of surveillance, of usurpation. It is the testimony on the subject that supersedes the testimony of its subject. As reality is judged against photographs, so is reality judged against, and expected to accord with, data.

The first half, in particular, really captures the sense I had as a child of somehow being trapped in amber by photographs taken of me… though given this seems not to be much of an issue for a lot of folk around the same age, I’m going to assume it’s also tangled up with the psychosocial dynamics of my family and upbringing (which, without wanting to go all Tiny Violin about it, was very much an experience of lacking agency over my own story).

Then again, I’m not much of one for photographs of other people, either. I often get asked if I have pictures of my family, and while I’m pretty sure there’s one or two images of my sister and my mother somewhere in my digital archives, they’re not at all ready to hand… and I have no images of my father at all. This seems genuinely strange to most people—and perhaps it is, I don’t know. “How do you remember them?” With my mind’s eye, I suppose… which I fully understand to be extremely malleable and unreliable in its depiction of characters and narratives. Does a picture, or an album of pictures, provide anything more than a persistent icon behind which those memories might be filed? No idea. Maybe I should read that Sontag stuff.

a world where flesh and machine are in tension: re-reconsidering cyberpunk

Found myself nodding appreciatively at this re-reassessment of cyberpunk by Lincoln Michel:

Everyone has their own definitions of genres, but to me the essence of cyberpunk is not tied to the 1980s visual trappings that have defined it in video games and film. Cyberpunk isn’t merely neon signs or street toughs with high-tech leather jackets (or its problematic “Japan panic” legacy.) For me, the core of cyberpunk is first as science fiction that fundamentally recoils at the growing power of corporations and unchecked capitalism. That, as Fredric Jameson once said, cyberpunk is the “supreme literary expression…of late capitalism itself.” Secondly, that it is a genre that understands that technology is not clean. Technology is never implemented in smooth and even ways—it is always messy, always unequally accessed. Always (in our world) in service of power and systems.

“The street finds its own use for things,” yeah—the centrality (and deep truth) of that element often seems to be lost in a lot of the trashings of the genre that have been thrown around recently. Though by no means all of them: Doctorow’s take on cyberpunk as Luddite lit is typically idiosyncratic, and Madeline Ashby, with the eye of a novelist who is also a practising critical futurist, identifies the fundamental limitations of the paleofutures—now four decades old—that underpin the genre as most commonly practised.

This is kind of Michel’s point, too, and he gets there by returning to the source and tracing the journey from the meat to the virtual:

Of course, thinking about the effects of technology on the human form is not new to cyberpunk. All genres have tendrils of influences and precedents that stretch back in time, but it seems fair to pick William Gibson’s seminal Neuromancer as ground zero. Gibson’s novel towers over the genre as surely as the Mount Doom of Tolkien rises above the realm of epic fantasy. And Gibson didn’t forget the body. From the first page of Neuromancer we are in a world where flesh and machine are in tension. We begin in a crowded bar filled with addicts and a bartender with a “prosthetic arm jerking monotonously…his teeth a webwork of East European steel and brown decay.” Our hero, Case, is suffering pain from his damaged nervous system. He has fallen “into the prison of his own flesh” without being able to access the matrix of cyberspace. Cyberspace is how Case escapes from the world of flesh. The meatspace.

Other ’80s works like Akira, Tetsuo: The Iron Man, and Donna Haraway’s classic “A Cyborg Manifesto” were even more concerned with the mingling of the human form with technology. But by the 1990s it seems the genre—in the US and UK at least—focused ever more on the virtual realm, often in a giddy way. In the ’90s, the web was the “information superhighway” where anyone could be what they wanted unrelated to the real world. Later cyberpunk novels like Neal Stephenson’s satirical Snow Crash built on this idea of escaping into cyberspace, imagining a cyberspace that is a fantasy video game world. Escapism within escapism. Even the virtual representations of bodies were incorporeal.

Michel’s own approach is to look not at information technology, which he has consciously made banal in his own work, but to focus instead on biotechnology, with the human body once again the site of the struggle between street and boardroom. But as he notes, cyberpunk, in its slow recuperation as a nostalgic aesthetic, went on to fetishize that which it once critiqued, and to abandon embodiment as the site of struggle.

Cyberpunk is typically thought of as a dystopian genre. But what had begun as a cautionary tale became a celebration. Isn’t all of this really damn cool? Wouldn’t you like nothing more than to be a hacker god swinging swords and dodging bullets free from your corporeal form?! As cyberpunk went further down this path, the body disappeared more and more. At the same time, the fundamental critique seemed to evaporate. Dystopian elements were still tacked on, but in the background like neon holograms. For visual style, not warning. Meanwhile real-world dystopian tech companies and right-wing movements felt free to pluck cyberpunk language (“red pill,” “metaverse,” etc.) for themselves. The end of this cyberpunk path is Ernest Cline’s Ready Player One, where the most exciting thing in the universe is to play a video game populated with corporate trademarks.

Top marks for summing up succinctly why even just plot summaries of Cline’s work have seemed nauseatingly unappealing to me.

I would note, though, that—for all the other flaws they might be argued to have—Gibson’s most recent novels are still very much focussed on embodiment as the site of struggle; The Peripheral is totally about that conflict, and the way it is shaped by political, financial and even temporal power. Furthermore, as I have argued elsewhere, Gibson recognised the merging of cyber- and meat-space quite early; it’s arguably the central idea of the Bigend trilogy, in which one character notes in passing that (and I paraphrase) “cyberspace has everted”.

Of course, as Michel says at the start, everyone has their own definition of any given genre, and it may be that the discourse around cyberpunk will get stuck on the co-opted late-phase fantasy-virtuality definition for years to come. Mostly it’s reassuring to see someone else respond to that definitional shift in much the same way I have in the last decade or so.

But as someone from the heavy rock side of the musical divide—and there’s a generic cluster that is very much in the cultural doldrums right now, reflexively associated with rockism, Boomers, and the worst-case Durstian cliches of the nu-metal era, even as the former underdog of hip-hop shambles ever further into its own bloated and drug-addled stadium-show hegemony—I’m aware that genres never die, they just shrink to a point where those who’ve always seen something to love in them can go off and make something new out of that discarded form or style.

Michel mentions Maughan and Newitz as both having found ways to return to the political (rather than simply aesthetic) heart of cyberpunk, and his own novel sounds interesting enough that I’ll be ordering myself a copy. The flame is still alight.

epistemic humility vs. “the engineer’s disease”

This post is prompted in part by a post by Cennydd Bowles, in which he riffs on Nathan Ballantyne’s notion of epistemic trespass. Reading it reminded me of a term I’ve seen frequently, most often on MetaFilter, where it has been part of the lexiconic furniture for some time. An ask-the-hive-mind entry on that site traces the notion of “the engineer’s disease” back to 2002, so it’s plausibly an internet-era coining—which, interestingly, means it’s a term critical of engineering which emerged during what might be seen as the peak of engineering’s cultural hegemony, and thus in hindsight a weak signal of sorts.

The engineer’s disease is related to (and perhaps subsumed by, if not subsuming of) the briefly better-known notion of solutionism; here’s a description from that MeFi page:

engineers and other technical folks assuming their technical knowledge of systems (usually computer, mechanical/electrical) gives them expertise in solving other complex issues

Very much a lay precursor to Ballantyne’s trespass, then, but the concept is much older than the term. As someone else in the MeFi thread points out, Vonnegut’s Player Piano (1952) is a satire of “Engineer’s Disease writ large”, as they illustrate with a pithy quote from the novel which immediately brought me back to the experience of reading it:

“If only it weren’t for the people, the goddamned people,” said Finnerty, “always getting tangled up in the machinery. If it weren’t for them, earth would be an engineer’s paradise.”

Bowles notes that epistemic trespass is ubiquitous in the public-intellectual sphere, and (rightly, I think) connects it to the capitalist imperatives that power the hot-take attention economy, albeit using kinder words than I have chosen. But he also notes its particular prevalence in the tech scene, broadly conceived, and his explanation for it seems plausible, particularly given his own identification with that scene:

Dabbling got many of us here in the first place, and a field in flux will always invent new topics and trends that need diverse perspectives. But by definition, trespass happens on someone else’s property; it’s common to see a sideways disciplinary leap that puts a well-known figure ahead of existing practitioners in the attention queue.

This is certainly inefficient: rather than spending years figuring out the field, you could learn it in months by reading the right material or being mentored by an expert. But many techies have a weird conflicted dissonance of claiming to hate inefficiency while insisting on solving any interesting problem from first principles. I think it’s an ingrained habit now, but if it’s restricted to purely technical domains I’m not overly worried.

Of course, it isn’t restricted to technical domains, so Bowles riffs on Ballantyne to recommend epistemic humility:

It’s easy to confuse knowledge and skills, or to assume one will naturally engender the other in time. Software engineers, for example, develop critical thinking skills which are certainly useful elsewhere, but simply applying critical thinking alone in new areas, without foundational domain knowledge, easily leads to flawed conclusions. ‘Fake it until you make it’ is almost always ethically suspect, but it’s doubly irresponsible outside your comfort zone and in dangerous lands.

No one wants gatekeeping, or to be pestered to stay in their lane, and there are always boundary questions that span multiple disciplines. But let’s approach these cases with humility, and stop seeing ourselves as the first brave explorers on any undiscovered shore.

We should recognise that while we may be able to offer something useful, we’re also flawed actors, hampered by our own lack of knowledge. Let’s build opinions like sandcastles, with curiosity but no great attachment, realising the central argument we missed may just act as the looming wave. This means putting the insight of others ahead of our own, and declining work – or better, referring it to others who can do it to a higher standard – while we seek out the partnerships or training we need to build our own knowledge and skills.

As Bowles notes, no one wants gatekeeping… but as one of the foundational notions in STS assures us, gatekeeping—or rather the production of “boundary objects”—is exactly what allows us to even talk about “spheres” and “domains” of expertise in the first place. Which is perhaps to say that while no one thinks they want gatekeeping, we all do it, and it’s actually a vital part of how ideas move between domains; when it’s done right, those ideas retain a coherence and usefulness across their sites of use while allowing for more specific deployments at each site. When it’s done badly, well, you get what Bowles and Ballantyne and Vonnegut are talking about: the transposition of ideas as particularly parsed by engineers (or techies, or coders, or whoever else it might be) into spheres where those parsings are inappropriate, no matter how well-intended, and potentially destructive.

As with pretty much everything he writes, Bowles here aims to draw out the inherent good that he sees in the denizens of the tech domain, and I would note that it is this essential generosity of spirit on his part that makes him a writer I always read: to put it another way, while he thinks with domains, he does not think with them deterministically (in the way that the “engineer’s disease” label definitely does).

I believe (or at least I believe I believe) similarly that most people are basically decent and well-intentioned, but I suspect I take a more structural view than Bowles on the shaping of intention into action—which is to say, I suspect that the time for making reasonable pleas for the tech domain to wind its neck in a bit has long since passed. This is not due to any fundamental malice or recalcitrance on the part of engineers and techies, to be clear, but rather the extent to which that domain has achieved hegemonic levels of control and influence over economic and discursive systems.

Which, I suppose, could easily be parsed as a call for “more regulation”… and I suppose that, in a way, that’s exactly what it is. But it’s also a call for a reconceptualisation of what regulation means, as well as how it’s executed; regulatory capture is probably one of the biggest factors in the securing of that hegemony. Which means that this is really a call for a revaluation of our values around technology, with the proviso that for me the category of “technology” extends to things like regulation and governance as well as, y’know, gadgets and apps and such.

Indeed, perhaps that particular conceptualisation of technology, which—as many of my readers will already be aware, but, for the avoidance of doubt—is definitely not original or unique to me, is the keystone to the change I’m calling for. But it would have to be in turn a part of a broader renunciation of the implicit supremacisms of humanism itself, which in turn would need to recognise that such ideas can never be conquered, only dealt with dialectically.

These thoughts were brought to you by my having re-read a bunch of Latour over the holidays, and by my having binged my way through Claire North’s brilliant Notes from the Burning Age in the last two days; I’ll hopefully find the time to write up my thoughts on the latter soon.