Category Archives: Social Theory

epistemic seismology

The always insightful Ryan Oakley wrestles with reality:

Some stories may seem more true than others, some more pleasing, and others more dangerous, but no matter how true, beautiful or deadly, they are stories. Our reality is woven from stories – tales invented by readers just as much as authors – and our personalities are only stories we knit into a theme and give a name. There may well be some hard base of facts at the bottom of all these lies but, if we ever find it, it will be through stories.

Right now, though, I feel like the theme is in tatters. On a societal as well as a personal level, it feels like the stories are moving at cross purposes completely disconnected from each other and any base of facts. Reality itself feels strange, stretched, and strained. The contradictions are immense. Reality has a web of stress fractures. The cracks in order may be how the light gets in but it’s also how the weird leaks into the world. At night, in the dark, the light itself is weird. Blindingly odd. The full moon stands out against black space.

[…]

I can understand why some people just throw their hands up, declare the whole thing some sort of nefarious prank, and say “fuck it.” I don’t agree with them at all, and it’s bleakly hilarious to hear people who compare free medicine to the Holocaust claim other people are under mind control from Big Pharma. (Like, friend, if you think cheap medicine or anything that might even slightly reduce the grotesque profit margins of the medical industry can be compared to genocide, maybe, just maybe, you’re the one under the fucking spell.) But I can understand the urge to scream FAKE. It would simplify matters.

Oakley notes at the top of that post that we who write are perhaps better equipped to understand reality as a braid of stories, and he quotes postmodernity’s prophet Philip K Dick to illustrate the point. I’m inclined to agree, not least because I was saying something similar to myself in my morning pages today… but at the same time, I am obliged by conditioned instinct to factor in the bias inherent in that way of understanding things. It’s a bit like the old everything-looks-like-a-nail problem: if you have trained yourself to see the world as stories, then story will be your first port of call to provide a model for any given phenomenon. Same reason economists see everything as a case of supply and demand, or evangelists see everything as the struggle between the cartoonishly-drawn forces of Good and Evil… and it bears noting that we are all conditioned by multiple narrative templates at once. (Hell knows there’s a big ol’ fandom a-squat on the intersection of neoclassical economics and fundamentalist Xtian evangelism.)

Of course, as a trained social scientist with an (un)healthy amount of theoretical reading under my belt, I’m also conditioned to be reflective about my own thinking, and hence to look at it and think “yep, I’m probably being shaped by structural forces here”. Though sometimes that mostly feels like I’m cut off from the comfort of the simplistic conspiracy-centric thinking that Oakley mentions in the excerpt above. I certainly have a much more visceral appreciation nowadays for how things like the witch trials managed to roll on for so long; the hunger for a tangible enemy or scapegoat is a real thing.

Anyway, mostly clipping this because it feels like a really good example of someone describing the experience of living through an epistemic collapse without using five-dollar academic terms like “epistemic collapse”. Nonetheless, I’m pretty sure that’s what’s going on here. The pandemic, as a sociopolitical phenomenon as well as an epidemiological one, torpedoed a major bulkhead in the structure of the hegemonic narrative (which we might label with various other five-dollar terms, e.g. capitalist realism, neoliberalism, late-capitalist modernity, take your pick); a lot of implicit promises about life as a middle-class person in a ‘developed’ country have been shown to be contingent at best, and outright fictional at worst. Perhaps the most terrifying of these has been the implicit promise that “no one dies before their time any more”—which, to be clear, was always already a fiction, a belief that could only be sustained by limiting your view to people in much the same situation as yourself. People die “before their time” all the time; you just don’t hear about it, because they’re not on your social radar.

What’s interesting about this situation—and I mean interesting here in an admittedly somewhat ghoulishly academic-on-the-bridge-of-the-Titanic sense, mixed in with that writerly sense that Oakley describes, in as much as they can be considered two different senses at all—is that while there have been epistemic tremors before, they’ve tended to be corrected by the stabilising feedback mechanisms of networked capital. (Even capitalism’s supporters would likely concede that it’s something of an autopoietic system, even if they wouldn’t use that term; Adam Smith’s much-misparsed invisible hand is a pre-systems-theory attempt to describe the autopoiesis of a market system.) But as smarter and better-read people than me are pointing out, that system is currently quaking itself apart in its attempts to accommodate this sudden new perturbation… the contradictions are hanging out for all to see, like a drunk economist’s junk at the faculty Xmas party.

Even if the economic system can be restabilised, the sociopolitical contortions and contradictions necessary to achieve that end have tipped over and scrambled the prevailing logics of the hegemonic plot. To return to writerly metaphors: the collectively-written-in-realtime novel in which we have been living has strained our suspension of disbelief beyond breaking point. Seen from that perspective, it’s perhaps no wonder that everyone is reaching for their standard-issue epistemic author(ity)-figure fallback as something to have faith in.

(As a parting aside, returning to the rest of Oakley’s post: they sure as shit look like cats to me. But that’s the point, I think; interpretation is always a matter of context, and telling people that they’re wrong in a period of epistemic collapse is counterproductive. In a way, it doesn’t matter what those critters in his photo “really” are. They presented as inexplicable, and that emotional truth will never be conquered by any amount of factual argument that refuses to engage with the affect of the experience. There’s a lesson here for all of us, I think, though it comes at a time when it is least easy to absorb.)

honeymoon objectivity

Serendipity, thy name is INTERNET. Currently in the midst of working up a big old grant application*, and what should appear but this piece from Sun-Ha Hong at Real Life, neatly filling a reference gap that’s been bugging me for a few weeks? Preach it, brother:

Fredric Jameson once wrote that science fiction has become not a place for encountering utopia, but a testament to “our incapacity to imagine the future,” and to the structural limits placed on our political imagination. Today, product demonstrations are as much an example of science fiction as any other popular entertainment. Successive generations of recombined slogans and wondrous objects help recirculate the same old futures, pulling us back to a world of suits in cubicles and aprons in kitchens, evoking that soothing mid-century dream in which we were supposedly modern, and nothing really fundamental needed to change about society.

How different, really, is the latest generation of unlikely promises? Artificial intelligence, now inflated to describe a wide variety of systems that are neither artificial nor intelligent, provides recycled fantasies of instant consumption and self-driving cars that reprise the dream of convenience as freedom. AI also forms Big Tech’s route to maintaining and strengthening its supply of military funding by reviving Cold War narratives of a technological arms race. The constant death and rebirth of words and things masks the closure of the future: If bitcoin is starting to feel old and tired, then why not NFTs? If Second Life or Google Glass didn’t cut it the first time, why not the metaverse?

In my book, Technologies of Speculation, I call this honeymoon objectivity: the incitement to fall in love with each new technology just as we break it off with the previous one, maintaining a stagnant cycle in which the next great invention, the next transgressive genius, again promises to deliver a utopia of frictionlessness and objective certainty. But this recycling of technofutures is fundamentally a conservative force, in which a highly limited selection of technical benchmarks, use-cases, and social relations are dressed up over and over again, with no thought to whether they’re worth preserving, or what could be built in their place. As Jameson hinted, to be transfixed by the future is to be paralyzed by it.

Shazam. Perfect.

[ * Said grant application is merely one of a half-dozen parallel writing projects currently ongoing, most of which are directly day-job relevant… which is one reason why it’s been a bit quiet here at the blog-homestead, for those who were wondering. It’s good to finally have some intellectual mobility, after the pandemic-isolation/broken-foot combo knocked me out for a long while… but damned if I’m not back on my bullshit, taking on too many tasks at a time. Still, it’s gotten me this far, hasn’t it? To misappropriate Deleuze’s twist on Spinoza, we still don’t fully know what this body can do… ]

Solnit’s hope vs. Arendt’s natality

Rebecca Solnit’s definition of hope is so succinct a summary of my own definition that I assume I must have picked it up from her (and from others who got it from the same source). This version is from a new interview at LARB, which I’m stashing here so I can cite it properly going forward:

I never describe myself as an optimist. An optimist is someone who thinks things will be all right no matter what. It is the flip side of being a pessimist, which means thinking everything will be bad no matter what. What I am is hopeful. Being hopeful means there are possibilities, but it is up to us to seize them and make something of them. We will see.

Interesting to compare this to Samantha Rose Hill’s reading of Hannah Arendt’s definition of hope:

It was holding on to hope, Arendt argued, that rendered so many helpless. It was hope that destroyed humanity by turning people away from the world in front of them. It was hope that prevented people from acting courageously in dark times.

Now, I’m not about to gainsay Hannah Arendt, nor Rose Hill’s reading thereof—but nonetheless it appears that Arendt is using the term in a very different way to Solnit: Arendt’s hope is much more like Solnit’s optimism, or so it seems to me. (It would be interesting to do a proper philological dig into the etymology of hope, and its different expression in the various Germanic languages.) That leaves Arendt’s natality as a plausible counterpart to Solnit’s hope:

An uncommon word, and certainly more feminine and clunkier-sounding than hope, natality possesses the ability to save humanity. Whereas hope is a passive desire for some future outcome, the faculty of action is ontologically rooted in the fact of natality. Breaking with the tradition of Western political thought, which centred death and mortality from Plato’s Republic through to Heidegger’s Being and Time (1927), Arendt turns towards new beginnings, not to make any metaphysical argument about the nature of being, but in order to save the principle of humanity itself. Natality is the condition for continued human existence, it is the miracle of birth, it is the new beginning inherent in each birth that makes action possible, it is spontaneous and it is unpredictable. Natality means we always have the ability to break with the current situation and begin something new. But what that is cannot be said.

(In the spirit of honesty, I must confess to finding something unsettling about the connection of futurity to “the miracle of birth”; perhaps this is an expression of an institutionalised misogyny on my part? I both hope and believe that it is not… but if it were, then by definition I would believe it to be something else, I guess. Which is another unsettling thought… and perhaps the more pertinent of the two unsettlements for me to address.

But the idea that “the children are our future” has always seemed to me—a childless person by personal choice, rather than by political conviction—to be a way of kicking the can of change down the road, even if not intentionally or consciously: “well, we’ve made a mess of things, but if we bring the kids up OK, they can sort it all out when we’re in our dotage!” And I guess that, as a recent exile from Rainy Reactionary Island, I currently find it rather hard to believe that generations in their dotage will actually accept their children trying to change anything at all while they’re still alive.

Which is not, to be clear, to claim that there’s some inevitable conservatism inherent in parenthood… though it is perhaps to suggest—as I believe many feminist and post-feminist theorists have already done at great length—that the nuclear family is the institution that does the majority of the cellular-level work of reproducing capitalist relations. I dunno… this is one of the many fields where I need to do a lot more reading than I already have.)

the most animal of all human abilities

It is perfectly possible, then, that far from being an exclusively human attribute, the narrative faculty is the most animal of human abilities, a product of one of the traits that humans indisputably share with animals and many other beings—attachments to place. Perhaps, then, storytelling, far from setting humans apart from animals, is actually the most important residue of our formerly wild selves. This would explain why stories, above all, are quintessentially the domain of human imaginative life in which nonhumans had voices, and where nonhuman agency was fully recognized and even celebrated. To make this leap may be difficult in other, more prosaic domains of thought, but it was by no means a stretch in the world of storytelling, where anything is possible.

The shrinking of the possibilities of this domain, and the consequent erasure of nonhuman voices from “serious” literature, has played no small part in creating that blindness to other beings that is so marked a feature of official modernity. It follows, then, that if those nonhuman voices are to be restored to their proper place, then it must be, in the first instance, through the medium of stories.

feral empiricism

Not sure if the title term is Mark Carrigan’s own coining, here—it seems to be a g**glewhack, so maybe it is?—but I like it enough that I’m stashing it here, with some of the material for context:

The phrase “do your research!” is ubiquitous across the subcultures which have popped up amidst platform capitalism’s epistemic chaos. […]

It often goes hand-in-hand with a feral empiricism in which the mediated construction of reality is rejected out of hand in favour of what can be seen with one’s own eyes. This treats tweets, posts, memes and videos as something like sense data which are encountered without the gatekeeping of media elites, unless liberal big tech firms start to censor the free flow of this data because they don’t want people to know the truth!

Once you accept that, as Will Davies once put it, you’re not going to convert people who have gone flat earth or who believe in QAnon by ‘hurling facts’ at them, it becomes crucial to understand these subcultures as cultural forms, which have emerged in the chaotic environment which social media platforms have generated…

What’s particularly interesting to me is the extent to which the circumstances of the pandemic have brought out what I feel to be a related form of feral empiricism in people whose job description should include a double rejection of it. Seeing academics from philosophy-of-science and STS backgrounds trotting out rather more wordy versions of “follow the science!”—as if there were a single, static and settled science to follow—has made for a few deeply difficult conversations, which I have for the most part totally avoided having.

Now, this is a different thing to the feral empiricism Carrigan’s pointing at here, certainly—but it’s surely related (and definitely subcultural, inasmuch as you may be willing to accept epistemic communities in academia as subcultures, which, as a veteran of many subcultures prior to arriving in academia, seems to me an astonishingly easy thing to accept). I expect the relation comes from the shared experience of a desperate need for a simple explanation of (and guidance for a response to) a confusing, scary, and perpetually-evolving situation.

And when you add in the widespread fallback to what we might call a sort of vulgar interpretation of the information deficit model of science communication—which, to caricature more than a bit, boils down to “maybe they’re resisting because we’re not yet sufficiently hectoring and shaming their ignorance?”—well, I think there’s probably some sort of case to be made for the recourse to comforting articles of faith (secular or otherwise) in times of crisis.

But that opens another question: is this analysis just my own version of a comforting article of secular faith?