More KSR on anti-anti-utopianism, this time at Commune Magazine:
Clearly we enter here the realm of the ideological; but we’ve been there all along. Althusser’s definition of ideology as the imaginary relationship to our real conditions of existence is very useful here, as everywhere. We all have ideologies; they are a necessary part of cognition, and we would be disabled without them. So the question becomes, which ideology? People choose, even if they do not choose under conditions of their own making. Here, remembering that science too is an ideology, I would suggest that science is the strongest ideology for estimating what’s physically possible to do or not do. Science is AI, so to speak, in that the vast artificial intelligence that is science knows more than any individual can know—Marx called this distributed knowing “the general intellect”—and it continually reiterates and refines what it asserts, in an ongoing recursive project of self-improvement.
That’s the dovetail I didn’t know I was looking for that connects to this recent NYT longread on Oncle Latour:
Crowded into the little concrete room, we were seeing gravity as Latour had always seen it — not as the thing in itself, nor as a mental representation, but as scientific technology allowed us to see it. This, in Latour’s view, was the only way it could be seen. Gravity, he has argued time and again, was created and made visible by the labor and expertise of scientists, the government funding that paid for their education, the electricity that powered up the sluggish computer, the truck that transported the gravimeter to the mountaintop, the geophysicists who translated its readings into calculations and legible diagrams, and so on. Without this network, the invisible waves would remain lost to our senses.
I’ve spent more time than I’d like to admit hanging around the online communities of the kind of people we are worried about reaching here, and I am here to tell you: They are using their critical thinking skills.
They are fully literate in concepts like bias and in the importance of interrogating sources. They believe very much in the power of persuasion and the dangers in propaganda and a great many of them believe that we are the ones who have been behaving uncritically and who have been duped. They think that we are the unbelieving victims of fraud.
Which is not to set up some kind of false equivalency between sides. But I do want us to consider the possibility that we don’t need to talk across that barrier, and that it might not be possible to talk across it. That if it’s true that vast swaths of the voting populace are unbelieving victims of fraud, there’s not much we can do for them. That we may need instead to work to invigorate our allies, discourage our enemies, and save the persuasion for people right on the edge.
But, again, I’m saying all of this to you as someone who has not figured this out.
This whole fake news phenomenon is hugely important and historically significant. At the moment I’m completely captivated by the strength of an analogy between the Gutenberg era and the internet era, this rhythmic force coming out of the connection between them. Radical reality destruction went on with the emergence of [the] printing press. In Europe this self-propelling process began, and the consensus system of reality description, the attribution of authorities, criteria for any kind of philosophical or ontological statements, were all thrown into chaos. Massive processes of disorder followed that were eventually kind of settled in this new framework, which had to acknowledge a greater degree of pluralism than had previously existed. I think we’re in the same kind of early stage of a process of absolute shattering ontological chaos that has come from the fact that the epistemological authorities have been blasted apart by the internet. Whether it’s the university system, the media, financial authorities, the publishing industry, all the basic gatekeepers and crediting agencies and systems that have maintained the epistemological hierarchies of the modern world are just coming to pieces at a speed that no one had imagined was possible. The near-term, near-future consequences are bound to be messy and unpredictable and perhaps inevitably horrible in various ways. It is a threshold phenomenon. The notion that there is a return to the previous regime of ontological stabilization seems utterly deluded. There’s an escape that’s strictly analogous to the way in which modernity escaped the ancien régime.
My tendency is not to draw a huge distinction between [scientists and artists]. In all cases one’s dealing with the formulation or flotation of certain hypotheses. I am assuming that every scientist has an implicit science fiction. We all have a default sense of what we think the world is going to be in five years’ time, even if it’s blurry or not very explicit. If we haven’t tried to do science fiction, it probably means we have a damagingly conservative, inert, unrealistic implicit future scenario. In most cases a scientist is just a bad science fiction writer and an artist, hopefully, is a better one. There is, obviously, a lot of nonlinear dynamism, in that science fiction writers have learned masses from scientists about how to hone their scenarios better, and also the other way around. Science fiction has shaped the sense of the future so much that everyone has that as background noise. The best version of the near future you have has been adopted from some science fiction writer. It has to be that science is to some extent guided by this. Science fiction provides its testing ground.
Science fiction, science fact, and all that's in between …