Tag Archives: epistemology

the conditions of credibility

Steven Shapin, with an STS perspective on “post-truth” at LARB:

The problem we confront is better described not as too little science in public culture but as too much. Given the absurdities and errors abroad in the land, it may seem crazy to say this, yet the point can be pressed. Consider, again, the climate change deniers, the anti-vaxxers, and the creationists. They’re wrong-headed of course, but, like the Moon-landing deniers and the Flat-Earthers, their rejection of Right Thinking is not delivered as anti-science. Instead, it comes garnished with the supposed facts, theories, approved methods, and postures of objectivity and disinterestedness associated with genuine science. Wrong-headedness often advertises its embrace of officially cherished scientific values — skepticism, disinterestedness, universalism, the distinction between secure facts and provisional theories — and frequently does so more vigorously than the science rejected. The deniers’ notion of science sometimes seems, so to speak, hyperscientific, more royalist than the king. And, if you want examples of hyperscientific tendencies in so-called pseudoscience, there are now sensitive studies of the biblical astronomy craze instigated in the 1950s by the psychiatrist Immanuel Velikovsky, or you can consider the meticulous methodological attentiveness of parapsychology, or you can reflect on why it might be that students of the human sciences are deluged with lessons on The Scientific Method while chemists and geologists are typically content with mastering just the various methods of their specialties. The Truth-Deniers find scientific facts and theories shamefully ignored by the elites; they embrace conceptions of a coherent, stable, and effective Scientific Method that the elites are said to violate; they insist on the necessity of radical scientific skepticism, universal replication, and openness to alternative views that the elites contravene. On those criteria, who’s really anti-scientific? Who are the real Truth-Deniers?

[…]

When science becomes so extensively bonded with power and profit, its conditions of credibility look more and more like those of the institutions in which it has been enfolded. Its problems are their problems. Business is not in the business of Truth; it is in the business of business. So why should we expect the science embedded within business to have a straightforward entitlement to the notion of Truth? The same question applies to the science embedded in the State’s exercise of power. Knowledge speaks through institutions; it is embedded in the everyday practices of social life; and if the institutions and the everyday practices are in trouble, so too is their knowledge. Given the relationship between the order of knowledge and the order of society, it’s no surprise that the other Big Thing now widely said to be in Crisis is liberal democracy. The Hobbesian Cui bono? question (Who benefits?) is generally thought pertinent to statecraft and commerce, so why shouldn’t there be dispute over scientific deliverances emerging, and thought to emerge, from government, business, and institutions advertising their relationship to them?

A chewy report from the trenches of epistemology. Go read it all.

The Greimas square-dance

More KSR on anti-anti-utopianism, this time at Commune Magazine:

Clearly we enter here the realm of the ideological; but we’ve been there all along. Althusser’s definition of ideology, as the imaginary relationship to our real conditions of existence, is very useful here, as everywhere. We all have ideologies, they are a necessary part of cognition, we would be disabled without them. So the question becomes, which ideology? People choose, even if they do not choose under conditions of their own making. Here, remembering that science too is an ideology, I would suggest that science is the strongest ideology for estimating what’s physically possible to do or not do. Science is AI, so to speak, in that the vast artificial intelligence that is science knows more than any individual can know—Marx called this distributed knowing “the general intellect”—and it continually reiterates and refines what it asserts, in an ongoing recursive project of self-improvement.

That’s the dovetail I didn’t know I was looking for, connecting to this recent NYT longread on Oncle Latour:

Crowded into the little concrete room, we were seeing gravity as Latour had always seen it — not as the thing in itself, nor as a mental representation, but as scientific technology allowed us to see it. This, in Latour’s view, was the only way it could be seen. Gravity, he has argued time and again, was created and made visible by the labor and expertise of scientists, the government funding that paid for their education, the electricity that powered up the sluggish computer, the truck that transported the gravimeter to the mountaintop, the geophysicists who translated its readings into calculations and legible diagrams, and so on. Without this network, the invisible waves would remain lost to our senses.

Consider the possibility

I’ve spent more time than I’d like to admit hanging around the online communities of the kind of people we are worried about reaching here, and I am here to tell you: They are using their critical thinking skills.

They are fully literate in concepts like bias and in the importance of interrogating sources. They believe very much in the power of persuasion and the dangers in propaganda and a great many of them believe that we are the ones who have been behaving uncritically and who have been duped. They think that we are the unbelieving victims of fraud.

Which is not to set up some kind of false equivalency between sides. But I do want us to consider the possibility that we don’t need to talk across that barrier, and that it might not be possible to talk across it. That we need to consider that if it’s true that vast swaths of the voting populace are unbelieving victims of fraud, that there’s not much we can do for them. That we may need instead to work to invigorate our allies, discourage our enemies, and save the persuasion for people right on the edge.

But, again, I’m saying all of this to you as someone who has not figured this out.

Tim Maly.

A threshold phenomenon

This whole fake news phenomenon is hugely important and historically significant. At the moment I’m completely captivated by the strength of an analogy between the Gutenberg era and the internet era, this rhythmic force coming out of the connection between them. Radical reality destruction went on with the emergence of [the] printing press. In Europe this self-propelling process began, and the consensus system of reality description, the attribution of authorities, criteria for any kind of philosophical or ontological statements, were all thrown into chaos. Massive processes of disorder followed that were eventually kind of settled in this new framework, which had to acknowledge a greater degree of pluralism than had previously existed. I think we’re in the same kind of early stage of a process of absolute shattering ontological chaos that has come from the fact that the epistemological authorities have been blasted apart by the internet. Whether it’s the university system, the media, financial authorities, the publishing industry, all the basic gatekeepers and crediting agencies and systems that have maintained the epistemological hierarchies of the modern world are just coming to pieces at a speed that no one had imagined was possible. The near-term, near-future consequences are bound to be messy and unpredictable and perhaps inevitably horrible in various ways. It is a threshold phenomenon. The notion that there is a return to the previous regime of ontological stabilization seems utterly deluded. There’s an escape that’s strictly analogous to the way in which modernity escaped the ancien régime.

Also:

My tendency is not to draw a huge distinction between [scientists and artists]. In all cases one’s dealing with the formulation or flotation of certain hypotheses. I am assuming that every scientist has an implicit science fiction. We all have a default of what we think the world is going to be in five years’ time, even if it’s blurry or not very explicit. If we haven’t tried to do science fiction, it probably means we have a damagingly conservative, inert, unrealistic implicit future scenario. In most cases a scientist is just a bad science fiction writer and an artist, hopefully, is a better one. There is, obviously, a lot of nonlinear dynamism, in that science fiction writers have learned masses from scientists about how to hone their scenarios better, and also the other way around. Science fiction has shaped the sense of the future so much that everyone has that as background noise. The best version of the near future you have has been adopted from some science fiction writer. It has to be that science is to some extent guided by this. Science fiction provides its testing ground.

Nick Land.