Couple ’graphs from Julian Bleecker, here, which manage to put fairly succinctly an argument about futuring which I first found myself trying and failing to make ten, maybe fifteen years ago:
Predicting things feels like a setup for bad behavior. It feels weird trying to anticipate what’s going to happen “next” or down the road. It’s hard to explain. It feels a bit like the phenomenon where you tell yourself something enough times with enough conviction that you are challenged to imagine anything else. Also predictions feel transactional. Someone wants that prediction in order to make a proposition bet on something happening and all of a sudden, it’s not about imagining possibilities richly, it’s about being right, whatever the heck that means. I guess it means placing the right financial bet on a possible outcome and reaping the rewards thus.
That stands on its own, but the second one manages to underscore it without pointing the finger too hard:
Statistics might be useful when bets are being placed on possible near future worlds. When one doesn’t care about the lived experiences of that world except insofar as one is attempting to place (typically financial) bets on outcomes.
The angle that always interested me was what I took to be the absurdity of rating people (not exclusively futurists) on the percentage of things they “predicted” correctly—a bugbear I probably picked up in the sf world first, thanks to the ubiquity (mostly faded now, thankfully) of such claims about Asimov, Heinlein, whoever. And it still bugs me now, I find, having started to think about it… because here’s the thing: your supposed skill or luck at predicting is only (supposedly) verifiable after the point at which the certainty of your prediction would have been useful*, and that’s true of each particular prediction as well as your predictions considered as a set (whether carefully curated for positives or not).
It baffled me for ages why anyone would want to trust in those odds, however calculated… which, with hindsight, is because I was still at that point a lot more innocent about the ideology of commerce than I thought I was. Bleecker—with whom I do not by any means agree on everything—has done a lot more time at that particular coalface; that’s presumably why he can name (or at least describe) this phenomenon so much more successfully than I can, and possibly why he’s more willing than I am to accept it as part of life’s rich tapestry.
Selah—maybe we choose our paths, maybe our paths choose us. Maybe there’s no substantive difference between those two options. But back to the main point here: you can disagree with me about the utility (or otherwise) of predictions, and you will be neither the first nor the last to do so, and I’ll shrug it off (because experience dictates that this, another dogmatic catechism of the cult of Number Go Up, is effectively impossible to argue with anyway, and because I believe—or should I say predict?—that my position will be borne out by events in the long run).
But when it comes to arguing for the nimbleness in the face of outcomes, expected or otherwise, that comes from the exercise of the imagination that Bleecker is extolling here, well, that’s a hill I’ll gladly die on—indeed, it’s the one where my flag’s been planted for a decade already.
[ * Of course, there’s a gotcha lurking in here that applies to Cassandra “shouldn’t” types such as myself, which would be fun and useful to sit down and work through at some point. Another one for the list. ]