Futures can lie.
Because portrayals of the future are usually full of science-y things, we are inclined to see them as objective — as the product of something like the scientific method — instead of the marketing or political persuasion efforts they often are.
Making compelling futures is a difficult art to master. It demands all sorts of horizon-scanning, systemization and storytelling skills. No one with these hard-won skills uses them to weave visions of the possible without an agenda.
Anytime you’re presented with a future (or set of futures), it’s worth asking “What am I being asked to see, what am I being asked to un-see and who is being served here?”
My point is not that we ought to “politicize futurism.” It’s that futurism is inherently political, and has been from its first days. Futurism has always been used to push political and economic agendas. Only now, with a century of futurism behind us, many of those agendas are so taken for granted — so frequently woven into the visions of tomorrow that surround us — that they’re invisible to us.
And in democracies, hidden agendas are always the most pernicious. This is doubly the case when they’ve appropriated the mantle of scientific and technical authority.
More dangerous still is what the hidden agendas of futurism do to our societal ability to anticipate change. Headed into what is beyond doubt a period of tumultuous upheaval, we need good cultural and political understandings of the systems and processes at work. Yet our tools and institutions of foresight are almost all riddled with assumptions that are in many cases more than a century old and that (despite their robot-chrome radical gloss) serve the current political and economic structures. Our future, as I've said, is a thing of the past.
If it is, as Whitehead said, the business of the future to be dangerous, what does it say that so much futurism threatens the status quo so little?