Well, I’ve done it. I’ve burned out my backlog. Scribbled this out on the plane and cleaned it up afterwards; hopefully it makes sense.

Futures (By Rate-of-Change)

Before moving on to a new sequence of ideas, I wanted to play with some other notes I have about views of futurism.

In this post, I wanted to scribble out a few other kinds of futures (very broadly speaking!) just to help myself (and you, my poor reader!) zoom back out from Spengler’s cramped, grey, uniquely German space.

The central metric that I tried to order these futures by is the “rate of technological change”, and whether that change is increasing or decreasing human productivity.

Here’s my spectrum of futures by productivity (a toy sketch of these regimes follows the list):

Apocalyptic: Civilization is crippled irreparably. Productivity is reduced to near-zero over the long term (centuries).

Declinist: Standard of living declines over the long term. Productivity is greatly reduced, and collective human effort returns to more fundamental Maslowian needs.

Hackstable (Incrementalist): (“Hackstable” stolen from Venkatesh Rao.) A familiar rate of change: small adjustments evolve kludgy systems to meet local needs. Productivity stays roughly the same, or, where it is increasing or decreasing, its effectiveness relative to its goals stays roughly constant.

Accelerationist: The rate of measurable improvement to quality of life continues or increases. Human productivity flourishes.

Singularitarian: Recognizable civilization is outmoded in a jump of productivity. Rate of change is vastly increased and ever-increasing.
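
To make the ordering concrete, here’s a toy sketch in Python (my own illustration, nobody’s formal model) treating each archetype as a growth-rate regime for a crude, unitless productivity index. All of the rate constants are invented; only the ordering of the outcomes is the point.

```python
# Toy sketch: each future archetype as a growth-rate regime for a crude,
# unitless "productivity index". All numbers are arbitrary illustrations.

def trajectory(regime, P=100.0, years=25):
    """Project the productivity index P forward under one regime."""
    rate = {"apocalyptic": -0.50,     # collapse toward near-zero
            "declinist": -0.02,       # slow erosion back toward basic needs
            "hackstable": 0.00,       # roughly flat; local tweaks cancel out
            "accelerationist": 0.03,  # steady compounding improvement
            "singularitarian": 0.03}[regime]
    for _ in range(years):
        if regime == "singularitarian":
            rate *= 1.5               # the rate of change itself keeps rising
        P *= 1 + rate
    return P

for regime in ("apocalyptic", "declinist", "hackstable",
               "accelerationist", "singularitarian"):
    print(f"{regime:>16}: {trajectory(regime):>14.1f}")
```

Run it and the five endpoints land in exactly the order of the list above: near-zero, eroded, flat, roughly doubled, and absurdly large.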

 

A Survey of Futures

Not a complete census of attitudes towards futures, obviously, just a starting point. In rough order from lowest to highest productivity/rate of technological proliferation.

 

Abrupt Collapse: Truly apocalyptic. Sudden large-scale crisis, by technology or by mega-scale natural disaster. There are many existential risks to civilization/humanity/Earth championed by many different people; I couldn’t find a single prophet for this school. Existential risk itself is an interesting field (note: this article is the one that hooked me on Aeon).

Systemic/Environmental Failure: Apocalyptic/Declinist. Rolling crises brought about by drastic system changes that beat the global system down- a global water crisis, a food production/distribution crisis, catastrophic jumps in climate, a high-impact pandemic, global economic collapse. The decay of civilizational gatekeeper institutions. Generally, when one of these things actually happens, it does not by itself initiate immediate apocalypse; an unfortunate coupling of massive crises, however, may be vastly catastrophic.

Catabolic Collapse (Greer): Pure, Spenglerian Decline. How Civilizations Fall: A Theory of Catabolic Collapse describes it all succinctly. Very slow, long-term (over centuries!) but inevitable decline. There is a glimmer of hope in the likely births of new civilizations, but without easily accessible fossil fuels an equivalently high-energy society would be difficult to achieve again for eons.

Stagnated Future (Cowen): Short-term Declinist, longer-term optimist. “The Great Stagnation” is the canonical text. Max Levchin and Peter Thiel are grouped into this school in my view- even though they diagnose the problem differently, they come to the same conclusion: that for whatever reason we have experienced a dip in innovation- a repairable dip, ultimately, but an empirical dip nonetheless. Human productivity has largely increased regardless, because old ideas are still being re-appropriated and people are still working somewhat productively within these old paradigms.

“Kludgeocratic” Future (Teles): On the Hackstable/Declinist border. I’ve talked about Kludge at length, drawing from Steven Teles’ work. In many circles I’m familiar with, this is fretted over as a default future. In a Kludgeocratic future we continue to make small, incremental changes to large-scale political and technological systems without making fundamental changes, even as hidden technical debts and obscured backdoors make the system more and more fragile. We continue playing civilizational Jenga for as long as we can [presumably until the complexity of the system crushes us all with its incomprehensibility- the desperate hope being that Skynet will learn to juggle this system for us?].

“Normal”/“Uncool” Future (Rao, Taleb): Hackstable. A future of incremental improvement, where tweaking allows local needs to be met without much fanfare or a major revolution in the underlying systems. The major texts on the matter: “Hacking the Nondisposable Planet” defines ‘hackstable’. “Welcome to the Future Nauseous” describes a world where radical new technologies hide their radicalness in the user’s experience, and introduces the idea of the “Manufactured Normalcy Field”- this allowance is what separates the “normal” future from the unstable “Kludgeocratic” future. Nassim Taleb’s “The Future Will Not Be Cool” suggests even further that new technology might simply mitigate older technology in our user experience, returning us to comfortable, more ancient/natural patterns of behavior while still quietly delivering new benefits (this isn’t even his main point, but it’s an uncommon idea that comes across).

Classic Projection: Accelerationist. The default folk future of our culture: our current aesthetic, exaggerated further. Most utopias and dystopias feature recognizable governments and cultures, character behaviors, and technologies that directly descend from our own. The real noticeable difference is usually that the economic rules have changed- technologies that are possible but economically unfeasible today are now feasible. Twenty years in the future is more different from today than today is from twenty years in the past- enough to produce Future Nausea but not Future Shock.

Accelerated Change (Kurzweil): On the Accelerationist/Singularitarian divide depending on strength. Perhaps the most popular and least directed of the Singularitarian schools. “Our intuitions about change are linear; we expect roughly as much change as has occurred in the past over our own lifetimes. But technological change feeds on itself, and therefore accelerates. Change today is faster than it was 500 years ago, which in turn is faster than it was 5000 years ago. Our recent past is not a reliable guide to how much change we should expect in the future.”
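
A hedged illustration of the linear-intuition point (my numbers, not Kurzweil’s): extrapolating the change we remember gives a straight line, while change that feeds on itself compounds.

```python
# Linear intuition vs. change that feeds on itself (illustrative numbers only).
# "change" here is a unitless index of technological capability per decade.

past_change = 10.0  # the change we remember from the last decade

# Linear intuition: each of the next five decades looks like the last one.
linear = [past_change * d for d in range(1, 6)]

# Compounding change: each decade's change builds on everything before it.
compounding, total = [], 0.0
for _ in range(5):
    total = total * 1.8 + past_change  # feedback: change breeds more change
    compounding.append(round(total))

print("linear     :", linear)       # [10.0, 20.0, 30.0, 40.0, 50.0]
print("compounding:", compounding)  # [10, 28, 60, 119, 224]
```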

Intelligence Explosion (Yudkowsky): Singularitarian. “Intelligence has always been the source of technology. If technology can significantly improve on human intelligence – create minds smarter than the smartest existing humans – then this closes the loop and creates a positive feedback cycle. What would humans with brain-computer interfaces do with their augmented intelligence? One good bet is that they’d design the next generation of brain-computer interfaces. Intelligence enhancement is a classic tipping point; the smarter you get, the more intelligence you can apply to making yourself even smarter.”
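
And a toy model of the closed loop itself (my own sketch, not Yudkowsky’s math): each generation of minds designs the next, and the size of the improvement scales with how smart the designers already are. Below a threshold the loop never closes; above it, every step is bigger than the last.

```python
# Recursive self-improvement as a tipping point (all constants invented).

def explosion(iq, generations=10, threshold=1.0):
    """Each generation redesigns the next; gains scale with current iq."""
    history = [iq]
    for _ in range(generations):
        gain = 0.2 * (iq - threshold)  # smarter designers find bigger gains
        if gain <= 0:
            break                      # below threshold: the loop never closes
        iq += gain
        history.append(round(iq, 3))
    return history

print(explosion(0.9))  # [0.9] -- no takeoff
print(explosion(1.1))  # gains accelerate: each improvement exceeds the last
```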

Event Horizon (Vinge): Singularitarian. The original “Singularity” concept. “For the last hundred thousand years, humans have been the smartest intelligences on the planet. All our social and technological progress was produced by human brains. Shortly, technology will advance to the point of improving on human intelligence (brain-computer interfaces, Artificial Intelligence). This will create a future that is weirder by far than most science fiction, a difference-in-kind that goes beyond amazing shiny gadgets. […] the future after the creation of smarter-than-human intelligence is absolutely unpredictable.”