Jon Hillis

Posted on Nov 01, 2022

Mega-cycles of technological revolutions

I’m reading Carlota Perez’s excellent book Technological Revolutions and Financial Capital after seeing it referenced in several essays by Venkatesh Rao and others. It’s a great read if you don’t take it too literally and just try to play the game of fitting patterns to her framework.

She lays out a historical narrative from 1771 to the present, walking through five “technological revolutions”. She then presents a framework for mapping each of these revolutions (loosely) onto a “technological cycle” that follows an S-curve:

[Figure: Perez’s technological cycle and its four phases, plotted along an S-curve]

This cycle has four phases: irruption, frenzy, synergy, and maturity.

Irruption phase: Revolutionary inventions appear, new technologies attract intense funding, and new industries emerge.

Frenzy phase: Speculation runs rampant as financial capital decouples from production capital (aka a “bubble”).

Synergy phase: The inequality and social/political turmoil left by the frenzy prompt new regulation and institutions; finance recouples with production and a “golden age” of coherent growth begins.

Maturity phase: Markets saturate, returns on investment decline, and new horizons are sought.
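
Perez never writes this S-curve down as an equation, but the standard way to sketch one is a logistic function. As a purely illustrative model (the notation here is mine, not hers), let D(t) be the degree of diffusion of the new technology at time t:

D(t) = 1 / (1 + e^(-k(t - t0)))

Here t0 marks the inflection point, roughly the turning point between frenzy and synergy, and k sets how steep the takeoff is; irruption and maturity sit in the flat tails of the curve, while frenzy and synergy straddle the steep middle.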

Computers are textile mills for the mind

Interestingly, Technological Revolutions and Financial Capital doesn’t have much to say about the cycle that was underway when the book was published, which Perez calls “The Age of Information and Telecommunications”. But I think it’s possible that this age is not merely a fifth cycle, but the first stage of the next mega-cycle: the Information Revolution.

The Industrial Revolution was the origin of the idea of harnessing power and machines for physical work, and the following three cycles built on that theme. The Ages of Steam/Railways, Electricity/Steel, and Oil/Car evolved this basic premise using new sources of energy, materials, and innovations. The Information Revolution is different: it’s about harnessing data & compute power for mental work. In other words, computers are textile mills for the mind (a riff on Steve Jobs’s “bicycles for the mind”).

If the Information Revolution is analogous to the Industrial Revolution (and we take Perez’s whole premise and framework at face value), it’s likely that this isn’t just another Age, but the start of a true Revolution: the type that only comes along every couple hundred years and spawns several subsequent Ages. But before we get into possible technological cycles of the future, let’s take a moment to walk through the one we’re living through and attempt to bring Perez’s fifth cycle up to date with the last twenty years of development and hindsight.

The first phase of the Information Revolution gave us an irruption of the fundamental technologies for compute, data storage, and networked communication and spawned desktop computers, supercomputers, and early prototypes of the internet. This Age birthed Silicon Valley (when it actually cared about silicon) and was defined by companies like Intel, Apple (the first time), and Microsoft.

Frenzy.com

Once people had computers, they wanted to know what to do with them. Sure, computers could do math well and had some decent software for replacing office tasks, but people mostly just wanted to talk to each other. There was something called the internet that let people interact with each other via computers, but almost no one used it. My dad, who happened to be a computational biologist in academia in the 1980s, was unsurprisingly an early adopter. He was incensed the first time he received an impersonal email from a company and told them he would never buy their products again.

Then, after several experiments with graphical, user-friendly interfaces for creating and storing linked information, Tim Berners-Lee finally hit the mark with the world wide web in 1991, and people scaled that basic idea up into, well, what we have today. The Age of the Internet and the WWW set off a frenzy of investment, a bubble of incredible proportions that suddenly burst in 2000, leaving scar tissue for a generation of entrepreneurs (as well as the seeds of future consumer software monopolies like Google, Amazon, Netflix, and Facebook).

For better or worse, the next generation did not inherit this scar tissue. Millennials, primarily in grade school during this implosion, saw less of the carnage and more of the synergistic future. I remember seeing it in the school computer lab in early 2007 as my friend and I were blown away by the keynote where Steve Jobs unveiled the iPhone.

The rise of ubiquitous computing

Technological Revolutions and Financial Capital came out in 2002, which happened to be right at the end of the Age of the Internet and WWW. Five years later, Steve Jobs showed the world the iPhone, which he jokingly introduced as three devices: “a widescreen iPod with touch controls, a revolutionary mobile phone, and an internet communicator”.

In other words: an intuitive new interface for human-computer interaction, the ability to take it everywhere, and the presumption that it will connect you more immediately to the world. It was the device that launched us into the Age of the Smartphone and the Cloud. Or, alternately, the Age of Ubiquitous Computing.

The Age of Ubiquitous Computing is built around the premise that everything should be a computer, and that your multitude of computers should be personalized and present everywhere you go. Smartphones are the killer device of ubiquitous computing, but the genre includes hardware ranging from smart-speakers to smart-fridges. In the same way the previous cycle’s products were often prefixed with e-, this cycle’s are often prefixed with smart- (though ironically, the actual “smart” part doesn’t really come until later). Computers have become so deeply integrated with our lives that we’ve mostly stopped calling them computers.

One of the best ways to spot transitions is to watch how previously revolutionary technologies become infrastructure for the next phase. With the smartphone, the internet and the microchip disappeared from public view and became infra layers. Remember when there used to be an “Intel Inside” sticker right on most computers? When was the last time you actually typed out https://www?

When you assume everyone is always carrying around at least one networked computer loaded with built-in sensors, you start building information technology differently. To maximize device capabilities, you end up storing most of the data off-device in server farms & updating the software remotely. Mobile social networks became dominant by providing an always-available audience and easy-to-generate content. Smartphones allow anyone to get paid for a single unit of work, as small as delivering a meal, whenever they want. They also allow for a level of personalized advertising and digital surveillance unimaginable a generation ago. Ultimately, they have opened a gaping fissure in our social, political, and economic systems.

Getting to the golden age

So that brings us up to the present: an era of mass deployment of computers, the internet, and the uncountable inventions they have spawned. A period of great wealth inequality and new winners and losers in the global economic game. A time of immense social upheaval and change as software eats the world and we don’t always like what comes out the other end.

Massive fortunes and new social orders have been built, but we are still missing what Perez describes as “a systemic articulation of the new regulatory framework and of the appropriate institutions”. It seems about right to imagine the next decade as a period of financial, institutional, and regulatory recovery; our financial systems, institutions, and regulatory bodies could sure use it right now.

What then, would a “golden age” of this cycle look like? Technologically, one natural evolution would be towards more abstract and direct human-computer interfaces, increased sensor omnipresence, and an expansion of cloud-computing-as-a-service infrastructure. The most tangible consumer product at the intersection of those three trends is virtual assistants. Alexa, Siri, and OK Google are almost the platonic ideal of ubiquitous computing: you can simply talk out loud and a computer is waiting to listen to you and search for information.

Of course, this level of omnipresence comes with clear tradeoffs in privacy, personalized advertising, and data surveillance—all of which will need new regulatory frameworks in the coming decades. Beyond privacy, we are already seeing the first signs of rethinking a wide range of government policies: gig employment, healthcare rights, monetary policy, drone warfare, etc.

Our institutions are evolving too, often in painful, transitionary ways. Political movements build their followings on social media and the president uses Twitter as the bully pulpit. Truth has been replaced by truthiness. Legacy corporations are being replaced by startups, university credentialism is failing, our social safety net is full of holes, the planet is warming, democracy is struggling, and America isn’t looking Great.

It’s hard to predict how we make it through this period of “institutional adjustment”, but it’s easy to imagine a few optimistic outcomes: politicians who engage in interactive policy-building with constituents, flexible and rewarding on-demand employment and education options for everyone, the re-establishment of scientific credibility, and solutions to climate change. Something to look forward to in the 2020s and 2030s.

A meta-mapping of mega-cycles?

If the deployment of computing, the internet, and the remaining untapped ways they will transform our lives dominates the technological and social landscape through sometime in the 2030s, what comes next? Because the seeds of the next cycle are planted right about this point in the current cycle, it’s probably a good time to be asking that question.

Of course, we should ask this question with the humility that we are doing nothing more than playing a fun guessing game, and that both the predictions we make and the mappings we create to Perez’s cycle are less forecasts and more trying things on for size. With that disclaimer out of the way: it occurred to me as I read about the phases of Perez’s cycle that they bore a compelling resemblance to the first four historical technological cycles that she describes in the book (slightly paraphrased for simplicity):

  1. The Industrial Revolution — Irruption of mechanically powered inventions

  2. Age of Steam and Railways — Frenzy as the world connected

  3. Age of Electricity and Steel — Synergy of power and materials

  4. Age of Oil and the Car — Maturity of the mechanical revolution

To generalize this idea (and probably massively overextend this framework): the ~50-year cycles or “Ages” that Perez analyzes are each part of a larger, ~200-year mega-cycle, like the Industrial Revolution or the Information Revolution, that follows the same four-part cyclical pattern.

The rest of the Information Revolution

Perez’s first cycle, the Industrial Revolution, gave way to the second: the Age of Steam and Railways. In many ways, this second cycle was a natural extension of the more foundational first cycle. We took the idea of water power generating physical work and evolved it to its logical conclusion: steam powering trains. So what’s the natural extension of ubiquitous computing and a near-omnipresent-and-prescient internet?

It’s a heck of a lot easier to design post-hoc narratives about these cycles than to understand them as they happen, but one obvious candidate for the next cycle is some form of artificial intelligence. The term artificial intelligence (especially when capitalized or abbreviated) is so thoroughly overused as to render it a poor device for analysis, so let’s just call it the Age of Intelligent Computing. In the spirit of the Turing test, I literally mean something that a human would describe as intelligent. While current virtual assistants play some clever tricks with voice-to-text translation and natural language processing, it’s frustratingly easy to ask Alexa a question that’s out of its depth.

But cycles don’t start with fully integrated consumer products, they start with proof-of-concept inventions and infrastructure. And while there’s a long and storied history of making big claims about “the future of AI” that never live up to the hype and lead to disillusionment, I think we’re seeing some compelling proofs-of-concept and signs of developing infrastructure.

Machine learning has been applied with particularly strong results in areas like computer vision, natural language processing and translation, and forecasting. The killer product of this era could be self-driving cars with Level 5 autonomy. Experts and forecasters tend to put this at about the right spot in the cycle for intelligent computing to go mainstream, with prediction confidence ranges hovering around 2050.

By the latter part of this cycle, we would presumably see AI deployed as a comprehensive set of massive, universal consumer and enterprise products. At that point, humans would no longer be useful for many current jobs, but as always, we would figure out how to create new ones. It’s not hard to imagine the social, economic, and political strife that (truly) smart computers will cause if we ever make them. As we try to overcome these challenges, one wild solution to the future of job retraining could be giving everyone their own AI from birth, one that teaches them and learns from them, developing a human-like relationship. The Young Lady’s Illustrated Primer from Neal Stephenson’s The Diamond Age provides a good sci-fi representation of the type of product that could define the maturity phase of the Age of Intelligent Computing. On the other hand, the dark side of this type of future could end up with personal intelligent computing assistants that are more like slaves than teachers or friends, but that’s a societal debate we might be able to leave for the 2070s.

That leaves two more Ages in the Information Revolution: the synergy and maturity cycles, the ones that map to Electricity/Steel and Oil/Car. Biotechnology (particularly the use of intelligent computing to program DNA) seems like a possible path, but at this point we’ve probably gotten beyond the point of any coherent speculation. I’ll leave it to your imagination.


[originally published Jun 30, 2020]