
Why the next 20 years will see a lot less technological disruption than the past 20


“The internet is still at the beginning of its beginning,” writes Wired co-founder and Silicon Valley guru Kevin Kelly in his new book The Inevitable. Kelly argues that adding machine intelligence to everyday objects — a process he calls “cognifying” — “would be hundreds of times more disruptive to our lives than the transformations gained by industrialization.”

Is he right?

Kelly’s extreme optimism represents one pole of this debate. At the opposite pole is economist Robert Gordon, who believes the IT revolution is basically over. In his new book The Rise and Fall of American Growth, Gordon documents the dramatic economic changes of the 20th century — electricity, cars, indoor plumbing, antibiotics — and predicts that nothing of that scale is on the horizon.

“The economic revolution of 1870 to 1970 was unique in human history, unrepeatable because so many of its achievements could happen only once,” Gordon writes. He acknowledges that since 1970 there have been big improvements in televisions, smartphones, and other information technologies. But he argues that many other aspects of the economy — food, clothing, transportation, health care, and so forth — have changed little since the 1970s and are unlikely to change much in the next couple of decades.

Kelly and Gordon don’t just have opposite predictions about the future — they represent opposite approaches to thinking about an uncertain future. Gordon has difficulty imagining how computers could continue to transform our lives, so he assumes they won’t. Kelly’s life has been so transformed by computers that he can’t imagine how anyone doesn’t see their continued potential.

Reality, of course, is likely to be somewhere between these extreme views. It’s hard to believe that the IT revolution will be “hundreds of times more disruptive” than the Industrial Revolution — or even to figure out what that would mean. At the same time, Gordon is too cavalier about dismissing technologies like self-driving cars that really do seem poised to have a big social and economic impact.

But ultimately, I think Gordon gets closer to the mark than Kelly does. Kelly has spent his career in Silicon Valley, the place that has reaped the biggest gains from the exponential improvements in computing power. So it’s natural to assume that any problem can be solved — and any industry can be disrupted, or at least wildly improved — if we just bring enough computing power to bear.

He even has a sort of rallying cry for his perspective. “Who knows? But it will come!” The line is tucked into a chapter where Kelly tries to imagine different goods and services after they’ve been “cognified” by computers. “Cognified knitting” is one of the possibilities he imagines. What does that mean? “Who knows?” Kelly writes. “But it will come!”

Indeed, “Who knows? But it will come!” has become the de facto rallying cry for a lot of recent Silicon Valley innovations with more hype than obvious applications. It grows out of a foundational experience: watching the internet get underestimated when it was new, then mocked after the tech bubble popped, only to change the world soon thereafter.

But it-will-come-ism has fallen flat in recent years, and I think it’s going to keep falling flat in the years to come. There are a number of industries — with health care and education being the most important — where there’s an inherent limit on how much value information technology can add, because in these industries the main thing you’re buying is a relationship with other human beings, and that can’t be automated.

Technologists are very optimistic about the power of software

Blockbuster didn’t think it would get disrupted. It was wrong.
Photo by David Friedman/Getty Images

It’s not surprising that Silicon Valley — a place that grew rich and powerful by building the internet economy — is full of technology optimists. Silicon Valley elites aren’t just used to changing the world with software. They’re used to being underestimated as they do it.

When I interviewed venture capitalist Marc Andreessen a couple of years ago, for example, he told me about his experience as a young startup founder in the early 1990s trying to convince big companies to take the internet seriously. Established CEOs laughed at him when he argued that the internet could disrupt industries like music, retail, media, and telecommunications.

Obviously, no one is laughing now.

Many tech elites believe history is about to repeat itself, only on a much larger scale. Andreessen made the case in a 2011 article called “Why software is eating the world.”

Until that point, the internet had mostly disrupted businesses that dealt in information. Now, Andreessen argued, the tech sector was coming for the rest of the economy.

In 2011, it seemed like the signs of this second digital revolution were sprouting up all over. Airbnb was widely seen as a pioneer of a new “sharing economy,” with Uber and Lyft announcing ride-hailing services in 2012. 3D printing seemed poised to render conventional manufacturing obsolete. The “internet of things” promised to embed cheap, tiny wifi-connected computers in everyday objects.

Startups like Khan Academy and Udacity were promising to revolutionize the education market with online classes. Bitcoin seemed to offer a new digital foundation for the financial sector. An IBM supercomputer called Watson beat the world’s best Jeopardy players, and IBM vowed to apply the technology to medical diagnosis.

It all seemed like it could be a really big deal. The industries the internet has disrupted so far — music, news, mapmaking — add up to a relatively small fraction of the overall economy. If digital technology can disrupt sectors like health care, education, manufacturing, finance, and government, the economic benefits could be massive.

Kevin Kelly thinks this future is right around the corner. “70 percent of today’s occupations will be replaced by automation,” he writes. “Most of the important technologies that will dominate life 30 years from now have not yet been invented.”

Until recently, I was very sympathetic to this point of view. But I’ve become more skeptical. One thing that started to change my mind was reporting on (and then buying a house using) the real estate startup Redfin.

In its early years, Redfin seemed like exactly the kind of disruptive startup Andreessen and Kelly expected to transform the conventional economy. In a 2007 interview on 60 Minutes, Redfin CEO Glenn Kelman described real estate as “by far the most screwed-up industry in America” and vowed to do for it what Amazon had done for bookselling and eBay had done for the sale of collectibles.

Back then, Redfin was charging homebuyers about a third of what a conventional real estate agent would charge for buying a house. To turn a profit, Redfin had to offer customers much less access to human real estate agents.

But customers hated this early, bare-bones product. For most of them, a personal relationship with a human agent was the main attraction of hiring a real estate firm. Facing the biggest purchasing decision of their lives, customers wanted someone available to answer questions and walk them through the steps of the home-buying process.

So over the past decade, Redfin has hired more agents and dramatically raised its fees. Today, the company looks more like a conventional real estate agency — albeit an unusually tech-savvy one — than the scrappy, disruptive startup Kelman led a decade ago. The real estate business wasn’t as ripe for disruption as Kelman thought.

What real economic progress looks like

The Model T, one of history’s most disruptive innovations.
Photo by Jeff J Mitchell/Getty Images

Even if Kelman’s original vision had been more successful, Redfin would still have represented only an incremental improvement over conventional real estate services. It would have been cheaper and perhaps a bit more convenient, but it wouldn’t have fundamentally changed the process of buying a home.

And the same is true of a lot of recent startups that have aimed to disrupt conventional industries. Food delivery startups make it more convenient to order takeout. Uber and Lyft streamline the process of calling a cab. Zenefits provides a cheaper way for small businesses to manage payroll. Even Amazon mostly provides a cheaper and more convenient alternative to driving to the mall.

These are all perfectly good business ideas. The internet is creating lots of opportunities to squeeze inefficiencies out of the system. But Gordon’s book reminds us that this isn’t what a real technological revolution looks like.

The world inhabited by a typical American family in 1900 looked radically different from today’s world. Automobiles were expensive toys for the wealthy. Traveling from New York to Los Angeles required a train and took several days.

Washing machines, refrigerators, dishwashers, and vacuum cleaners were still in the future, making housework a back-breaking full-time job. Electric lighting was out of reach for most families, so they had to rely on dim and dangerous candles or kerosene lamps — and most simply didn’t try to do very much after dark.

Most homes lacked running water and flush toilets, leading to recurrent sanitation problems. And with no antibiotics and few vaccines, it was common for families to lose young children to infectious diseases.

By 1960, in contrast, a typical American family enjoyed a lifestyle that would be familiar to us today. Running water, flush toilets, electric lighting, cars, refrigerators, and washing machines were all commonplace. Deaths from infectious diseases like influenza, pneumonia, and polio were plunging. Ubiquitous cars and newly developed freeways meant that you could drive across town about as quickly as you can today (maybe faster at rush hour), and newly invented passenger jets could fly from New York to Los Angeles in five hours.

The rapid progress of the early 20th century depended on two factors. One was a series of technical breakthroughs in science, engineering, and medicine. But the other was the fact that in 1900, the human race had a bunch of big problems — dimly lit homes, slow transportation options, deadly diseases, a lot of tedious housework — that could be solved with new technologies.

Why health care and education are different

The situation today is different. America hasn’t completely conquered material wants, of course. There are still tens of millions of people — far too many — who struggle to afford the basics.

But unlike in the early 20th century, these Americans represent a minority of the population. Among the affluent majority, food is cheap and easy to buy, closet space is scarcer than clothes, refrigerators and washing machines are ubiquitous, and there are often as many cars in the driveway as adults in the house.

Instead, these more affluent Americans have a different set of worries. Can I get a house in a good school district? Can I afford to pay for child care? Can I afford health insurance coverage? Will I be able to send my kids to college?

Indeed, the trend can be seen graphically in this chart:

[Chart: How prices of various goods and services have changed relative to the overall price level. Timothy B. Lee / Vox]

The chart shows how various goods and services have changed in price relative to the overall price level. Over the past four decades, manufactured products like clothing, toys, cars, and furniture have gotten more affordable. At the same time, services like education and medical care have gotten a lot more expensive.
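
To make that arithmetic concrete, here’s a minimal sketch in Python using made-up numbers (not the data behind the chart): a category’s “relative” price change is just its nominal price change deflated by the change in the overall price level.

    # Minimal sketch of "price change relative to the overall price level."
    # The numbers below are illustrative, not the data behind the chart above.

    cumulative_change = {
        "televisions": -0.80,     # 80% cheaper in nominal terms (made up)
        "college tuition": 4.00,  # 400% more expensive in nominal terms (made up)
    }
    overall_inflation = 1.50      # overall price level up 150% (made up)

    for category, change in cumulative_change.items():
        # Deflate by overall inflation: a category that merely kept pace comes out at 0%.
        relative = (1 + change) / (1 + overall_inflation) - 1
        print(f"{category}: {relative:+.0%} relative to the overall price level")

    # With these made-up numbers:
    #   televisions: -92% relative to the overall price level
    #   college tuition: +100% relative to the overall price level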

These trends are connected. Technological progress (and increased trade with low-income countries like China) has pushed down the cost of manufactured goods. But families don’t need an infinite number of televisions, cars, or clothes. So they’ve taken the savings and plowed them into other sectors of the economy — sectors where technology can’t easily boost output. With demand often outstripping supply, the result has been rising tuition, rising housing costs in trendy neighborhoods, rising child care costs, and so forth.

At this point, you might object that this just shows the need for disruptive innovation in education and child care. After all, aren’t startups like Udacity and Khan Academy working to create online learning alternatives?

But disrupting the education industry will be hard for the same kind of reasons it was hard for Redfin to disrupt the real estate business. Udacity aims to streamline education by reducing the number of hours staffers spend grading papers, answering student questions, and so forth. But from the student’s perspective, time talking to professors, TAs, and administrators isn’t wasted — it’s an important part of the educational experience.

Much of the value people get from attending a four-year college comes from interaction with other people. People spend their college years forming a circle of friends and a network of acquaintances that often become invaluable later in their careers. They gain value from group study and extracurricular activities. There is real benefit from mentorship by professors.

The social experience of college also serves as a powerful motivator. One early, free Udacity course, for example, had a dismal 4 percent completion rate. It’s hard to motivate yourself to study hard when you’re only interacting with a computer program. Having human instructors regularly check assignments — and having classmates to compare grades with — is core to students’ success, especially for less disciplined students. So parents who can afford to send their children to a conventional college are unlikely to choose a cut-rate online university instead.

The same point applies to health care. Even if AI gets better at diagnosing diseases, people are still going to want a human doctor around to answer questions about the diagnosis and possible treatment options, to make sure the patient’s overall treatment stays on track, and to provide a comforting bedside manner. And they’re going to want a human nurse — not a robot — to tend to their needs during their hospital stay.

None of this is to say that technology can’t add value to industries like education and health care. Technology is likely to serve as a complement to these conventional services: better software for scheduling, diagnostics, and so forth will help doctors get better at their jobs. And technology may also provide affordable alternatives for people who don’t have access to a traditional university or hospital.

But it’s unlikely that today’s schools and hospitals are headed for the fate of Borders or Kodak.

“Who knows? But it will come!”

Home 3D printers are an impressive technology with few practical applications.
Photo by Justin Sullivan/Getty Images

So as long as technologists are merely finding ways to make it modestly cheaper or more convenient to do things people have been doing for decades, their impact on the overall economy will necessarily be modest. To have a bigger impact, they need to invent broad new product categories — which means finding big, unmet needs that can be addressed by new inventions.

Self-driving cars could be one example. Robert Gordon is dismissive of this emerging technology, and I think that’s a mistake. Autonomous vehicles will not only make people’s morning commutes more convenient, but they also have the potential to revolutionize the retail sector, change how people plan cities, and more.

Also, of course, in recent years we’ve seen a steady progression of entertainment and communications gadgets: DVD players, video game consoles, smartphones, and now VR headsets.

But outside the worlds of entertainment and communication, it’s hard to think of major new products in the recent past or likely in the near future. And Kelly doesn’t offer any plausible examples of big breakthroughs that could be on the horizon.

The one chapter of his book that focuses on this question is called “Cognifying.” It argues that embedding computer chips into everyday objects could dramatically change our lives. Yet his examples are not impressive:

  • “Cognified laundry,” he says, could embed computer chips in your clothes, allowing your washing machine to automatically choose the right settings.
  • “Cognified real estate” would “match buyers and sellers via an AI.” It’s not clear why consumers would like this better than the original version of Redfin.
  • “Cognified nursing” would outfit patients with sensors to allow customized care.
  • “Cognified knitting.” What does that mean? “Who knows?” Kelly writes. “But it will come!”

We’ve seen the same faith in the transformational potential of computing power at work in discussions of other recent innovations that have generated a lot of media buzz but haven’t been big hits with consumers. For example, there’s been a lot of discussion about the “internet of things,” an effort to embed computer chips in everything from thermostats to light bulbs. These products have been on the market for about five years, yet it’s hard to think of a case where the addition of computer chips and software has added a lot of value.

You can say the same about home 3D printing. 3D printers have carved out an important but still niche application for prototyping in industrial labs and universities. But early predictions that 3D printers would become standard equipment in people’s homes have not been borne out.

Similarly, when Bitcoin burst into the mainstream in 2013, there was a lot of speculation (including from me) that it could disrupt the financial sector. But three years (and more than $1 billion in venture capital investments) later, we seem to be no closer to finding practical applications for the technology.

Of course, it would be a mistake to say that because these technologies haven’t produced much value yet, they won’t do so in the future. As technologists are quick to point out, people underestimated revolutionary technologies like PCs and the internet in their early years too.

But the fact that so many of these efforts seem to be falling short of expectations makes me skeptical of the view that computing power will inevitably transform every sector of the economy. Computers have proven that they’re great at transforming industries — music, news, maps, phone calls, and so forth — that are fundamentally about collecting, processing, and distributing information.

But software companies are now entering industries — from health care and education to lightbulbs and thermostats — that are primarily about managing physical objects or human relationships rather than information. That’s a bigger challenge, and in many of these industries I think technology companies will discover there just isn’t much room for them to add value.
