An uber-optimistic view of the future
Maybe it never truly went away. But these days techno-optimism—the kind that raged in the late 1990s and early 2000s and then dried up and turned to pessimism during the last decade—is once again bubbling up. The pessimism over the real-world impacts of apps and social media has turned into unbounded hope—at least among the tech elite and the venture capital investor class—that new technologies will solve our problems.
The Exponential Age, by tech investor and writer Azeem Azhar, is the latest celebration of the world-changing impact of computing technologies (including artificial intelligence and social media), biotechnology, and renewable energy. Azhar meticulously and smartly makes his case, describing the growth of what he calls exponential technologies—ones that rapidly and steadily improve in price and performance every year for several decades. He writes that “new technologies are being invented and scaled at an ever-faster pace, all while decreasing rapidly in price.”
To his credit, Azhar duly notes the problems arising from the fast transformations brought about by these technologies, most notably what he calls the “exponential gap.” Big tech corporations like Amazon and Google are gaining great wealth and power from the technologies. But other companies and many institutions and communities “can only adapt at an incremental pace,” he writes. “These get left behind—and fast.”
Yet his enthusiasm remains obvious.
For Azhar the story begins in 1979, when he was a seven-year-old in Zambia and a neighbor brought home a build-it-yourself computer kit. He then retells the familiar, yet still gripping, history of how those early products kick-started the PC revolution (an interesting side note is his description of the mostly lost-to-history Sinclair ZX81—his first computer, bought for £69 two years later, after his family had moved to a small town outside London). We know the rest. The explosion of PCs—young Azeem and his family soon graduated to the Acorn BBC Master, a popular home computer in the UK—led to the World Wide Web, and now our lives are being transformed by artificial intelligence.
It’s hard to quibble with the argument that computing technologies have grown exponentially. Moore’s Law has defined such growth for generations of technologists. It has meant, as Azhar points out, that by 2014 the cost of a transistor was only a few billionths of a dollar, versus around $8 in the 1960s. And that has changed everything, fueling the rapid rise of the internet, smartphones, and AI.
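To get a feel for what that decline implies, here is a minimal back-of-the-envelope sketch in Python. The dollar figures are the approximate ones cited above; the exact 2014 endpoint (taken here as three billionths of a dollar) is an assumption for illustration. It simply works out how often a transistor's cost would have to halve to fall that far over roughly five decades:

```python
import math

# Back-of-the-envelope check of the transistor-cost figures cited above.
# Assumed, illustrative values: ~$8 per transistor in the mid-1960s,
# ~$3e-9 ("a few billionths of a dollar") per transistor by 2014.
cost_1965 = 8.0          # dollars per transistor, mid-1960s
cost_2014 = 3e-9         # dollars per transistor, 2014
years = 2014 - 1965      # elapsed time in years

halvings = math.log2(cost_1965 / cost_2014)   # how many times the cost halved
halving_time = years / halvings               # implied years per halving

print(f"{halvings:.1f} halvings over {years} years")
print(f"cost halves roughly every {halving_time:.1f} years")
```

Run as written, it comes out at about 31 halvings, or a halving roughly every year and a half, which is the cadence Moore's Law is usually shorthand for.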
Essential to Azhar’s claim for the dawning of a new age, however, is that a far broader set of technologies exhibit this exponential growth. Economists call fundamental advances that have broad economic effects “general-purpose technologies”; think of the steam engine, electricity, or the internet. Azhar suspects that cheap solar power, bioengineering techniques such as synthetic biology, and 3D printing could be just such technologies.
He acknowledges that some of these technologies, particularly 3D printing, are relatively immature but argues that as prices drop, demand will grow quickly and the technologies will evolve and find markets. Azhar concludes: “In short, we are entering an age of abundance. The first period in human history in which energy, food, computation, and many resources will be trivially cheap to produce. We could fulfill the current needs of humanity many times over, at ever-declining economic cost.”
Maybe. But frankly, such uber-optimism takes a great leap of faith, both in the future power of the technologies and in our ability to use them effectively.
Sluggish growth
Our best measurement of economic progress is productivity growth. Specifically, total factor productivity (TFP) measures the role of innovation, including both management practices and new technologies. It isn’t a perfect gauge. But for now, it’s the best metric we have to estimate the impact of technologies on a country’s wealth and living standards.
Starting around the mid-2000s, TFP growth became sluggish in the US and many other advanced countries (it has been particularly bad in the UK), despite the emergence of our brilliant new technologies. That slowdown came after a multi-year growth spurt in the US in the late 1990s and early 2000s, when computers and the internet boosted productivity.
No one is sure what is causing the doldrums. Perhaps our technologies are not nearly as world-changing as we think, at least compared with earlier innovations. The father of techno-pessimism in the mid-2010s, Northwestern University economist Robert Gordon, famously showed his audiences images of a smartphone and a toilet and asked: which would you rather have? Or perhaps we don’t accurately capture the economic benefits of social media and free online services. But the most likely answer is simply that many businesses and institutions are not adopting the new technologies, particularly in sectors like health care, manufacturing, and education.
It’s not necessarily a reason for pessimism. Maybe it will just take time. Erik Brynjolfsson, a Stanford economist and a leading expert on digital technologies, predicts that we are at the beginning of a “coming productivity boom.” He argues that most of the world’s advanced economies are near the bottom of a productivity J-curve. Many businesses are still struggling with new technologies, such as AI, but as they get better at taking advantage of the advances, overall productivity growth will take off.
It’s an optimistic take. But it also suggests that the trajectory of many new technologies is not a simple one. Demand matters, and markets are fickle. You need to look at why people and businesses want the innovation.
Take synthetic biology. The idea is as simple as it is compelling: rewrite the genetic code of microorganisms, whether bacteria or yeast or algae, so they produce the chemicals or materials you desire. The dream wasn’t exactly new at the time, but in the early 2000s proponents including Tom Knight, an MIT computer scientist turned biologist, helped popularize it, especially among investors. Why not treat biology as a simple engineering challenge?
With huge fermentation vats of these programmed microbes, you could make plastics or chemicals or even fuels. There would be no need for petroleum. Simply feed them sugar extracted from, say, sugarcane, and you could mass-produce whatever you need.
In the late 2000s several startups, including Amyris Biotechnologies and LS9, engineered the genetics of microbes to make hydrocarbon fuels intended to replace gasoline and diesel. Synthetic biology, it seemed, was on the verge of revolutionizing transportation. But within a few years, the dream was mostly dead. Amyris is now focused on making ingredients for skin creams and other consumer beauty products. LS9 sold off its assets in 2014.
The market woes of synthetic biology continue to this day. Earlier this year, one of the leading companies in the field, Zymergen, suffered a financial setback as its product, a plastic made for use in folding smartphones, failed to gain traction. Its customers, the company said, were having “technical issues” integrating the plastic into their existing manufacturing processes.
The failures are not a condemnation of synthetic biology. A smattering of products are beginning to appear. Despite the commercial mistakes, the field’s future is undeniably bright. As the technology improves, aided by advances in automation, machine learning, and computing, the costs of creating tailored bugs and using them for mass production will surely drop.
But for now, synthetic biology is far from transforming the chemical industry or transportation fuels. Its progress over the last two decades has looked less like exponential growth and more like the unsteady first steps of a toddler.
History lessons
I asked Carlota Perez, a social scientist who has written widely on technological revolutions and whom Azhar credits in his book as “instrumental” in helping him think about the relationship between technology and economics, how we can have such impressive breakthroughs and not see more productivity growth.
The answer is simple, says Perez: “All technological revolutions have gone through two different periods—the first in which productivity growth is seen in the new part of the economy, and the second, when the new technologies spread across the whole economy, generating synergies and bringing general productivity increases.”
Perez says we’re now in the period in which different industries are faring very differently. She adds, “The question is how do we get to the point where we have the productivity of the whole economy growing synergistically?”
Perez is a very different kind of techno-optimist from the free-market ones often heard in Silicon Valley. To her, it’s essential that governments create the right incentives to encourage the embrace of new technologies, including environmentally cleaner ones, using such tools as appropriate taxes and regulations.
“It’s all up to government,” she says. “Companies are not going in the green direction because they don’t need to—because they’re making money with what they’re doing. Why should they change? It is only when you can no longer be profitable doing what you’re doing [that] you use the new technologies to invest and innovate in new directions.”
But Perez says that “the amount of innovation in gestation—that is, in the wings—is almost unbelievable.” And, she says, once prompted by the right government policies and support, technological revolutions can happen quickly.
None of this is inevitable, however. There is certainly no assurance that governments will act. One worry is today’s lack of support for research. Our amazing new technologies might be poised to change the economy, but their growth and expansion must be bolstered by ever more new ideas and continued technological advances. After all, the origins of the technologies we’re so impressed by these days, such as synthetic biology and 3D printing, date back decades. The pipeline needs constant refreshing.
John Van Reenen, an economist at the London School of Economics and MIT, and his collaborators have shown that research productivity itself is slowing as “new ideas get harder to find.” At the same time, the US and many other Western governments have decreased their support for R&D as a proportion of GDP over the last few decades; in the mid-1960s, US federal R&D funding relative to GDP was three times what it is today. The US doesn’t have to return to such high levels, he says, “but standing still is not an option.” That would, says Van Reenen, cause TFP growth and economic progress to stagnate.
There are some signs the US is moving in the right direction. President Biden campaigned on promises to increase federal support for R&D by hundreds of billions over his first term. But getting Congress to embrace this has already been a challenge.
“It’s a choice we face,” says Van Reenen. “It’s all come back to the politics. Are we prepared to make serious investments?”
And that is where reluctant optimists like Van Reenen and uber-optimists like Azhar converge. I asked Azhar just how confident he is about his book’s prediction of “an age of abundance.” He said: “I’m optimistic about the progress of the technology, but I’m much more realistic, bordering on pessimistic, around the governance of the technology. That’s the bigger part of the fight.”