This article originally appeared in Fortune.
What if we have been looking at climate change totally wrong? What if our greatest existential fear could instead offer hope for a brighter future?
Reports from the Intergovernmental Panel on Climate Change (IPCC), going back to 1995, have set the stage for how we think about climate change. First, climate impacts were thought to be a gradual and slowly increasing set of problems that would be manageable by well-understood technological and management processes. Second, mitigation was thought to be expensive and damaging to the economy. By that way of thinking, climate change would on balance be expensive to prevent and only moderately damaging.
As such, much of the motivation to respond has been driven by nonmarket considerations such as equity, the preservation of the natural world, and other benefits that are hard to price. As a lead author of the second and third IPCC reports, I understand this framework of beliefs quite well.
But what if these two core assumptions are wrong? What if the severity of climate change has been underestimated, and many of its harms are hard or impossible to adapt to? And what if the rate of potential technological progress is faster and the cost of alternative technologies cheaper than projected?
Then the fundamental economic equation changes, and decarbonization could be inexpensive compared to damages—or even benefit the global economy.
We likely live in that world.
How We Got Here
In the early days of climate science, research tended to focus on impacts far off in the future, at doubled preindustrial CO2 levels, often projected to the year 2100. Nearer-term projections were harder to characterize because the current climate is highly variable and scientists were conservative about projecting changes that were small relative to the system’s own variability. Many concluded that damages were far off in time and that socioeconomic systems would have time to adjust.
At the same time, with those in the field being cautious (despite accusations of alarmism), projections of the cost of mitigation were quite high. Models from the early 2000s assumed slow rates of technological innovation. As a result, mitigating emissions became associated with large expenditures and sacrifices—something that would be a first among major technological transitions in history.
Since damages appeared to be far off, and mitigation costs high, the climate equation appeared to project a world where aggressive mitigation was expensive and thus unwarranted.
What about the world we live in now? In the U.S. alone, we see the effects of intensifying hurricanes and heat waves, more frequent and more intense wildfires, and persistent drought leading to the failure of water infrastructure—all of this in one of the wealthiest and most resilient nations in the world. These impacts are here because changes to extremes (very hot days and nights, droughts, severe weather) are accelerating. These are precisely the aspects of climate that even the most advanced climate models struggle to project.
At the same time, projections of the cost of renewable energy have proved equally misleading. Solar energy costs have decreased consistently faster than International Energy Agency and IPCC models suggest, as summarized in one recent working paper. Absent unexpected barriers or counterproductive incentives, solar and wind power could allow nearly complete decarbonization in two decades, though this would require advances in supporting technologies: storage, smart grids, charging, and transportation.
This world has the opposite calculus from the one business leaders, politicians, and most advocates for the environment (with notable exceptions like Amory Lovins) have assumed. Damages are here. They are expensive. And they are very hard—and also expensive—to adapt to.
Meanwhile, solutions are now attractive, rapidly advancing, and constantly decreasing in cost. In this world, avoiding damages by aggressive and low-cost mitigation in the near term leads to a wealthier future rather than an impoverished one. There are still winners and losers in this world, but many more winners, including those who benefit from an intact rather than a catastrophically damaged natural world.
Predicting climate change is a formidable analytical feat, but gauging economic outcomes under various levels of intervention is also highly uncertain, not least because it requires assumptions about human behavior and projections of physical phenomena and societal change that are inherently hard to pin down.
Reality Is Complicated
These flawed climate and economic forecasts shape how our lawmakers make climate policy. Forecasts by economists and climate scientists have largely been moderate—a function of scientific culture and the desire to blunt accusations of alarmism. Similarly, many economic estimates of climate change damage erroneously assumed that farming could easily adapt to more sustainable production and that capital would be efficiently reallocated to innovative companies. The reality, of course, is more complicated.
As the impacts of climate on the global economy and on human welfare become more obvious, we can expect more investment and the removal of more barriers and perverse incentives, as was evident in the Biden administration’s recent U.S. climate legislation.
Two wrongs—inaccurate estimates about both the pace and impact of climate change and the challenge and costs of mitigation—could produce one right: faster action on mitigation and adaptation to climate change. History tells us that the transition will not be easy, but evolving research suggests it should be feasible.
David Schimel, Ph.D., is a senior research scientist at NASA Jet Propulsion Laboratory and the chairman of Entelligent, which models data to help investors make better decisions. For his work as an IPCC Convening Lead Author, he shared in the 2007 Nobel Peace Prize.