On January 8, 2014, at New York University in Brooklyn, there occurred a unique event in the annals of global warming: nearly eight hours of structured debate between three climate scientists supporting the consensus on manmade global warming and three climate scientists who dispute it, moderated by a team of six leading physicists from the American Physical Society (APS) led by Dr. Steven Koonin, a theoretical physicist at New York University. The debate, hosted by the APS, revealed consensus-supporting climate scientists harboring doubts and uncertainties and admitting to holes in climate science – in marked contrast to the emphatic messaging of bodies such as the Intergovernmental Panel on Climate Change (IPCC).
At one point, Koonin read an extract from the IPCC’s Fifth Assessment Report, released the previous year. According to the report, computer model-simulated responses to forcings – the term climate scientists use for changes in energy flows into and out of the climate system, such as changes in solar radiation, volcanic eruptions, and changes in the atmospheric concentrations of greenhouse gases – “can be scaled up or down.” This scaling included greenhouse gas forcings.
Some forcings in some computer models had to be scaled down to match computer simulations to actual climate observations. But when it came to making the centennial projections on which governments rely and which drive climate policy, the scaling factors were removed, probably resulting in a 25 to 30 percent over-prediction of the 2100 warming.
The ensuing dialogue between Koonin and Dr. William Collins of the Lawrence Berkeley National Laboratory – a lead author of the climate model evaluation chapter in the Fifth Assessment Report – revealed something more troubling and deliberate than holes in scientific knowledge:
Dr. Koonin: But if the model tells you that you got the response to the forcing wrong by 30 percent, you should use that same 30 percent factor when you project out a century.
Dr. Collins: Yes. And one of the reasons we are not doing that is we are not using the models as [a] statistical projection tool.
Dr. Koonin: What are you using them as?
Dr. Collins: Well, we took exactly the same models that got the forcing wrong and which got sort of the projections wrong up to 2100.
Dr. Koonin: So, why do we even show centennial-scale projections?
Dr. Collins: Well, I mean, it is part of the [IPCC] assessment process.
Koonin was uncommonly well-suited to lead the APS climate workshop. He has a deep understanding of computer models, which have become the workhorses of climate science. As a young man, Koonin wrote a paper on computer modeling of nuclear reactions in stars and taught a course on computational physics at Caltech. In the early 1990s, he was involved in a program using satellites to measure the Earth’s albedo – that is, the reflection of incoming solar radiation back into space. As a student at Caltech in the late 1960s, he was taught by Nobel physicist Richard Feynman and absorbed what Koonin calls Feynman’s “absolute intellectual honesty.”
On becoming BP’s chief scientist in 2004, Koonin became part of the wider climate change milieu. Assignments included explaining the physics of man-made global warming to Prince Philip at a dinner in Buckingham Palace. In 2009, Koonin was appointed an under-secretary at the Department of Energy in the Obama administration.
The APS climate debate was the turning point in Koonin’s thinking about climate change and consensus climate science (“The Science”).
“I began by believing that we were in a race to save the planet from climate catastrophe,” Koonin writes in his new book, “Unsettled: What Climate Science Tells Us, What It Doesn’t, And Why It Matters.”
“I came away from the APS workshop not only surprised, but shaken by the realization that climate science was far less mature than I had supposed.”
“Unsettled” is an authoritative primer on the science of climate change that lifts the lid on The Science and finds plenty that isn’t as it should be.
“As a scientist,” writes Koonin, “I felt the scientific community was letting the public down by not telling the whole truth plainly.”
Koonin’s aim is to right that wrong.
Koonin’s indictment of The Science starts with its reliance on unreliable computer models. Usefully describing the earth’s climate, writes Koonin, is “one of the most challenging scientific simulation problems.” Models divide the atmosphere into pancake-shaped boxes around 100 kilometers across and one kilometer deep. But the upward flow of energy from tropical thunderclouds, which is more than thirty times larger than that from human influences, occurs over smaller scales than the programmed boxes. This forces climate modelers to make assumptions about what happens inside those boxes. As one modeler confesses, “it’s a real challenge to model what we don’t understand.”
Inevitably, this leaves considerable scope for modelers’ subjective views and preferences. A key question climate models are meant to answer is the equilibrium climate sensitivity (ECS) – by how much temperatures would rise from a doubling of carbon dioxide in the atmosphere. Yet in 2020, climate modelers from Germany’s Max Planck Institute admitted to tuning their model by targeting an ECS of about 3°C. “Talk about cooking the books,” Koonin comments.
The proof of the pudding, as they say, is in the eating. Self-evidently, computer projections can’t be tested against a future that’s yet to happen, but they can be tested against climates present and past. Climate models can’t even agree on what the current global average temperature is. “One particularly jarring feature is that the simulated average global surface temperature,” Koonin notes, “varies among models by about 3°C, three times greater than the observed value of the twentieth century warming they’re purporting to describe and explain.”
Another embarrassing feature of climate models concerns the earlier of the two twentieth-century warmings, from 1910 to 1940, when human influences were much smaller. On average, models give a warming rate of about half of what was actually observed. The failure of the latest models to warm fast enough in those decades indicates that it’s possible, even likely, that internal climate variability is a significant contributor to the warming of recent decades, Koonin suggests. “That the models can’t reproduce the past is a big red flag – it erodes confidence in their projections of future climates.” Neither is it reassuring that for the years after 1960, the latest generation of climate models shows a larger spread and greater uncertainty than earlier ones – implying that, far from advancing, The Science has been going backwards. That is not how science is meant to work.
The second part of Koonin’s indictment concerns the distortion, misrepresentation, and mischaracterization of climate data to support a narrative of climate catastrophism based on increasing frequency of extreme weather events. As an example, Koonin takes a “shockingly misleading” claim and associated graph in the United States government’s 2017 Climate Science Special Report that the number of high-temperature records set in the past two decades far exceeds the number of low-temperature records across the 48 contiguous states. Koonin demonstrates that the sharp uptick in highs over the last two decades is an artifact of a methodology chosen to mislead. After re-running the data, record highs show a clear peak in the 1930s, but there is no significant trend over the 120 years of observations starting in 1895, or even since 1980, when human influences on the climate grew strongly. In contrast, the number of record cold temperatures has declined over more than a century, with the trend accelerating after 1985.
Notes Koonin, “temperature extremes in the contiguous U.S. have become less common and somewhat milder since the late nineteenth century.” Similarly, a key message in the 2014 National Climate Assessment of an upward trend in hurricane frequency and intensity, repeated in the 2017 assessment, is contradicted 728 pages later by an admission buried in an appendix that there has been no significant trend in the global number of tropical cyclones, “nor has any trend been identified in the number of U.S. land-falling hurricanes.”
That might surprise many politicians.
“Over the past thirty years, the incidence of natural disasters has dramatically increased,” Treasury secretary Janet Yellen falsely asserted last month in a pitch supporting the Biden administration’s infrastructure package. “We are now in a situation where climate change is an existential risk to our future economy and way of life,” she claimed.
The sacrifice of scientific truth in the form of objective empirical data for the sake of a catastrophist climate narrative is plain to see. As Koonin summarizes the case:
“Even as human influences have increased fivefold since 1950 and the globe has warmed modestly, most severe weather phenomena remain within past variability. Projections of future climate and weather events rely on models demonstrably unfit for the purpose.”
Koonin also has sharp words for the policy side of the climate change consensus, which asserts that although climate change is an existential threat, solving it by totally decarbonizing society is straightforward and relatively painless.
“Two decades ago, when I was in the private sector,” Koonin writes, “I learned to say that the goal of stabilizing human influences on the climate was ‘a challenge,’ while in government it was talked about as ‘an opportunity.’ Now back in academia, I can forthrightly call it ‘a practical impossibility.’”
Unlike many scientists and most politicians, Koonin displays a sure grasp of the split between developed and developing nations, for whom decarbonization is a luxury they can’t afford. The fissure dates back to the earliest days of the U.N. climate process at the end of the 1980s. Indeed, it’s why developing nations insisted on the U.N. route as opposed to an intergovernmental one of the kind that produced the 1987 Montreal Protocol on ozone-depleting substances.
“The economic betterment of most of humanity in the coming decades will drive energy demand even more strongly than population growth,” Koonin says.
“Who will pay the developing world not to emit? I have been posing that simple question to many people for more than fifteen years and have yet to hear a convincing answer.”
The most unsettling part of “Unsettled” concerns science and the role of scientists.
“Science is one of the very few human activities – perhaps the only one – in which errors are systematically criticized and fairly often, in time, corrected,” Karl Popper wrote nearly six decades ago.
That condition does not obtain in climate science, where errors are embedded in a political narrative and criticism is suppressed. In a recent essay, the philosopher Matthew B. Crawford observes that the pride of science as a way of generating knowledge – unlike religion – is to be falsifiable. That changes when science is pressed into duty as authority in order to absolve politicians of responsibility for justifying their policy choices (“the science says,” we’re repeatedly told). “Yet what sort of authority would it be that insists its own grasp of reality is merely provisional?” asks Crawford. “For authority to be really authoritative, it must claim an epistemic monopoly of some kind, whether of priestly or scientific knowledge.”
At the outset of “Unsettled,” Feynman’s axiom of absolute intellectual honesty is contrasted with climate scientist Stephen Schneider’s “double ethical bind.” On the one hand, scientists are ethically bound by the scientific method to tell the truth. On the other, they are human beings who want to reduce the risk of potentially disastrous climate change.
“Each of us has to decide what the right balance is between being effective and being honest,” Schneider said.
“Being effective” helps explain the pressure on climate scientists to conform to The Science and the emergence of a climate science knowledge monopoly. Its function is, as Crawford puts it, the manufacture of a product – political legitimacy – which, in turn, requires that competing views be delegitimized and driven out of public discourse through enforcement of a “moratorium on the asking of questions.” This sees climate scientist gatekeepers deciding who can and cannot opine on climate science. “Please, save us from retired physicists who think they’re smarter and wiser than everyone in climate science,” tweeted Gavin Schmidt, NASA acting senior climate advisor, about Koonin and his book. “I agree with pretty much everything you wrote,” a chair of a university earth sciences department tells Koonin, “but I don’t dare say that in public.” Another scientist criticizes Koonin for giving ammunition to “the deniers,” and a third writes an op-ed urging New York University to reconsider Koonin’s position there. It goes wider than scientists. Facebook has suppressed a “Wall Street Journal” review of “Unsettled.” Likewise, “Unsettled” remains unreviewed by the “New York Times,” the “Washington Post” (though it carried an op-ed by Marc Thiessen based on an interview with Koonin) and other dailies, which would prefer to treat Koonin’s reasoned climate dissent as though it doesn’t exist.
The moratorium on the asking of questions represents the death of science as understood and described by Popper, a victim of the conflicting requirements of political utility and scientific integrity. Many scientists take this lying down. Koonin won’t. Thanks to his forensic skill and his gift for making his findings accessible to non-specialists, Koonin has written the most important book on climate science in decades.
* * *