No Convincing Evidence That Extreme Wildfires Are Increasing

According to a new research study by scientists at the University of Tasmania, the frequency and magnitude of extreme wildfires around the globe more than doubled between 2003 and 2023, despite a decline in the total worldwide area burned annually. The study authors link this trend to climate change.

Such a claim doesn’t stand up to scrutiny, however. First, the authors seem unaware of the usual definition of climate change, which is a long-term shift in weather patterns over a period of at least 30 years. Their finding of a 21-year trend in extreme wildfires is certainly valid, but the study interval is too short to draw any conclusions about climate.

Paradoxically, the researchers cite an earlier extreme-wildfire study of theirs from 2017, acknowledging that its 12-year span was indeed too short to identify any temporal climate trend. Why they think 21 years is any better is puzzling!

Second, the study makes no attempt to compare wildfire frequency and magnitude over the last 21 years with those from decades ago, when there were arguably as many hot-burning fires as now. Such a comparison would allow the claim of more frequent extreme wildfires today to be properly evaluated.

Although today’s satellite observations of wildfire intensity far outnumber the observations made before the satellite era, there’s still plenty of old data that could be analyzed. Satellites measure what is called the FRP (fire radiative power), which is the total energy released by a fire less the energy dissipated through convection and conduction. The older measure, FI (fire intensity), also gauges the energy released by a fire, expressed as the rate of energy release per unit length of fire front; FRP, usually measured in MW (megawatts), is clearly related to FI.
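
To make the definitions concrete, here is a minimal formulation in standard notation; the FRP balance is a simplification, and the symbols are mine rather than the study’s. Byram’s fireline intensity is the usual definition of FI:

```latex
% Fire radiative power: the radiant remainder of total combustion power
\mathrm{FRP} \;=\; P_{\mathrm{total}} - P_{\mathrm{convection}} - P_{\mathrm{conduction}}
\qquad [\mathrm{MW\ per\ pixel}]

% Byram's fireline intensity: energy release rate per unit length of front
I \;=\; H\,w\,r \qquad [\mathrm{kW\,m^{-1}}]
```

Here H is the fuel’s heat of combustion (kJ/kg), w the fuel consumed per unit area (kg/m²) and r the fire’s rate of spread (m/s); multiplying out the units gives kW per meter of fire front.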

The study authors define extreme wildfires as those with daily FRPs exceeding the 99.99th percentile. Satellite FRP data for all fires in the study period was collected in pixels 1 km on a side, each retained pixel containing just one wildfire “hotspot” after duplicate hotspots were excluded.

The total raw dataset included 88.4 million hotspot observations, which were reduced to 30.7 million “events” by summing individual pixels within cells approximately 22 x 22 km in size. Of these 30.7 million, just 2,913 events satisfied the extreme wildfire 99.99th percentile requirement. The average of the study’s summed FRP values for the top 20 events was in the range of 50,000-150,000 MW, corresponding to individual FRPs of about 100-300 MW in a 1 x 1 km pixel.
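
As a rough sketch of the aggregation and screening steps, with synthetic numbers standing in for the geolocated satellite data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for per-pixel daily FRP values (MW) from 1 km hotspot pixels
pixel_frp = rng.lognormal(mean=2.0, sigma=1.2, size=1_000_000)

# Sum pixel FRPs within ~22 x 22 km grid cells to form daily "events"
# (each pixel gets a random cell here; the study uses actual coordinates)
cell_ids = rng.integers(0, 300_000, size=pixel_frp.size)
cell_totals = np.zeros(300_000)
np.add.at(cell_totals, cell_ids, pixel_frp)
events = cell_totals[cell_totals > 0]

# Extreme wildfires are events above the 99.99th percentile of summed FRP
threshold = np.percentile(events, 99.99)
extremes = events[events >= threshold]
print(f"threshold {threshold:,.0f} MW; {extremes.size} extreme events")

# A 22 x 22 km cell holds ~484 pixels of 1 km², so a summed FRP of
# 50,000-150,000 MW implies individual pixel FRPs of roughly 100-300 MW
print(50_000 / 484, 150_000 / 484)
```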

A glance at the massive dataset shows individual FRP values ranging from the single digits to several hundred MW. If the 20 hottest wildfires during 2003-23 had FRPs above 100 MW, most of the other 2,893 fires above the 99.99th percentile would have had lower FRPs, in the tens and teens.

While intensity data for historical wildfires is sparse, there are occasionally numbers mentioned in the literature. One example can be found in a 2021 paper that reviews past large-area high-intensity wildfires that have occurred in arid Australian grasslands. The paper’s authors state that:

Contemporary fire cycles in these grasslands (spinifex) are characterized by periodic wildfires that are large in scale, high in intensity (e.g., up to c. 14,000 kW) … and driven by fuel accumulations that occur following exceptionally high rainfall years.

An FRP of 14,000 kW, or 14 MW, is comparable to many of the 2,893 FRPs for modern extreme wildfires (excluding the top 20) in the Tasmanian study. The figure below shows the potential fire intensity of bushfires across Australia, with the various colors indicating the FI range. As you can see, the most intense bushfires occur in the southeast and southwest of the country; FI values in those regions can exceed 100 MW per meter, corresponding to FRPs of about 30 MW.

And, although it doesn’t cite FI numbers, a 1976 paper on Australian bushfires from 1945 to 1975 makes the statement that:

The fire control authorities recognise that no fire suppression system has been developed in the world which can halt the forward spread of a high-intensity fire burning in continuous heavy fuels under the influence of extreme fire weather.

High- and extremely high-intensity wildfires, in Australia at least, are nothing new, and the same is no doubt true for other countries included in the Tasmanian study. The study authors remark correctly that higher temperatures due to global warming and the associated drying out of vegetation and forests both increase wildfire intensity. But there have been equally hot and dry periods in the past, such as the 1930s, when larger areas burned.

So there’s nothing remarkable about the present study. Even though it’s difficult to find good wildfire data in the pre-satellite era, the study authors could easily extend their work back to the onset of satellite measurements in the 1970s.

Next: The Scientific Reality of the Quest for Net Zero

Unexpected Sea Level Fluctuations Due to Gravity, New Evidence Shows

Although the average global sea level is rising as the world warms, the rate of rise is far from uniform across the planet and, in some places, is negative – that is, the sea level is falling. Recent research has revealed the role that localized gravity plays in this surprising phenomenon.   

The researchers used gravity-sensing satellites to track how changes in water retention on land can cause unexpected fluctuations in sea levels. While 75% of the extra water in the world’s oceans comes from melting ice sheets and mountain glaciers, they say, the other 25% is due to variations in water storage in ice-free land regions. These include changes in dam water levels, water used in agriculture, and extraction of groundwater which either evaporates or flows into the sea via rivers.

Water is heavy but, the researchers point out, moves easily. Thus local changes in sea level aren't just due to melting ice sheets or glaciers, but also reflect changes in the mass of water on nearby land. For example, the land gets heavier during large floods, which boosts its gravity and causes a temporary rise in local sea level. The opposite occurs during droughts or groundwater extraction, when the land becomes lighter, gravity falls and the local sea level drops.

A similar exchange of water explains why the sea level around Antarctica falls as the massive Antarctic ice sheet melts. The total mass of ice in the sheet is a whopping 24 million gigatonnes (26 million gigatons), enough to exert a significant gravitational pull on the surrounding ocean, making the sea level higher than it would be with no ice sheet. But as the ice sheet melts, this gravitational pull weakens and so the local sea level falls.
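
A back-of-envelope point-mass estimate suggests why this much ice matters gravitationally. The numbers below are my own illustration; the real calculation solves the sea-level equation over the entire ocean:

```python
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
ice_mass = 24e6 * 1e12   # 24 million gigatonnes expressed in kg

# Sideways pull on seawater 1,000 km from the ice sheet's center of mass,
# crudely treating the whole sheet as a point mass
r = 1_000e3              # meters
g_pull = G * ice_mass / r**2
print(f"{g_pull:.1e} m/s^2")  # ~1.6e-3 m/s^2, about 0.02% of Earth's gravity
```

Small as that sounds, a steady horizontal acceleration of this order is enough to tilt the nearby sea surface measurably upward toward the ice sheet.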

At the same time, however, distant sea levels rise in compensation. They also rise continuously over the long term because of the thermal expansion of seawater as it warms; added meltwater from both the Antarctic and Greenland ice sheets; and land subsidence caused by groundwater extraction, resulting from rapid urbanization and population growth. In an earlier post, I discussed how sea levels are affected by land subsidence.

The research also reveals how the pumping of groundwater in ice-free places, such as Mumbai in India and Taipei in Taiwan, can almost mask the sea level rise expected from distant ice sheet melting. Conversely, at Charleston on the U.S. Atlantic coast, where groundwater extraction is minimal, sea level rise appears to be accelerated.

All these and other factors contribute to substantial regional variation in sea levels across the globe. This is depicted in the following figure which shows the average rate of sea level rise, measured by satellite, between 1993 and 2014.

Clearly visible is the falling sea level in the Southern Ocean near Antarctica, as well as elevated rates of rise in the western Pacific and the east coast of North America. Note, however, that the figure is only for the period between 1993 and 2014. Over longer time scales, the global average rate of rise fluctuates considerably, most likely due to the gravitational effects of the giant planets Jupiter and Saturn.

Yet another gravitational influence on sea levels is La Niña, the cool phase of the ENSO (El Niño – Southern Oscillation) ocean cycle. The arrival of La Niña often brings torrential rain and catastrophic flooding to the Pacific northwest of the U.S., northern South America and eastern Australia. As mentioned before, the flooding temporarily enhances the gravitational pull of the land. This raises local sea levels, resulting in a lowering of more distant sea levels – the opposite of the effects from the melting Antarctic ice sheet or from groundwater extraction.

The influence of La Niña is illustrated in the figure below, showing the rate of sea level rise during the two most recent strong La Niñas, in 2010-12 and 2020-23. (Note that the colors in the sea level trend are reversed compared to the previous figure.) A significant local increase in sea level can be seen around both northern South America and eastern Australia, while the global level fell, especially in the 2010-12 La Niña event. Consecutive La Niñas in those years dumped so much rain on land that the average sea level worldwide fell about 5 mm (0.2 inches).

The current rate of sea level rise is estimated at 3.4 mm per year. Of this, the researchers calculate that over-extraction of groundwater alone contributes approximately 1 mm per year – meaning that the true rate of rise, predominantly from ice sheet melting and thermal expansion, is about 2.4 mm per year. Strong La Niñas lower this rate even more temporarily.

But paradoxically, as discussed above, groundwater extraction is causing local sea levels to fall. It’s local sea levels that matter to coastal communities and their engineers and planners.

Next: No Convincing Evidence That Extreme Wildfires Are Increasing

Was the Permian Extinction Caused by Global Warming or CO2 Starvation?

Of all the mass extinctions in the earth’s distant past, by far the greatest and most drastic was the Permian Extinction, which occurred during the Permian period between roughly 300 and 250 million years ago. Also known as the Great Dying, the Permian Extinction killed off an estimated 57% of all biological families (including rainforest flora), 81% of marine species and 70% of terrestrial vertebrate species existing before the Permian’s final million years. What was the cause of this devastation?

The answer to that question is controversial among paleontologists. For many years, it has been thought that the extinction was a result of ancient global warming. During Earth’s 4.5-billion-year history, the global average temperature has fluctuated wildly, from “hothouse” temperatures as much as 14 degrees Celsius (25 degrees Fahrenheit) above today’s level of about 14.8 degrees Celsius (58.6 degrees Fahrenheit), to “icehouse” temperatures 6 degrees Celsius (11 degrees Fahrenheit) below.

Hottest of all was a sudden temperature spike from icehouse conditions at the onset of the Permian to extreme hothouse temperatures at its end, as can be seen in the figure below. The figure is a 2021 estimate of ancient temperatures derived from oxygen isotopic measurements combined with lithologic climate indicators, such as coals, sedimentary rocks, minerals and glacial deposits. The barely visible time scale is in millions of years before the present.

The geological event responsible for this enormous surge in temperature was a massive volcanic eruption known as the Siberian Traps. The eruption lasted at least 1 million years and poured out voluminous quantities of basaltic lava from rifts in West Siberia; the lava buried over 50% of Siberia in a blanket up to 6.5 km (4 miles) deep.

Volcanic CO2 released by the eruptions was supplemented by CO2 produced during combustion of thick, buried coal deposits that lay along the subterranean path of the erupting lava. This stupendous outburst boosted the atmospheric CO2 level from a very low 200 ppm (parts per million) to more than 2,000 ppm, as shown in the next figure.

The conventional wisdom in the past has been that this geologically sudden, gigantic increase in the CO2 level sent the global thermometer soaring – a conclusion sensationalized by mainstream media such as the New York Times. However, that argument ignores the saturation effect for atmospheric CO2, which limits CO2-induced warming to that produced by the first few hundred ppm of the greenhouse gas.

While the composition of the atmosphere 250 million years ago may have been different from today’s, the saturation effect would still have occurred. Nevertheless, there’s no question that end-Permian temperatures were as high as estimated, whatever the cause. That’s because the temperatures are based on the highly reliable method of measuring 18O to 16O oxygen isotopic ratios in ancient microfossils.
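
The measured quantity is conventionally expressed in delta notation relative to a standard; the definition below is standard, although the temperature calibrations applied to it vary from study to study:

```latex
\delta^{18}\mathrm{O} \;=\;
\left( \frac{\left(^{18}\mathrm{O}/^{16}\mathrm{O}\right)_{\mathrm{sample}}}
            {\left(^{18}\mathrm{O}/^{16}\mathrm{O}\right)_{\mathrm{standard}}} - 1 \right)
\times 1000\ \text{‰}
```

In marine carbonate microfossils, lower δ18O generally records warmer water at the time the shell formed, which is how isotopic ratios are converted into a temperature curve like the one in the figure.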

Such hothouse conditions would have undoubtedly caused the extinction of various species; the severity of the extinction event is revealed by subsequent gaps in the fossil record. Organic carbon accumulated in the deep ocean, depleting oxygen and thus wiping out many marine species such as phytoplankton, brachiopods and reef-building corals. On land, vertebrates such as amphibians and early reptiles, as well as diverse tropical and temperate rainforest flora, disappeared.

All from extreme global warming? Not so fast, says ecologist Jim Steele.

Steele attributes the Permian extinction not to an excess of CO2 at the end of this geological period, but rather to a lack of it during the preceding Carboniferous and the early Permian, as can be seen in the figure above. He explains that all life is dependent on a supply of CO2, and that when its concentration drops below 150 ppm, photosynthesis ceases, and plants and living creatures die.

Steele argues that because of CO2 starvation over this interval, many species had either already become extinct, or were on the verge of extinction, long before the planet heated up so abruptly.

In comparison to other periods, the Permian saw the appearance of very few new species, as illustrated in the following figure. For example, far more new species evolved (and became extinct) during the earlier Ordovician, when CO2 levels were much, much higher but an icehouse climate prevailed.

When CO2 concentrations reached their lowest levels ever in the early Permian, phytoplankton fossils were extremely rare – some 40 million years or so before the later hothouse spike, which is when the conventional narrative claims the species became extinct. And Steele says that 35-47% of marine invertebrate genera went extinct, as well as almost 80% of land vertebrates, from 7 to 17 million years before the mass extinction at the end of the Permian.

Furthermore, Steele adds, the formation of the supercontinent Pangaea (shown to the left), which occurred during the Carboniferous, had a negative effect on biodiversity. Pangaea removed unique niches from its converging island-like microcontinents, again long before the end-Permian.

Next: Unexpected Sea Level Fluctuations Due to Gravity, New Evidence Shows

El Niño and La Niña May Have Their Origins on the Sea Floor

One of the least understood aspects of our climate is the ENSO (El Niño – Southern Oscillation) ocean cycle, whose familiar El Niño (warm) and La Niña (cool) events cause drastic fluctuations in global temperature, along with often catastrophic weather in tropical regions of the Pacific and delayed effects elsewhere. A recent research paper attributes the phenomenon to tectonic and seismic activity under the oceans.

Principal author Valentina Zharkova, formerly at the UK’s Northumbria University, is a prolific researcher into natural sources of global warming, such as the sun’s internal magnetic field and the effect of solar activity on the earth’s ozone layer. Most of her studies involve sophisticated mathematical analysis and her latest paper is no exception.

Zharkova and her coauthor Irina Vasilieva make use of a technique known as wavelet analysis, combined with correlation analysis, to identify key time periods in the ONI (Oceanic Niño Index). The index, which measures the strength of El Niño and La Niña events, is the 3-monthly average difference from the long-term average sea surface temperature in the ENSO region of the tropical Pacific. Shown in the figure below are values of the index from 1950 to 2016.

Wavelet analysis supplies information both on which frequencies are present in a time series signal, and on when those frequencies occur, unlike a Fourier transform which decomposes a signal only into its frequency components.
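
A minimal sketch of the technique, using the PyWavelets library on a synthetic ONI-like series (the authors’ actual implementation is not spelled out in the paper):

```python
import numpy as np
import pywt  # PyWavelets

# Synthetic monthly series with 4.5-year and 12-year components standing
# in for the observed ONI record
dt = 1 / 12                              # sampling interval, years
t = np.arange(0, 74, dt)                 # 1950-2023
signal = (np.sin(2 * np.pi * t / 4.5)
          + 0.6 * np.sin(2 * np.pi * t / 12)
          + 0.3 * np.random.default_rng(1).standard_normal(t.size))

# Continuous wavelet transform with a Morlet wavelet
scales = np.arange(1, 512)
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=dt)

# Global wavelet spectrum: average power at each scale over the record
power = np.mean(np.abs(coeffs) ** 2, axis=1)
periods = 1 / freqs                      # years
for p in sorted(periods[np.argsort(power)[-5:]]):
    print(f"high-power period ~ {p:.1f} yr")
```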

Using the wavelet approach, Zharkova and Vasilieva have identified two separate ENSO cycles: one with a shorter period of 4-5 years, and a longer one with a period of 12 years. This is illustrated in the next figure which shows the ONI at top left; the wavelet spectrum of the index at bottom left, with the wavelet “power” indicated by the colored bar at top right; and the global wavelet spectrum at bottom right. 

The authors link the 4- to 5-year ENSO cycle to the motion of tectonic plates, a connection that has been made by other researchers. The 12-year ENSO cycle identified by their wavelet analysis they attribute to underwater volcanic activity; it does not correspond to any solar cycle or other known natural source of warming.

The following figure depicts an index (in red, right-hand scale), calculated by the authors, that measures the total annual volcanic strength and duration of all submarine volcanic eruptions from 1950 to 2023, superimposed on the ONI (in black) over the same period. A weak correlation can be seen between the ENSO ONI and undersea volcanic activity, the correlation being strongest at 12-year intervals.

Zharkova and Vasilieva estimate the 12-year ENSO correlation coefficient at 25%, a connection they label as “rather significant.” As I discussed in a recent post, retired physical geographer Arthur Viterito has proposed that submarine volcanic activity is the principal driver of global warming, via a strengthening of the thermohaline circulation that redistributes seawater and heat around the globe.

Zharkova and Vasilieva, however, link the volcanic eruptions causing the 12-year boost in the ENSO index to tidal gravitational forces on the earth from the giant planet Jupiter and from the sun. Jupiter of course orbits the sun and spins on an axis, just like Earth. But the sun is not motionless either: it too rotates on an axis and, because it’s tugged by the gravitational pull of the giant planets Jupiter and Saturn, orbits in a small but complex spiral around the center of mass of the solar system.

Jupiter was selected by the researchers because its orbital period of approximately 12 years matches the longer ENSO cycle identified by their wavelet analysis.

That Jupiter’s gravitational pull on Earth influences volcanic activity is clear from the next figure, in which the frequency of all terrestrial volcanic eruptions (underwater and surface) is plotted against the distance of Earth from Jupiter; the distance is measured in AU (astronomical units), where 1 AU is the average earth-sun distance. The thick blue line is for all eruptions, while the thick yellow line shows the eruption frequency in just the ENSO region.

What stands out is the increased volcanic frequency when Jupiter is at one of two different distances from Earth: 4.5 AU and 6 AU. The distance of 4.5 AU is Jupiter’s closest approach to Earth, while 6 AU is Jupiter’s distance when the sun is closest to Earth and located between Earth and Jupiter. The correlation coefficient between the 12-year ENSO cycle and the Earth-Jupiter distance is 12%.  

For the gravitational pull of the sun, Zharkova and Vasilieva find there is a 15% correlation between the 12-year ENSO cycle and the Earth-sun distance in January, when Earth’s southern hemisphere (where ENSO occurs) is closest to the sun. Although these solar system correlations are weak, Zharkova and Vasilieva say they are high considering the vast distances involved.

Next: Shrinking Cloud Cover: Cause or Effect of Global Warming?

The Deceptive Catastrophizing of Weather Extremes: (2) Economics and Politics

In my previous post, I reviewed the science described in environmentalist Ted Nordhaus’ four-part essay, “Did Exxon Make It Rain Today?”, and how science is being misused to falsely link weather extremes to climate change. Nordhaus also describes how the perception of a looming climate catastrophe, exemplified by extreme weather events, is being fanned by misconceptions about the economic costs of natural disasters and by environmental politics – both the subject of this second post.

Between 1990 and 2017, the global cost of weather-related disasters increased by 74%, according to an analysis by Roger Pielke, Jr., a former professor at the University of Colorado. Economic loss studies of natural disasters have been quick to blame human-caused climate change for this increase.

But Nordhaus makes the point that, if the cost of natural disasters is increasing due to global warming, then you would expect the cost of weather-related disasters to be rising faster than that of disasters not related to weather. Yet the opposite is true. States Nordhaus: “The cost of disasters unrelated [my italics] to weather increased 182% between 1990 and 2017, more than twice as fast as for weather-related disasters.” This is evident in the figure below, which shows both costs from 1990 to 2018.

Nordhaus goes on to declare:

In truth, it is economic growth, not climate change, that is driving the boom in economic damage from both weather-related and non-weather-related natural disasters.

Once the losses are corrected for population gain and the ever-escalating value of property in harm’s way, there is very little evidence to support any connection between natural dis­asters and global warming. Nordhaus explains that accelerating urbanization since 1950 has led to an enormous shift of the global population, economic activity, and wealth into river and coastal floodplains.
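
The kind of correction involved is simple in form. Here is a sketch in the spirit of the widely used Pielke-Landsea normalization, with invented example values:

```python
def normalize_loss(loss, inflation, pop_ratio, wealth_ratio):
    """Scale a historical disaster loss to present-day prices, population
    and wealth (Pielke-Landsea style normalization)."""
    return loss * inflation * pop_ratio * wealth_ratio

# Hypothetical example: a $1 billion loss in 1950, with prices 11x higher
# today, 3x the exposed population and 2.5x the real wealth per capita
print(normalize_loss(1.0e9, 11.0, 3.0, 2.5))  # ~$82.5 billion today
```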

On the influence of environmental politics in connecting weather extremes to global warming, Nordhaus has this to say:

… the perception among many audiences that these events centrally implicate anthropogenic warming has been driven by ... a sustained campaign by environmental advocates to move the proximity of climate catastrophe in the public imagination from the uncertain future into the present.

The campaign had its origins in a 2012 meeting of environmental advocates, litigators, climate scientists and others in La Jolla, California, convened by the Union of Concerned Scientists. The specific purpose of the gathering was “to develop a public narrative connecting extreme weather events that were already happening, and the damages they were causing, with climate change and the fossil fuel industry.”

This was clearly an attempt to mimic the 1960s campaign against smoking tobacco because of its link to lung cancer. However, the correlation between smoking and lung cancer is extraordinarily high, leaving no doubt about causation. The same cannot be said for any connection between extreme weather events and climate change.

Nevertheless, it was at the La Jolla meeting that the idea of reframing the attribution of extreme weather to climate change, as I discussed in my previous post, was born. Nordhaus discerns that a subsequent flurry of attribution reports, together with a fortuitous restructuring of the media at the same time:

… have given journalists license to ignore the enormous body of research and evidence on the long-term drivers of natural disasters and the impact that climate change has had on them.

It was but a short journey from there for the media to promote the notion, favored by “much of the environmental cognoscenti” as Nordhaus puts it, that “a climate catastrophe is now unfolding, and that it is demonstrable in every extreme weather event.”

The media have undergone a painful transformation in the last few decades, with the proliferation of cable news networks followed by the arrival of the Internet. The much broader marketplace has resulted in media outlets tailoring their content to the political values and ideological preferences of their audiences. This means, says Nordhaus, that sensationalism such as catastrophic climate news – especially news linking extreme weather to anthropogenic warming – plays a much larger role than before.

As I discussed in a 2023 post, the ever increasing hype in nearly all mainstream media coverage of weather extremes is a direct result of advocacy by well-heeled benefactors like the Rockefeller, Walton and Ford foundations. The Rockefeller Foundation, for example, has begun funding the hiring of climate reporters to “fight the climate crisis.”

A new coalition, founded in 2019, of more than 500 media outlets is dedicated to producing “more informed and urgent climate stories.” The CCN (Covering Climate Now) coalition includes three of the world’s largest news agencies — Reuters, Bloomberg and Agence France Presse – and claims to reach an audience of two billion.

Concludes Nordhaus:

[These new dynamics] are self-reinforcing and have led to the widespread perception among elite audiences that the climate is spinning out of control. New digital technology bombards us with spectacular footage of extreme weather events. … Catastrophist climate coverage generates clicks from elite audiences.

Next: El Niño and La Niña May Have Their Origins on the Sea Floor

The Deceptive Catastrophizing of Weather Extremes: (1) The Science

In these pages, I’ve written extensively about the lack of scientific evidence for any increase in extreme weather due to global warming. But I’ve said relatively little about the media’s exploitation of the mistaken belief that weather extremes are worsening be­cause of climate change.

A recent four-part essay addresses the latter issue, under the title “Did Exxon Make It Rain Today?”  The essay was penned by Ted Nordhaus, well-known environmentalist and director of the Breakthrough Institute in Berkeley, California, which he co-founded with Michael Shellenberger in 2007. Its authorship was a surprise to me, since the Breakthrough Institute generally supports the narrative of largely human-caused warming.

Nonetheless, Nordhaus’s thoughtful essay takes a mostly skeptical – and realistic – view of hype about weather extremes, stating that:

We know that anthropogenic warming can increase rainfall and storm surges from a hurricane, or make a heat wave hotter. But there is little evidence that warming could create a major storm, flood, drought, or heat wave where otherwise none would have occurred, …

Nordhaus goes on to make the insightful statement that “The main effect that climate change has on extreme weather and natural disasters … is at the margins.” By this, he means that a heat wave in which daily high temperatures for, say, a week reached 37 degrees Celsius (99 degrees Fahrenheit) or above in the absence of climate change would instead stay above perhaps 39 degrees Celsius (102 degrees Fahrenheit) with our present level of global warming.

His assertion is illustrated in the following, rather congested figure from the IPCC (Intergovernmental Panel on Climate Change)’s Sixth Assessment Report. The purple curve shows the average annual hottest daily maximum temperature on land, while the green and black curves indicate the land and global average annual mean temperature, respectively; temperatures are measured relative to their 1850–1900 means.

However, while global warming is making heat waves marginally hotter, Nordhaus says there is no evidence that extreme weather events are on the rise, as so frequently trumpeted by the mainstream media. Although climate change will make some weather events such as heavy rainfall more intense than they otherwise would be, the global area burned by wildfires has actually decreased and there has been no detectable global trend in river floods, nor meteorological drought, nor hurricanes.

Adds Nordhaus:

The main source of climate variability in the past, present, and future, in all places and with regard to virtually all climatic phenomena, is still overwhelmingly non-human: all the random oscillations in climatic extremes that occur in a highly complex climate system across all those highly diverse geographies and topographies.

The misconception that weather extremes are increasing when they are not has been amplified by attribution studies, which use a new statistical method and climate models to assign specific extremes to either natural variabil­ity or human causes. Such studies involve highly questionable methodology that has several shortcomings.

Even so, the media and some climate scientists have taken scientifically unjustifiable liberties with attribution analysis in order to link extreme events to climate change – such as attempting to quantify how much more likely global warming made the occurrence of a heat wave that resulted in high temperatures above 38 degrees Celsius (100 degrees Fahrenheit) for a period of five days in a specific location.

But, explains Nordhaus, that is not what an attribution study actually estimates. Rather, “it quantifies changes in the likelihood of the heat wave reaching the precise level of extremity that occurred.” In the hypothetical case above, the heat wave would have happened anyway in the absence of climate change, but it would have resulted in high temperatures above 37 degrees Celsius (99 degrees Fahrenheit) over five days instead of above 38 degrees.

The attribution method estimates the probability of a heat wave or other extreme event occurring that is incrementally hotter or more severe than the one that would have occurred without climate change, not the probability of the heat wave or other event occurring at all.
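
In code, the probability-ratio logic looks something like the following sketch, which fits GEV (generalized extreme value) distributions to synthetic annual-maximum temperatures; real attribution studies use observations plus large climate-model ensembles:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)

# Hypothetical annual maximum temperatures (deg C), with the "factual"
# (current) climate 1 deg C warmer than the "counterfactual" one
t_counterfactual = genextreme.rvs(c=-0.1, loc=36.0, scale=1.0,
                                  size=200, random_state=rng)
t_factual = t_counterfactual + 1.0

threshold = 38.0   # the observed heat wave's peak temperature

# Exceedance probability of the threshold in each climate
p0 = genextreme.sf(threshold, *genextreme.fit(t_counterfactual))
p1 = genextreme.sf(threshold, *genextreme.fit(t_factual))

# Attribution studies report how much more likely the event's observed
# severity became, not whether the event would have occurred at all
print(f"p0 = {p0:.3f}, p1 = {p1:.3f}, probability ratio = {p1 / p0:.1f}")
```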

Nonetheless, as we’ll see in the next post, the organization WWA (World Weather Attribution), founded by German climatologist Friederike Otto, has utilized this new methodology to rapidly produce science that does connect weather extremes to climate change – with the explicit goal of shaping news coverage. Coverage of climate-related disasters now routinely features WWA analysis, which is often employed to suggest that climate change is the cause of such events.

Next: The Deceptive Catastrophizing of Weather Extremes: (2) Economics and Politics

Extreme Weather in the Distant Past Was Just as Frequent and Intense as Today’s

In a recent series of blog posts, I showed how actual scientific data and reports in newspaper archives over the past century demonstrate clearly that the frequency and severity of extreme weather events have not increased during the last 100 years. But there’s also plenty of evidence of weather extremes comparable to today’s dating back centuries and even millennia.

The evidence consists largely of reconstructions based on proxies such as tree rings, sediment cores and leaf fossils, although some evidence is anecdotal. Reconstruction of historical hurricane patterns, for example, confirms what I noted in an earlier post, that past hurricanes were even more frequent and stronger than those today.

The figure below shows a proxy measurement for hurricane strength of landfalling tropical cyclones – the name for hurricanes down under – that struck the Chillagoe limestone region in northeastern Queensland, Australia between 1228 and 2003. The proxy was the ratio of 18O to 16O isotopic levels in carbonate cave stalagmites, a ratio which is highly depleted in tropical cyclone rain.

What is plotted here is the 18O/16O depletion curve, in parts per thousand (‰); the thick horizontal line at -2.50 ‰ denotes Category 3 or above events, which have a top wind speed of 178 km per hour (111 mph) or greater. It’s clear that far more (seven) major tropical cyclones impacted the Chillagoe region in the period from 1600 to 1800 than in any period since, at least until 2003. Indeed, the strongest cyclone in the whole record occurred during the 1600 to 1800 period, and only one major cyclone was recorded from 1800 to 2003.

Another reconstruction of past data is that of unprecedentedly long and devastating “megadroughts,” which have occurred in western North America and in Europe for thousands of years. The next figure depicts a reconstruction from tree ring proxies of the drought pattern in central Europe from 1000 to 2012, with observational data from 1901 to 2018 superimposed. Dryness is denoted by negative values, wetness by positive values.

The authors of the reconstruction point out that the droughts from 1400 to 1480 and from 1770 to 1840 were much longer and more severe than those of the 21st century. A reconstruction of megadroughts in California back to 800 was featured in a previous post.

An ancient example of a megadrought is the 7-year drought in Egypt approximately 4,700 years ago that resulted in widespread famine. The water level in the Nile River dropped so low that the river failed to flood adjacent farmlands as it normally did each year, drastically reducing crop yields. The event is recorded in a hieroglyphic inscription known as the Famine Stela, carved on a granite block on an island in the Nile.

At the other end of the wetness scale, a Christmas Eve flood in the Netherlands, Denmark and Germany in 1717 drowned over 13,000 people – many more than died in the much hyped Pakistan floods of 2022.

Although most tornadoes occur in the U.S., they have been documented in the UK and other countries for centuries. In 1577, North Yorkshire in England experienced a tornado of intensity T6 on the TORRO scale, which corresponds approximately to EF4 on the Fujita scale, with wind speeds of 259-299 km per hour (161-186 mph). The tornado destroyed cottages, trees, barns, hayricks and most of a church. EF4 tornadoes are relatively rare in the U.S.: of 1,000 recorded tornadoes from 1950 to 1953, just 46 were EF4.

Violent thunderstorms that spawn tornadoes have also been reported throughout history. An associated hailstorm which struck the Dutch town of Dordrecht in 1552 was so violent that residents “thought the Day of Judgement was coming” when hailstones weighing up to a few pounds fell on the town. A medieval depiction of the event is shown in the following figure.

Such historical storms make a mockery of the 2023 claim by a climate reporter that “Recent violent storms in Italy appear to be unprecedented for intensity, geographical extensions and damages to the community.” The thunderstorms in question produced hailstones the size of tennis balls, merely comparable to those that fell on Dordrecht centuries earlier. And the storms hardly compare with a hailstorm in India in 1888, which actually killed 246 people.

Next: Challenges to the CO2 Global Warming Hypothesis: (10) Global Warming Comes from Water Vapor, Not CO2

Antarctica Sending Mixed Climate Messages

Antarctica, the earth’s coldest and least-populated continent, is an enigma when it comes to global warming.

While the huge Antarctic ice sheet is known to be shedding ice around its edges, it may be growing in East Antarctica. Antarctic sea ice, after expanding slightly for at least 37 years, took a tumble in 2017 and reached a record low in 2023. And recent Antarctic temperatures have swung from record highs to record lows. No one is sure what’s going on.

The influence of global warming on Antarctica’s temperatures is uncertain. A 2021 study concluded that both East Antarctica and West Antarctica have cooled since the beginning of the satellite era in 1979, at rates of 0.70 degrees Celsius (1.3 degrees Fahrenheit) per decade and 0.42 degrees Celsius (0.76 degrees Fahrenheit) per decade, respectively. But over the same period, the Antarctic Peninsula (on the left in the adjacent figure) has warmed at a rate of 0.18 degrees Celsius (0.32 degrees Fahrenheit) per decade.

During the southern summer, two locations in East Antarctica recorded record low temperatures early this year. At the Concordia weather station, located at the 4 o’clock position from the South Pole, the mercury dropped to -51.2 degrees Celsius (-60.2 degrees Fahrenheit) on January 31, 2023. This marked the lowest January temperature recorded anywhere in Antarctica since the first meteorological observations there in 1956.

Two days earlier on January 29, 2023, the nearby Vostok station, about 400 km (250 miles) closer to the South Pole, registered a low temperature of -48.7 degrees Celsius (-55.7 degrees Fahrenheit), that location’s lowest January temperature since 1957. Vostok has the distinction of reporting the lowest temperature ever recorded in Antarctica, and also the world record low, of -89.2 degrees Celsius (-128.6 degrees Fahrenheit) on July 21, 1983.

Barely a year before, however, East Antarctica had experienced a heat wave, when the temperature soared to -10.1 degrees Celsius (13.8 degrees Fahrenheit) at the Concordia station on March 18, 2022. This balmy reading was the highest recorded hourly temperature at that weather station since its establishment in 1996, and 20 degrees Celsius (36 degrees Fahrenheit) above the previous March record high there. Remarkably, the temperature remained above the previous March record for three consecutive days, including nighttime.

Antarctic sea ice largely disappears during the southern summer and reaches its maximum extent in September, at the end of winter. The two figures below illustrate the winter maximum extent in 2023 (left) and the monthly variation of Antarctic sea ice extent this year from its March minimum to the September maximum (right).

The black curve on the right depicts the median extent from 1981 to 2010, while the dashed red and blue curves represent 2022 and 2023, respectively. It's clear that Antarctic sea ice in 2023 has lagged the median and even 2022 by a wide margin throughout the year. The decline in summer sea ice extent has now persisted for six years, as seen in the following figure which shows the average monthly extent since satellite measurements began, as an anomaly from the median value.

The overall trend from 1979 to 2023 is an insignificant 0.1% per decade relative to the 1981 to 2010 median. Yet a prolonged increase above the median occurred from 2008 to 2017, followed by the six-year decline since then. The current downward trend has sparked much debate and several possible reasons have been put forward, not all of which are linked to global warming. One analysis attributes the big losses of sea ice in 2017 and 2023 to extra strong El Niños.
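
For reference, a trend like that is just a least-squares slope fitted to the monthly anomaly series; a minimal sketch with stand-in data:

```python
import numpy as np

# Synthetic monthly sea ice extent anomalies, 1979-2023 (stand-in data;
# the real series comes from the satellite record)
rng = np.random.default_rng(3)
months = np.arange(1979, 2024, 1 / 12)
anomaly = 0.001 * (months - 2001) + 0.3 * rng.standard_normal(months.size)

# Least-squares slope in anomaly units per year; x10 gives per decade
slope, intercept = np.polyfit(months, anomaly, 1)
print(f"trend = {slope * 10:+.3f} per decade")
```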

Melting of the Antarctic ice sheet is currently causing sea levels to rise by 0.4 mm (16 thousandths of an inch) per year, contributing about 10% of the global total. But the ice loss is not uniform across the continent, as seen in the next figure showing changes in Antarctic ice sheet mass since 2002.

In the image on the right, light blue shades indicate ice gain while orange and red shades indicate ice loss. White denotes areas where there has been very little or no change in ice mass since 2002; gray areas are floating ice shelves whose mass change is not measured by this satellite method.

You can see that East Antarctica has experienced modest amounts of ice gain, which is due to warming-enhanced snowfall. Nevertheless, this gain has been offset by significant loss of ice in West Antarctica over the same period, largely from melting of glaciers – which is partly caused by active volcanoes underneath the continent. While the ice sheet mass declined at a fairly constant rate of 133 gigatonnes (147 gigatons) per year from 2002 to 2020, it appears that the total mass may have reached a minimum and is now on the rise again.

Despite the hullabaloo about its melting ice sheet and shrinking sea ice, what happens next in Antarctica continues to be a scientific mystery.

Next: Two Statistical Studies Attempt to Cast Doubt on the CO2 Narrative

No Evidence That Today’s El Niños Are Any Stronger than in the Past

The current exceptionally strong El Niño has revived discussion of a question which comes up whenever the phenomenon recurs every two to seven years: are stronger El Niños caused by global warming? While recent El Niño events suggest that in fact they are, a look at the historical record shows that even stronger El Niños occurred in the distant past.

El Niño is the warm phase of ENSO (the El Niño – Southern Oscillation), a natural ocean cycle that causes drastic temperature fluctuations and other climatic effects in tropical regions of the Pacific. Its effect on atmospheric temperatures is illustrated in the figure below. Warm spikes such as those in 1997-98, 2009-10, 2014-16 and 2023 are due to El Niño; cool spikes like those in 1999-2001 and 2008-09 are due to the cooler La Niña phase.

A slightly different temperature record, of selected sea surface temperatures in the El Niño region of the Pacific, averaged yearly from 1901 to 2017, is shown in the next figure from a 2019 study.

Here the baseline is the mean sea surface temperature over the 1901-2017 interval, and the black dashed line at 0.6 degrees Celsius is defined by the study authors as the threshold for an El Niño event. The different colors represent various regional types of El Niño; the gray bars mark warm years in which no El Niño developed.

This year’s gigantic spike in the tropospheric temperature to 0.93 degrees Celsius (1.6 degrees Fahrenheit) – a level that set alarm bells ringing – marks what is clearly the strongest El Niño by far in the satellite record. Comparison of the above two figures shows that it is also the strongest since 1901. So it does indeed appear that El Niños are becoming stronger as the globe warms, especially since 1960.

Nevertheless, such a conclusion is ill-considered as there is evidence from an earlier study that strong El Niños have been plentiful in the earth’s past.

As I described in a previous post, a team of German paleontologists established a complete record of El Niño events going back 20,000 years, by examining marine sediment cores drilled off the coast of Peru. The cores contain an El Niño signature in the form of tiny, fine-grained stone fragments, washed into the sea by multiple Peruvian rivers following floods in the country caused by heavy El Niño rainfall.

The research team classified the flood signal as very strong when the concentration of stone fragments, known as lithics, was more than two standard deviations above the centennial mean. The frequency of these very strong events over the last 12,000 years is illustrated in the next figure; the black and gray bars show the frequency as the number of 500- and 1,000-year floods, respectively. Radiocarbon dating of the sediment cores was used to establish the timeline.
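
The classification rule itself is easy to state in code; a sketch with fake data in place of the dated sediment record:

```python
import numpy as np
import pandas as pd

# Lithic concentrations sampled every decade over the last 20,000 years
rng = np.random.default_rng(4)
years = np.arange(-18_000, 2_000, 10)
lithics = pd.Series(rng.lognormal(0.0, 1.0, years.size), index=years)

# "Very strong" flood signals: more than two standard deviations above
# the centennial mean (10 decadal samples = 100 years)
mean = lithics.rolling(10, center=True).mean()
std = lithics.rolling(10, center=True).std()
very_strong = lithics[lithics > mean + 2 * std]
print(f"{very_strong.size} very strong flood events flagged")
```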

A more detailed record is presented in the following figure, showing the variation over 20,000 years of the sea surface temperature off Peru (top), the lithic concentration (bottom) and a proxy for lithic concentration (center). Sea surface temperatures were derived from chemical analysis of the marine sediment cores.

You can see that the lithic concentration and therefore El Niño strength were high around 2,000 and 10,000 years ago – approximately the same periods when the most devastating floods occurred. The figure also reveals the absence of strong El Niño activity from 5,500 to 7,500 years ago, a dry interval without any major Peruvian floods as reflected in the previous figure.

If you examine the lithic plots carefully, you can also see that the many strong El Niños approximately 2,000 and 10,000 years ago were several times stronger (note the logarithmic concentration scale) than current El Niños on the far left of the figure. Those two periods were warmer than today as well, being the Roman Warm Period and the Holocene Thermal Maximum, respectively.

So there is nothing remarkable about recent strong El Niños.

Despite this, the climate science community is still uncertain about the global warming question. The 2019 study described above found that since the 1970s, formation of El Niños has shifted from the eastern to the western Pacific, where ocean temperatures are higher. From this observation, the study authors concluded that future El Niños may intensify. However, they qualified their conclusion by stating that:

… the root causes of the observed background changes in the later part of the 20th century remain elusive … Natural variability may have added significant contributions to the recent warming.

Recently, an international team of 17 scientists has conducted a theoretical study of El Niños since 1901 using 43 climate models, most of which showed the same increase in El Niño strength since 1960 as the actual observations. But again, the researchers were unable to link this increase to global warming, declaring that:

Whether such changes are linked to anthropogenic warming, however, is largely unknown.

The researchers say that resolution of the question requires improved climate models and a better understanding of El Niño itself. Some climate models show El Niño becoming weaker in the future.

Next: Antarctica Sending Mixed Climate Messages

Estimates of Economic Losses from El Niños Are Far-fetched

A recent study makes the provocative claim that some of the most intense past El Niño events cost the global economy from $4 trillion to $6 trillion over the following years. That’s two orders of magnitude higher than previous estimates, but almost certainly wrong.

One reason for the enormous difference is that earlier estimates only examined the immediate economic toll, whereas the new study estimated cumulative losses over the five-year period after a warming El Niño. The study authors say, correctly, that the economic downturn triggered by this naturally occurring climate cycle can last that long.

However, even when this drawn-out effect is taken into account, the new study’s cost estimates are still one order of magnitude greater than other estimates in the scientific literature, such as those of the University of Colorado’s Roger Pielke Jr., who studies natural disasters. His estimated time series of total weather disaster losses as a proportion of global GDP from 1990 to 2020 is shown in the figure below.

The accounting used in the new study includes the “spatiotemporal heterogeneity of El Niño teleconnections,” teleconnections being links between weather phenomena at widely separated locations. Country-level teleconnections are based on correlations between temperature or rainfall in that country, and indexes commonly used to define El Niño and its cooling counterpart, La Niña. Teleconnections are strongest in the tropics and weaker in midlatitudes.

The researchers’ accounting procedure estimates total losses from the 1997-98 El Niño at a staggering $5.7 trillion by 2003, compared with a previous estimate of only $36 billion in the immediate aftermath of the event. For the earlier 1982-83 El Niño, the study estimates the total costs at $4.1 trillion by 1988. The calculated global distribution of GDP losses following both events is illustrated in the next figure.

To see how implausible these trillion-dollar estimates are, it’s only necessary to refer to Pielke’s graph above, which relies on official data from the insurance industry (including leading reinsurance company Munich Re) and the World Bank. His graph indicates that the peak loss from all 1998 weather disasters was 0.38% of global GDP for that year.

As El Niño was not the only disaster in 1998 – others include floods and hurricanes – this number represents an upper limit for instant El Niño losses. Using a value for global GDP in 1998 of $31,533 billion in current U.S. dollars, 0.38% was a maximum instant loss of $120 billion. Over a subsequent 5-year period, the maximum loss would have been 5 times as much, or $600 billion – assuming the same annual loss each year, which is undoubtedly an overestimate.
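
The arithmetic, spelled out (the figures are those quoted above):

```python
gdp_1998 = 31_533e9    # global GDP in 1998, current US dollars
loss_share = 0.0038    # peak weather disaster losses as a share of GDP

instant_loss = gdp_1998 * loss_share    # upper limit for 1998
five_year_loss = 5 * instant_loss       # same loss every year for 5 years
print(f"${instant_loss / 1e9:.0f} billion, ${five_year_loss / 1e9:.0f} billion")
# ~$120 billion instant and ~$600 billion over five years -- still an
# order of magnitude below the study's $5.7 trillion
```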

This inflated estimate of $600 billion is still an order of magnitude smaller than the study’s $5.7 trillion by 2003. In reality, the discrepancy is larger yet because the actual 5-year loss was likely much less than $600 billion as just discussed.

Two other observations about Pielke’s graph cast further doubt on the methodology of the researchers’ accounting procedure. First, the strongest El Niños in that 21-year period were those in 1997-98, 2009-10 and 2014-16. The graph does indeed show peaks in 1998-99 and in 2017, one year after a substantial El Niño – but not in 2011 following the 2009-10 event. This alone suggests that financial losses from El Niño are not as large as the researchers think.

Furthermore, there’s a strong peak in 2005, the largest in the 21 years of the graph, which doesn’t correspond to any substantial El Niño. The implication is that losses from other types of weather disaster can dominate losses from El Niño.

It’s important to get an accurate handle on economic losses from El Niño and other weather disasters, in case global warming exacerbates such events in the future – although, as I’ve written extensively, there’s no evidence to date that this is happening yet. Effects of El Niño include catastrophic flooding in the western Americas, flooding or episodic droughts in Australia, and coral bleaching.

The study authors stand by their research, however, estimating that the 2023 El Niño could hold back the global economy by $3 trillion over the next five years, a figure not included in their paper. But others are more skeptical. Climate economist Gary Yohe commented that “the enormous estimates cannot be explained simply by forward-looking accounting.” And Mike McPhaden, a senior scientist at NOAA (the U.S. National Oceanic and Atmospheric Administration) who was not involved in the research, called the study “provocative.”

Next: Targeting Farmers for Livestock Greenhouse Gas Emissions Is Misguided

Has the Mainstream Media Suddenly Become Honest in Climate Reporting?

Not so long ago I excoriated the mainstream media for misleading the public about perfectly normal extreme weather events. So ABC News’ August 14 article headlined “Why climate change can't be blamed for the Maui wildfires” came as a shock, a seeming media epiphany on the lack of connection between extreme weather and climate change.

But my amazement was short-lived. The next day the news network succumbed to a social media pressure campaign by climate activists, who persuaded ABC News to water down their headline by adding the word “entirely” after “blamed.” Back to the false narrative that today’s weather extremes are more common and more intense because of climate change.

Nevertheless, a majority of the scientific community, including many meteorologists and climate scientists, think that climate change was only a minor factor in kindling the deadly, tragic conflagration on Maui.

As ecologist Jim Steele has explained, the primary cause of the Maui disaster was dead grasses – invasive, nonnative species such as Guinea grass that have flourished in former Maui farmland and forest areas since pineapple and sugar cane plantations were abandoned in the 1980s. Following a wet spring this year which caused prolific grass growth, the superabundance of these grasses quickly became highly flammable in the ensuing dry season. The resulting tinderbox merely awaited a spark.

That spark came from the failure of Maui’s electrical utility to shut off power in the face of hurricane-force winds. Numerous instances of blazes triggered by live wires falling on desiccated vegetation or by malfunctioning electrical equipment have been reported. Just hours before the city of Lahaina was devastated by the fires, a power line was actually seen shedding sparks and igniting dry grass.

Exactly the same conditions set off the calamitous Camp Fire in California in 2018, which was ignited by a faulty electric transmission line in high winds, and demolished Paradise and several other towns. While the Camp Fire’s fuel included parched trees as well as dry grasses, it was almost as deadly as the 2023 Maui fires, killing 86 people. The utility company PG&E (Pacific Gas and Electric Company) admitted responsibility, and was forced to file for bankruptcy in 2019 because of potential lawsuits.

Despite the editorial softening of ABC News’ headline on the Maui wildfires, however, the article itself still contains a number of statements more honest than most penned by run-of-the-mill climate journalists. Four paragraphs into the story, this very surprising sentence appears:

Not only do “fire hurricanes” not exist, but climate change can't be blamed for the number of people who died in the wildfires.

The term “fire hurricanes” was used erroneously by Hawaii’s governor when commenting on the fires. Three paragraphs later, the story quotes UCLA (University of California, Los Angeles) climate scientist Daniel Swain as saying:

We should not look to the Maui wildfires as a poster child of the link to climate change.

Swain’s statement was immediately followed by another from Abby Frazier, a climatologist at Clark University in Worcester, Massachusetts, who commented that:

The main factor driving the fires involved the invasive grasses that cover huge parts of Hawaii, which are extremely flammable.

And there was more. All of which is unprecedented, to borrow a favorite word of climate alarmists, in climate reporting of the last few years that has routinely promoted the mistaken belief that weather extremes are worsening be­cause of climate change.

Is this the beginning of a new trend, or just an isolated exception?

Time will tell, but there are subtle signs that other mainstream newspapers and TV networks may be cutting back on their usual hysterical hype about extreme weather. One of the reasons could be the urging by the new Chair of the IPCC (Intergovernmental Panel on Climate Change) that the organization “stick to our fundamental values of following science and trying to avoid any siren voices that take us towards advocacy.” There are already a handful of media that endeavor to be honest and truly fact-based in their climate reporting, including the Washington Examiner and The Australian.

Opposing any move in this direction is a new coalition, founded in 2019, of more than 500 media outlets dedicated to producing “more informed and urgent climate stories.” The CCN (Covering Climate Now) coalition includes three of the world’s largest news agencies — Reuters, Bloomberg and Agence France Presse – and claims to reach an audience of two billion.

In addition to efforts of the CCN, the Rockefeller Foundation has begun funding the hiring of climate reporters to “fight the climate crisis.” Major beneficiaries of this program include the AP (Associated Press) and NPR (National Public Radio).

Leaving no doubts about the advocacy of the CCN agenda, its website mentions the activist term “climate emergency” multiple times and includes a page setting out:

Tips and examples to help journalists make the connection between extreme weather and climate change.

Interestingly enough, ABC News became a CCN member in 2021 – but has apparently had a change of heart since, judging from its Maui article.

Next: The Sun Can Explain 70% or More of Global Warming, Says New Study

Record Heat May Be from Natural Sources: El Niño and Water Vapor from 2022 Tonga Eruption

The record heat worldwide over the last few months – simultaneous heat waves in both the Northern and Southern Hemispheres, and abnormally warm oceans – has led to the hysterical declaration of “global boiling” by the UN Secretary General, the media and even some climate scientists. But a rational look at the data reveals that the cause may be natural sources, not human CO2.

The primary source is undoubtedly the warming El Niño ocean cycle, a natural event that recurs at irregular intervals from two to seven years. The last strong El Niño, which temporarily raised global temperatures by about 0.14 degrees Celsius (0.25 degrees Fahrenheit), was in 2016. For comparison, it takes a full decade for current global warming to increase temperatures by that much. 

However, on top of the 2023 El Niño has been an unexpected natural source of warming – water vapor in the upper atmosphere, resulting from a massive underwater volcanic eruption in the South Pacific kingdom of Tonga in January 2022.

Normally, erupting volcanoes cause significant global cooling, from shielding of sunlight by sulfate aerosol particles in the eruption plume that linger in the atmosphere. Following the 1991 eruption of Mount Pinatubo in the Philippines, for example, the global average temperature fell by 0.6 degrees Celsius (1.1 degrees Fahrenheit) for more than a year.

But the eruption of the Hunga Tonga–Hunga Haʻapai volcano did more than just launch a destructive tsunami and shoot a plume of ash, gas, and pulverized rock 55 kilometers (34 miles) into the sky. It also injected 146 megatonnes (161 megatons) of water vapor into the stratosphere (the layer of the atmosphere above the troposphere) like a geyser. Because it occurred only about 150 meters (500 feet) underwater, the eruption immediately superheated the shallow seawater above and converted it explosively into steam.

Although the excess water vapor – enough to fill more than 58,000 Olympic-size swimming pools – was originally localized to the South Pacific, it quickly diffused over the whole globe. According to a recent study by a group of atmospheric physicists at the University of Oxford and elsewhere, the eruption boosted the water vapor content of the stratosphere worldwide by as much as 10% to 15%. 
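
That swimming-pool comparison is easy to check. Below is a minimal back-of-the-envelope sketch; the 2,500 cubic meter pool volume is the nominal minimum for an Olympic pool and is my assumption, not a figure from the study.

```python
# Rough check: how many Olympic pools does 146 megatonnes of water fill?
# Assumes a nominal pool of 50 m x 25 m x 2 m = 2,500 cubic meters (minimum depth).
POOL_VOLUME_M3 = 50 * 25 * 2     # 2,500 cubic meters
WATER_DENSITY = 1_000            # kg per cubic meter
VAPOR_MASS_KG = 146e9            # 146 megatonnes = 1.46e11 kg

pool_mass_kg = POOL_VOLUME_M3 * WATER_DENSITY        # 2.5 million kg per pool
print(f"{VAPOR_MASS_KG / pool_mass_kg:,.0f} pools")  # ~58,400, matching "more than 58,000"
```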

Water vapor is a powerful greenhouse gas – in fact, the dominant greenhouse gas in the atmosphere. It is responsible for about 70% of the earth’s natural greenhouse effect, which keeps the planet about 33 degrees Celsius (59 degrees Fahrenheit) warmer than it would otherwise be – comfortable enough for living organisms to survive. So even 10-15% extra water vapor in the stratosphere makes the earth warmer.

The study authors estimated the additional warming from the Hunga Tonga eruption using a simple climate model combined with a widely available radiative transfer model. Their estimate was a maximum global warming of 0.035 degrees Celsius (0.063 degrees Fahrenheit) in the year following the eruption, diminishing over the next five years. The cooling effect of the small amount of sulfur dioxide (SO2) from the eruption was found to be minimal.

As I explained in an earlier post, any increase in ocean surface temperatures from the Hunga Tonga eruption would have been imperceptible, at a minuscule 14 billionths of a degree Celsius or less. That’s because the oceans, which cover 71% of the earth’s surface, are vast and can hold 1,000 times more heat than the atmosphere. Undersea volcanic eruptions can, however, cause localized marine heat waves, as I discussed in another post.
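
The 1,000-fold figure follows directly from the relative masses and specific heats of seawater and air. Here is a rough sketch using standard textbook round numbers; these values are my assumptions, not figures from the post.

```python
# Ratio of total ocean to total atmosphere heat capacity (textbook estimates).
OCEAN_MASS_KG = 1.4e21     # approximate total mass of the oceans
ATMOS_MASS_KG = 5.1e18     # approximate total mass of the atmosphere
CP_SEAWATER = 3_990        # J/(kg K), specific heat of seawater
CP_AIR = 1_005             # J/(kg K), specific heat of air at constant pressure

ratio = (OCEAN_MASS_KG * CP_SEAWATER) / (ATMOS_MASS_KG * CP_AIR)
print(f"ocean/atmosphere heat capacity: ~{ratio:,.0f}x")   # roughly 1,100x
```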

Although 0.035 degrees Celsius (0.063 degrees Fahrenheit) of warming from the Hunga Tonga eruption pales in comparison with 2016’s El Niño boost of 0.14 degrees Celsius (0.25 degrees Fahrenheit), it’s nevertheless more than double the average yearly increase of 0.014 degrees Celsius (0.025 degrees Fahrenheit) of global warming from other sources such as greenhouse gases.
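
Putting the three numbers quoted above side by side makes both comparisons explicit; this sketch uses only figures already stated in the text.

```python
# The two comparisons in the text, as simple arithmetic.
HUNGA_TONGA_C = 0.035    # estimated peak warming from the eruption (deg C)
EL_NINO_2016_C = 0.14    # temporary boost from the strong 2016 El Nino (deg C)
TREND_C_PER_YR = 0.014   # average yearly warming from other sources (deg C)

print(HUNGA_TONGA_C / TREND_C_PER_YR)    # 2.5 -> "more than double" a year's warming
print(EL_NINO_2016_C / TREND_C_PER_YR)   # 10.0 -> a full decade of gradual warming
```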

El Niño is the warm phase of ENSO (the El Niño – Southern Oscillation), a natural cycle that causes drastic temperature fluctuations and other climatic effects in tropical regions of the Pacific, as well as raising temperatures globally. Its effect on sea surface temperatures in the central Pacific is illustrated in the figure below. It can be seen that the strongest El Niños, such as those in 1998 and 2016, can make Pacific surface waters more than 2 degrees Celsius (3.6 degrees Fahrenheit) hotter for a whole year or so. 

Exactly how strong the present El Niño will be is unknown, but the heat waves of July suggest that this El Niño – augmented by the Hunga Tonga water vapor warming – may be super-strong. Satellite measurements showed that, in July 2023 alone, the temperature anomaly of the lower troposphere jumped from 0.38 degrees Celsius (0.68 degrees Fahrenheit) to 0.64 degrees Celsius (1.15 degrees Fahrenheit) above the 1991-2020 mean.

If this El Niño turns out to be no stronger than those in the past, then the source of the current “boiling” heat will remain a mystery. Perhaps the Hunga Tonga water vapor warming is larger than the Oxford group estimates. The source certainly isn’t any warming from human CO2, which raises global temperatures gradually, not abruptly as we’ve seen in 2023.

Next: Has the Mainstream Media Suddenly Become Honest in Climate Reporting?

No Evidence That Extreme Weather on the Rise: A Look at the Past - (6) Wildfires

This post on wildfires completes the present series on the history of weather extremes. The mistaken belief that weather extremes are intensifying because of climate change has only been magnified by the smoke recently wafting over the U.S. from Canadian wildfires, if you believe the apocalyptic proclamations of Prime Minister Trudeau, President Biden and the Mayor of New York.

But, just as with all the other examples of extreme weather presented in this series, there’s no scientific evidence that wildfires today are any more frequent or severe than anything experienced in the past. Although wildfires can be exacerbated by other weather extremes such as heat waves and drought, we’ve already seen that those extremes are not on the rise either.

Together with tornadoes, wildfires are probably the most fearsome of the weather extremes commonly blamed on global warming. Both can arrive with little or no warning, making it difficult or impossible to flee; both are often deadly and typically destroy hundreds of homes and other structures.

The worst wildfires occur in naturally dry climates such as those in Australia, California or Spain. One of the most devastating fire seasons in Australia was the summer of 1938-39, which saw bushfires (as they’re called down under) burning all summer, with ash from the fires falling as far away as New Zealand. The Black Friday bushfires of January 13, 1939 engulfed approximately 75% of the southeast state of Victoria, killing over 60 people, as described in the article from the Telegraph-Herald on the left below, and destroying 1,300 buildings. As reported:

In the town of Woodspoint alone, 21 men and two women were burned to death and 500 made destitute.  

Just a few days later, equally ferocious bushfires swept through the neighboring state of South Australia. The inferno reached the outskirts of the state capital, Adelaide, as documented in the excerpt from the Adelaide Chronicle newspaper on the right above.

Nationally, Australia’s most extensive bushfire season was the catastrophic series of fires in 1974-75 that consumed 117 million hectares (290 million acres), which is 15% of the land area of the whole continent. Fortunately, because nearly two thirds of the burned area was in remote parts of the Northern Territory and Western Australia, relatively little human loss was incurred – only six people died – though livestock and native animals such as lizards and red kangaroos suffered. An estimated 57,000 farm animals were killed.

The 1974-75 fires were fueled by abnormally heavy growth of lush grasses, following unprecedented rainfall in 1974. The fires began in the Barkly Tablelands region of Queensland, a scene from which is shown below. One of the other bushfires in New South Wales had a perimeter of more than 1,000 km (620 miles).

In the U.S., while the number of acres burned annually has gone up over the last 20 years or so, the present area consumed by wildfires is still only a small fraction of what it was back in the 1930s – just like the frequency and duration of heat waves, discussed in the preceding post. The western states, especially California, have a long history of disastrous wildfires dating back many centuries.

Typical of California conflagrations in the 1930s are the late-season fires around Los Angeles in November 1938, described in the following article from the New York Times. In one burned area 4,100 hectares (10,000 acres) in extent, hundreds of mountain and beach cabins were wiped out. Another wildfire burned on a 320-km (200-mile) front in the mountains. As chronicled in the piece, the captain of the local mountain fire patrol lamented that:

This is a major disaster, the worst forest fire in the history of Los Angeles County. Damage to watersheds is incalculable.

Northern California was incinerated too. The newspaper excerpts below from the Middlesboro Daily News and the New York Times report on wildfires that broke out on a 640-km (400-mile) front in the north of the state in 1936, and near San Francisco in 1945, respectively. The 1945 article documents no fewer than 6,500 separate blazes in California that year.

Pacific coast states further north were not spared either. Recorded in the following two newspaper excerpts are calamitous wildfires in Oregon in 1936 and Canada’s British Columbia in 1938; the articles are both from the New York Times. The 1936 Oregon fires, which covered an area of 160,000 hectares (400,000 acres), obliterated the village of Bandon in southwestern Oregon, while the 1938 fire near Vancouver torched an estimated 40,000 hectares (100,000 acres). Said a policeman in the aftermath of the Bandon inferno, in which as many as 15 villagers died:

If the wind changes, God help Coquille and Myrtle Point. They’ll go like Bandon did.

In 1937, a wildfire wreaked similar havoc in the U.S. state of Wyoming. At least 12 people died when the fire raged in a national forest close to Yellowstone National Park. As reported in the Newburgh News article on the left below:

The 12th body … was burned until even the bones were black beneath the skin.

and    A few bodies were nearly consumed.

The article on the right from the Adelaide Advertiser reports on yet more wildfires on the west coast, including northern California, in 1938.

As further evidence that modern-day wildfires are no worse than those of the past, the two figures below show the annual area burned by wildfires in Australia since 1905 (as a percentage of total land area, top), and in the U.S. since 1926 (bottom). Clearly, the area burned annually is in fact declining, despite hysterical claims to the contrary by the mainstream media. The same is true of other countries around the world.

Next: Hottest in 125,000 Years? Dishonest Claim Contradicts the Evidence

No Evidence That Extreme Weather on the Rise: A Look at the Past - (5) Heat Waves

Recent blistering hot spells in Texas, the Pacific northwest and Europe have only served to amplify the belief that heat waves are now more frequent and longer than in the past, due to climate change. But a careful look at the evidence reveals that this belief is mistaken, and that current heat waves are no more linked to global warming than any of the other weather extremes we’ve examined.

It’s true that a warming world is likely to make heat waves more common. By definition, heat waves are periods of abnormally hot weather, lasting from days to weeks. However, heat waves have been a regular feature of Earth’s climate for at least as long as recorded history, and heat waves of the last few decades pale in comparison to those of the 1930s – a period whose importance is frequently downplayed by the media and climate activists.

Those who dismiss the 1930s justify their position by claiming that the searing heat was confined to just 10 of the Great Plains states in the U.S. and caused by Dust Bowl drought. But this simply isn’t so. The evidence shows that the record heat of the 1930s – when the globe was also warming – extended throughout much of North America, as well as other countries such as France, India and Australia.

In the summer of 1930, two record-setting, back-to-back scorchers, each lasting 8 days, afflicted Washington, D.C. in late July and early August. During that time, 11 days in the capital city saw maximum temperatures above 38 degrees Celsius (100 degrees Fahrenheit). Nearby Harrisonburg, Virginia roasted in July and August also, experiencing its longest heat wave on record, lasting 23 days, with 10 days of 38 degrees Celsius (100 degrees Fahrenheit) or more.

In April the same year, an historic 6-day heat wave enveloped the whole eastern and part of the central U.S., as depicted in the figure below, which shows sample maximum temperatures for selected cities over that period. The accompanying excerpt from a New York Times article chronicles heat events in New York that July.

The hottest years of the 1930s heat waves in the U.S. were 1934 and 1936. Typical newspaper articles from those two extraordinarily hot years are set out below.

The Western Argus article on the left reports how the Dust Bowl state of Oklahoma endured an incredible 36 successive days in 1934 on which the mercury in the center of the state exceeded 38 degrees Celsius (100 degrees Fahrenheit). On August 7, the temperature there climbed to a sizzling 47 degrees Celsius (117 degrees Fahrenheit). And in the Midwest, Chicago and Detroit, both cities for which readings of 32 degrees Celsius (90 degrees Fahrenheit) are normally considered uncomfortably hot, registered over 40 degrees Celsius (104 degrees Fahrenheit) the same day.

It was worse in other cities. In the summer of 1934, Fort Smith, Arkansas recorded an unbelievable 53 consecutive days with maximum temperatures of 38 degrees Celsius (100 degrees Fahrenheit) or higher. Topeka, Kansas, had 47 days, Oklahoma City had 45 days and Columbia, Missouri had 34 days when the mercury reached or passed that level. Approximately 800 deaths were attributed to the widespread heat wave.

In a 13-day heat wave in July, 1936, the Canadian province of Ontario – well removed from the Great Plains where the Dust Bowl was concentrated – saw the thermometer soar above 44 degrees Celsius (111 degrees Fahrenheit) during the longest, deadliest Canadian heat wave on record. The Toronto Star article on the right above describes conditions during that heat wave in normally temperate Toronto, Ontario’s capital. As reported:

a great mass of the children of the poverty-stricken districts of Toronto are today experiencing some of the horrors of Dante’s Inferno.

and, in a headline,

Egg[s] Fried on Pavement – Crops Scorched and Highways Bulged

Portrayed in the next figure are two scenes from the 1936 U.S. heat wave; the one on the left shows children cooling off in New York City on July 9, while the one on the right shows ice being delivered to a crowd in Kansas City, Missouri in August.

Not only did farmers suffer and infrastructure wilt in the 1936 heat waves, but thousands died from heatstroke and other hot-weather ailments. By some estimates, over 5,000 excess deaths from the heat occurred that year in the U.S. and another 1,000 or more in Canada; a few details appear in the two newspaper articles on the right below, from the Argus-Press and Bend Bulletin, respectively.

The article on the left above from the Telegraph-Herald documents the effect of the July 1936 heat wave on the Midwest state of Iowa, which endured 12 successive days of sweltering heat. The article remarks that the 1936 heat wave topped the previous one in 1934, when the mercury reached or exceeded the 38 degrees Celsius (100 degrees Fahrenheit) mark for 8 consecutive days.

Heat waves lasting a week or longer in the 1930s were not confined to North America; the Southern Hemisphere baked too. Adelaide on Australia’s south coast experienced a heat wave at least 11 days long in 1930, and Perth on the west coast saw a 10-day spell in 1933, as described in the articles below from the Register News and Longreach Leader, respectively.

Not to be outdone, 1935 saw heat waves elsewhere in the world. The three adjacent excerpts from Australian newspapers – the Canberra Times, the Sydney Morning Herald and the Daily News – recorded heat waves that year in India, France and Italy, although there is no information about their duration. But 1935 wasn’t the only 1930s heat wave in France. In August 1930, Australian and New Zealand (and presumably French) newspapers recounted a French heat wave earlier that year, in which the temperature soared to a staggering 50 degrees Celsius (122 degrees Fahrenheit) in the Loire valley – besting a purported record of 46 degrees Celsius (115 degrees Fahrenheit) set in southern France in 2019.

Many more examples exist of the exceptionally hot 1930s all over the globe. Even with modern global warming, there’s nothing unusual about current heat waves, either in frequency or duration.

Next: No Evidence That Extreme Weather on the Rise: A Look at the Past - (6) Wildfires

No Evidence That Extreme Weather on the Rise: A Look at the Past - (4) Droughts

Severe droughts have been a continuing feature of the earth’s climate for millennia, but you wouldn’t know that from the brouhaha in the mainstream media over last summer’s drought in Europe. Not only was the European drought not unprecedented, but there have been numerous longer and drier droughts throughout history, including during the past century.

Because droughts typically last for years or even decades, their effects are far more catastrophic for human and animal life than those of floods, which usually recede in weeks or months. The consequences of drought include crop failure, starvation and mass migration. As with floods, droughts historically have been most common in Asian countries such as China and India.

One of the most devastating natural disasters in Chinese history was the drought and subsequent famine in northern China from 1928 to 1933. The drought left 3.7 million hectares (9.2 million acres) of arable land barren, leading to a lengthy famine exacerbated by civil war. An estimated 3 million people died of starvation, while Manchuria in the northeast took in 4 million refugees.

Typical scenes from the drought are shown in the photos below. The upper photo portrays three starving boys who had been abandoned by their families in 1928 and were fed by the military authorities. The lower photo shows famine victims in the city of Lanzhou.

The full duration of the drought was extensively covered by the New York Times. In 1929, a lengthy article reported that relief funds from an international commission could supply just one meal daily to:

 only 175,000 sufferers out of the 20 million now starving or undernourished.

and    missionaries report that cannibalism has commenced.

A 1933 article, an excerpt from which is included in the figure above, chronicled the continuing misery four years later:

Children were being killed to end their suffering and the women of families were being sold to obtain money to buy food for the other members, according to an official report.

Drought has frequently afflicted India too. One of the worst episodes was the twin droughts of 1965 and 1966-67, the latter in the eastern state of Bihar. Although just 2,350 Indians died in the 1966-67 drought, it was only unprecedented foreign food aid that prevented mass starvation. Nonetheless, famine and disease ravaged the state, and as many as 40 million people were reported affected.

Particularly hard hit were Bihar farmers, who struggled to keep their normally sturdy plow-pulling bullocks alive on a daily ration of 2.7 kilograms (6 pounds) of straw. As reported in the April 1967 New York Times article below, a U.S. cow at that time usually consumed over 11 kilograms (25 pounds) of straw a day. A total of 11 million farmers and 5 million laborers were effectively put out of work by the drought. Crops became an issue for starving farmers too, the same article stating that:

An official in Patna said confidently the other day that “the Indian farmer would rather die than eat his seed,” but in village after village farmers report that they ate their seed many weeks ago.

The harrowing photo on the lower right below, on permanent display at the Davis Museum in Wellesley College, Massachusetts, depicts a 45-year-old farmer and his cow dying of hunger in Bihar. Children suffered too, with many forced to subsist on a daily ration of four ounces of grain and an ounce of milk.

The U.S., like most countries, is not immune to drought either, especially in southern and southeastern states. Some of the worst droughts occurred in the Great Plains states and southern Canada during the Dust Bowl years of the 1930s.

But worse yet was a 7-year uninterrupted drought from 1950 to 1957, concentrated in Texas and Oklahoma but eventually including all the Four Corners states of Arizona, Utah, Colorado and New Mexico, as well as eastward states such as Missouri and Arkansas. For Texas, it was the most severe drought in recorded history. By the time the drought ended, 244 of Texas' 254 counties had been declared federal disaster areas.

Desperate ranchers resorted to burning cactus, removing the spines, and using it for cattle feed. Because of the lack of adequate rainfall, over 1,000 towns and cities in Texas had to ration the water supply. The city of Dallas opened centers where citizens could buy cartons of water from artesian wells for 50 cents a gallon, which was more than the cost of gasoline at the time.

Shown in the photo montage on the left below are various scenes from the Texas drought. The top photo is of a stranded boat on a dry lakebed, while the bottom photo illustrates once lakeside cabins on a shrinking Lake Waco; the middle photo shows a car being towed after becoming stuck in a parched riverbed. The newspaper articles on the right are from the West Australian in 1953 (“Four States In America Are Hit By Drought”) and the Montreal Gazette in 1957.

Reconstructions of ancient droughts using tree rings or pollen as a proxy reveal that historical droughts were even longer and more severe than those described here, many lasting for decades – so-called megadroughts. This can be seen in the figure below, which shows the pattern of dry and wet periods in drought-prone California over the past 1,200 years.

Next: No Evidence That Extreme Weather on the Rise: A Look at the Past - (5) Heat Waves

No Evidence That Extreme Weather on the Rise: A Look at the Past - (3) Floods

Devastating 2022 floods in Pakistan that affected 33 million people and damaged or destroyed over 2 million homes. A 2021 once-in-a-millennium flood in Zhengzhou, China that drowned passengers in a subway tunnel. Both events were trumpeted by the mainstream media as unmistakable signs that climate change has intensified the occurrence of weather extremes such as major floods, droughts, hurricanes, tornadoes and heat waves.

But a close look at history shows that it’s the popular narrative that is mistaken. Just as with hurricanes and tornadoes, floods today are no more common, deadly or disruptive than any of the thousands of floods in the past, despite heavier precipitation in a warming world.

Floods tend to kill more people than hurricanes or tornadoes, either by drowning or from subsequent famine, although part of the death toll from landfalling hurricanes is often drownings caused by the associated storm surge. Many of the world’s countries regularly experience flooding, but the most notable on a recurring basis are China, India, Pakistan and Japan.

China has a long history of major floods going back to the 19th century and before. One of the worst was the flooding of the Yangtze and other rivers in 1931, which inundated approximately 180,000 square kilometers (69,500 square miles) following July rainfall of over 610 mm (24 inches). That is a far greater flooded area than the 85,000 square kilometers (33,000 square miles) underwater in Pakistan’s terrible floods last year, and the 1931 floods affected far more people – as many as 53 million.

The extent of the watery invasion can be seen in the top two photos of the montage on the left; the bottom photo displays the havoc wrought in the city of Wuhan. A catastrophic dike failure near Wuhan left almost 800,000 people homeless and covered the city with several meters of water for months.

Chinese historians estimate the countrywide death toll at 422,000 from drowning alone; an additional 2 million people reportedly died from starvation or disease resulting from the floods, and much of the population was reduced to “eating tree bark, weeds, and earth.” Some sold their children to survive, while others resorted to cannibalism.

The disaster was widely reported. The Evening Independent wrote in August 1931:

Chinese reports … indicate that the flood is the greatest catastrophe the country has ever faced.

The same month, the Pittsburgh Post-Gazette, an extract from which is shown in the figure below, recorded how a United News correspondent witnessed:

thousands of starving and exhausted persons sitting motionless on roofs or in shallow water, calmly awaiting death.

The Yangtze River flooded again in 1935, killing 145,000 and leaving 3.6 million homeless, and also in 1954 when 30,000 lost their lives, as well as more recently. Several other Chinese rivers also flood regularly, especially in Sichuan province.

The 2022 Pakistan floods were the nation’s sixth since 1950 to kill over 1,000 people. Major floods afflicted the country in 1950, 1955, 1956, 1957, 1959, throughout the 1970s, and in more recent years. Typical flood scenes are shown in the photos below, together with a New York Times report of a major flood in 1973.

Monsoonal rains in 1950 led to flooding that killed an estimated 2,900 people across the country and caused the Ravi River in northeastern Pakistan to burst its banks; 10,000 villages were decimated and 900,000 people made homeless.

In 1973, one of Pakistan’s worst-ever floods followed intense rainfall of 325 mm (13 inches) in Punjab province (the name means “five rivers”), affecting more than 4.8 million people. The Indus River – of which the Ravi River is a tributary – became a swollen, raging torrent 32 km (20 miles) wide, sweeping away 300,000 houses and 70,000 cattle. A total of 474 people perished.

In an area heavily dependent on agriculture, 4.3 million bales of the cotton crop and hundreds of millions of dollars worth of stored wheat were lost. Villagers had to venture into floodwaters to cut fodder from the drowned and ruined crops in order to feed their livestock. Another article on the 1973 flood in the New York Times reported the plight of flood refugees:

In Sind, many farmers, peasants and shopkeepers fled to a hilltop railway station where they climbed onto trains for Karachi.

Monsoon rainfall of 580 mm (23 inches) just three years later in July and September of 1976, again mostly in Punjab province, caused a flood that killed 425 and affected another 1.7 million people. It’s worth noting here that the 1976 deluge far exceeded the 375 mm (15 inches) of rain preceding the massive 2022 flood, although both inundated approximately the same area. The 1976 flood affected a total of 18,400 villages.

A shorter yet deadly flood struck the coastal metropolis of Karachi the following year, 1977, after 210 mm (8 inches) of rain fell on the city in 12 hours. Despite its brief duration, the flood drowned 848 people and left 20,000 homeless. That same year, the onslaught of floods in the country prompted the establishment of a Federal Flood Commission.

The figure below shows the annual number of flood fatalities in Pakistan from 1950 to 2012, which includes drownings from cyclones as well as monsoonal rains.

Many other past major floods, in India, Japan, Europe and other countries, are recorded in the history books, all just as devastating as more recent ones such as those in Pakistan or British Columbia, Canada. Despite the media’s neglect of history, floods are not any worse today than before.

Next: No Evidence That Extreme Weather on the Rise: A Look at the Past - (4) Droughts

No Evidence That Extreme Weather on the Rise: A Look at the Past - (2) Tornadoes

After a flurry of tornadoes swarmed the central U.S. this March, the media were quick to fall into the trap of linking the surge to climate change, as often occurs with other forms of extreme weather. But there is no evidence that climate change is causing tornadoes to become more frequent and stronger, any more than hurricanes are increasing in strength and number, as I discussed in my previous post.

Indeed, there are ample examples of past tornadoes just as violent and deadly as today’s, or more so – examples conveniently ignored by believers in the narrative that weather extremes are on the rise.

Like hurricanes, tornadoes are categorized by wind speed, using the Fujita Scale running from EF0 to EF5 (F0 to F5 before 2007); EF5 tornadoes attain wind speeds up to 480 km per hour (300 mph). More terrifying than hurricanes because they often arrive without warning, tornadoes also have the awesome ability to hurl cars, structural debris, animals and even people through the air.
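
For readers unfamiliar with the scale, the sketch below maps a measured gust speed onto an EF rating. The thresholds are the approximate 3-second-gust values used operationally in the U.S.; they are my summary of the scale, not figures from this post, and EF5 is open-ended above 200 mph.

```python
# Approximate EF-scale thresholds (mph, 3-second gust); EF5 has no upper bound.
EF_SCALE = [(65, "EF0"), (86, "EF1"), (111, "EF2"),
            (136, "EF3"), (166, "EF4"), (201, "EF5")]

def ef_rating(gust_mph: float) -> str:
    """Return the EF category for a 3-second gust speed, or 'sub-EF0'."""
    rating = "sub-EF0"
    for threshold, label in EF_SCALE:
        if gust_mph >= threshold:
            rating = label
    return rating

print(ef_rating(300))   # EF5 -- the 480 km/h (300 mph) figure cited above
print(ef_rating(150))   # EF3 -- a "strong" tornado in this series' terminology
```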

In the U.S., tornadoes cause about 80 deaths and more than 1,500 injuries per year. The deadliest episode of all time in a single day was the “tri-state” outbreak in 1925, which killed over 700 people and resulted in the most damage from any tornado outbreak in U.S. history. The photo montage on the right shows one of the 12 or more tornadoes observed in Missouri, Illinois and Indiana approaching a farm (top); some of the 154 city blocks obliterated in Murphysboro, Illinois (middle); and the wreckage of Murphysboro’s Longfellow School, where 17 children were killed (bottom).

Unlike the narrow path of most tornadoes, the swath of destruction wrought by the main F5 tornado was up to 2.4 km (1.5 miles) wide. Amazingly, the ferocious storm persisted for a distance of 353 km (219 miles) in its 3½-hour lifetime. Together with smaller F2, F3 and F4 tornadoes, the F5 tri-state tornado destroyed or almost destroyed numerous towns. Another 33 schoolchildren died in De Soto, Illinois when their school collapsed. De Soto’s deputy sheriff was sucked into the funnel cloud, never to be seen again.

Newspapers of the day chronicled the devastation. United Press described how:

a populous, prosperous stretch of farms, villages and towns … suddenly turned into an inferno of destruction, fire, torture and death.

The Ellensburg Daily Record reported that bodies were carried as far as a mile by the force of the main tornado.

Over three successive days in May 1953, at least 10 different U.S. states were struck by an outbreak of more than 33 tornadoes, the deadliest being an F5 tornado that carved a path directly through the downtown area of Waco, Texas (photo immediately below). Believing falsely that their city was immune to tornadoes, officials had not insisted on construction of sturdy buildings, many of which collapsed almost immediately and buried their occupants.

The same day, a powerful F4 tornado hit the Texas city of San Angelo, causing catastrophic damage. As mentioned in the accompanying newspaper article below, an Associated Press correspondent reported “a scene of grotesque horror” in Waco and described how San Angelo’s business area was “strewn with kindling wood.”

June that year saw a sequence of powerful tornadoes wreak havoc across the Midwest and New England, the latter being well outside so-called Tornado Alley. An F5 tornado in Flint, Michigan (upper photo in figure below) and an F4 tornado in Worcester, Massachusetts (lower photo) each caused at least 90 deaths and extensive damage. The accompanying newspaper article, in Australia’s Brisbane Courier-Mail, mentions how cars were “whisked about like toys.”

Nature’s wrath was on display again in the most ferocious tornado outbreak ever recorded, spawning a total of 30 F4 or F5 tornadoes – the so-called Super Outbreak – in April 1974. A total of 148 tornadoes of all strengths struck 13 states in Tornado Alley and the Canadian province of Ontario over two days; their distribution and approximate path lengths are depicted in the left panel of the next figure.

The photos on the right illustrate the massive F5 tornado, the worst of the 148, that bore down on Xenia, Ohio (population 29,000, top) and the resulting damage (middle and bottom). The Xenia tornado was so powerful that it tossed freight trains on their sides, and even dropped a school bus onto a stage where students had been practicing just moments before. Wrote the Cincinnati Post of the devastation:

Half of Xenia is gone.

In Alabama, two F5 tornadoes, out of 75 that struck the state, hit the town of Tanner within 30 minutes; numerous homes, both brick and mobile, were chewed up or swept away. In Louisville, Kentucky, battered by an F4 tornado, a Navy veteran who lost his home lamented in the Louisville Times that:

only Pearl Harbor was worse.

In all, the Super Outbreak caused 335 fatalities and over 6,000 injuries.

The following figure shows that the annual number of strong tornadoes (EF3 or greater) in the U.S. has declined dramatically over the last 72 years. In fact, the average number of strong tornadoes annually from 1986 to 2017 – a period when the globe warmed by about 0.7 degrees Celsius (1.3 degrees Fahrenheit) – was 40% less than from 1954 to 1985, when warming was much less. That turns the “extreme weather caused by climate change” narrative on its head.
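
The 40% figure is a straightforward comparison of period averages. The sketch below shows the calculation on deliberately synthetic annual counts; real counts would come from a source such as NOAA’s Storm Prediction Center database, and nothing here is actual data.

```python
import random

# Synthetic stand-in for annual U.S. EF3+ tornado counts (illustrative only).
random.seed(0)
counts = {y: random.randint(40, 60) for y in range(1954, 1986)}        # earlier period
counts.update({y: random.randint(25, 35) for y in range(1986, 2018)})  # later period

early = [counts[y] for y in range(1954, 1986)]
late = [counts[y] for y in range(1986, 2018)]
mean_early, mean_late = sum(early) / len(early), sum(late) / len(late)

print(f"{100 * (mean_late - mean_early) / mean_early:+.0f}%")   # about -40% here
```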

Hat tip: Tony Heller @TonyClimate, who discovered the two newspaper articles above.

Next: No Evidence That Extreme Weather on the Rise: A Look at the Past - (3) Floods

No Evidence That Extreme Weather on the Rise: A Look at the Past - (1) Hurricanes

The popular but mistaken belief that today’s weather extremes are more common and more intense because of climate change is becoming deeply embedded in the public consciousness, thanks to a steady drumbeat of articles in the mainstream media and pronouncements by luminaries such as President Biden in the U.S., Pope Francis and the UN Secretary-General.

But the belief is wrong and more a perception than reality. An abundance of scientific evidence demonstrates that the frequency and severity of floods, droughts, hurricanes, tornadoes, heat waves and wildfires are not increasing, and may even be declining in some cases. That so many people think otherwise reflects an ignorance of, or an unwillingness to look at, our past climate. Collective memories of extreme weather are short-lived.  

In this and subsequent posts, I’ll present examples of extreme weather over the past century or so that matched or exceeded anything we’re experiencing in the present-day world. I’ll start with hurricanes.

The deadliest U.S. hurricane in recorded history struck Galveston, Texas in 1900, killing an estimated 8,000 to 12,000 people. Lacking the protective seawall that was built later, the thriving port was completely flattened (photo on right) by winds of 225 km per hour (140 mph) and a storm surge exceeding 4.6 meters (15 feet). With almost no automobiles, the hapless populace could flee only on foot or by horse and buggy. Reported the Nevada Daily Mail at the time:

Residents [were] crushed to death in crumbling buildings or drowned in the angry waters.

Hurricanes have been a fact of life for Americans in and around the Gulf of Mexico since Galveston and before. The death toll has come down over time with improvements in planning and engineering to safeguard structures, and the development of early warning systems to allow evacuation of threatened communities.

Nevertheless, the frequency of North Atlantic hurricanes has been essentially unchanged since 1851, as seen in the following figure. The apparent heightened hurricane activity over the last 20 years, particularly in 2005 and 2020, simply reflects improvements in observational capabilities since 1970 and is unlikely to be a true climate trend, say a team of hurricane experts.

As you can see, the incidence of major North Atlantic hurricanes in recent decades is no higher than that in the 1950s and 1960s. Ironically, the earth was actually cooling during that period, unlike today.

Of notable hurricanes during the active 1950s and 1960s, the deadliest was 1963’s Hurricane Flora, which cost nearly as many lives as the Galveston Hurricane. Flora didn’t strike the U.S. but made successive landfalls in Tobago, Haiti and Cuba (path shown in photo on left), reaching peak wind speeds of 320 km per hour (200 mph). In Haiti a record 1,450 mm (57 inches) of rain fell – comparable to what Hurricane Harvey dumped on Houston in 2017 – resulting in landslides which buried whole towns and destroyed crops. Even heavier rain, up to 2,550 mm (100 inches), devastated Cuba, and 50,000 people were evacuated from the island, according to the Sydney Morning Herald.

Hurricane Diane in 1955 walloped the North Carolina coast, then moved north through Virginia and Pennsylvania before ending its life as a tropical storm off the coast of New England. Although its winds had dropped from 190 km per hour (120 mph) to less than 55 km per hour (35 mph) by then, it spawned rainfall of 50 cm (20 inches) over a two-day period there, causing massive flooding and dam failures (photo to right). An estimated total of 200 people died. In North Carolina, Diane was but one of three hurricanes that struck the coast in just two successive months that year.

In 1960, Hurricane Donna moved through Florida with peak wind speeds of 285 km per hour (175 mph) after pummeling the Bahamas and Puerto Rico. A storm surge of up to 4 meters (13 feet) combined with heavy rainfall caused extensive flooding all across the peninsula (photo on left). On leaving Florida, Donna struck North Carolina, still as a Category 3 hurricane (top wind speed 180 km per hour or 110 mph), and finally Long Island and New England. NOAA (the U.S. National Oceanic and Atmospheric Administration) calls Donna “one of the all-time great hurricanes.”

Florida has been a favorite target of hurricanes for more than a century. The next figure depicts the frequency by decade of all Florida landfalling hurricanes and major hurricanes (Category 3, 4 or 5) since the 1850s. While major Florida hurricanes show no trend over 170 years, the trend in hurricanes overall is downward – even in a warming world.

Hurricane Camille in 1969 first made landfall in Cuba, leaving 20,000 people homeless. It then picked up speed, smashing into Mississippi as a Category 5 hurricane with wind speeds of approximately 300 km per hour (185 mph); the exact speed is unknown because the hurricane’s impact destroyed all measuring instruments. Camille generated waves in the Gulf of Mexico over 21 meters (70 feet) high, beaching two ships (photo on right), and caused the Mississippi River to flow backwards. A total of 257 people lost their lives, the Montreal Gazette reporting that workers found:

a ton of bodies … in trees, under roofs, in bushes, everywhere.

These are just a handful of hurricanes from our past, all as massive and deadly as last year’s Category 5 Hurricane Ian, which deluged Florida with a storm surge as high as Galveston’s and rainfall up to 685 mm (27 inches); 156 people were killed. Hurricanes are not on the rise today.

Next: No Evidence That Extreme Weather on the Rise: A Look at the Past - (2) Tornadoes

CRED’s “2022 Disasters in Numbers” Report Is a Disaster in Itself

The newly released 2022 annual disasters report from the highly acclaimed international agency CRED (Centre for Research on the Epidemiology of Disasters) is even more dishonest than its 2021 report, which I reviewed in a previous post. The 2022 report contains numerous statements that cannot be justified by the evidence, and demonstrates a misunderstanding of basic statistics that is puzzling for an organization that collects and analyzes data.

The most egregious statements involve the death toll from weather-related disasters. In one section of the report, CRED cites the well-known fact that mortality from natural disasters is 98% lower today than a century earlier. Although this is actually based on CRED’s EM-DAT (Emergency Events Database), the 2022 report gripes that “A more careful examination of mortality statistics indicates that this percentage may be misleading. Misinterpreting statistics could be harmful if it supports a discourse minimizing the importance of climate action.”

Laughably, it is CRED’s new report that is misleading and misinterprets statistics. This is evident from the following two figures from the report and the accompanying commentary. Figure A shows the global annual number of deaths per decade from natural disasters between 1900 and 2020, compiled from 12,223 records in the EM-DAT da­tabase, while the highly misleading Figure B shows the same data excluding the 50 deadliest disasters.

In Figure A, it is clear that disaster-related deaths have been falling since the 1920s and are now approaching zero. Nevertheless, the 2022 CRED report makes the weak argument that if the 1910s were taken as the comparison baseline instead of the 1920s, the 98% fall would be only 30%. But a close look at the data reveals a total of 1.27 million deaths recorded in the year 1900, yet almost none at all from 1901 to 1919 (fewer than 50,000 in most years) – suggesting some deficiency in data collection during that period.
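
The baseline sensitivity CRED is exploiting is simple arithmetic: the same recent death toll registers as a 98% decline against a deadly baseline decade but a much smaller one against a quiet decade. A sketch with hypothetical round numbers, not EM-DAT values:

```python
# Percent decline in decadal disaster deaths, relative to a chosen baseline decade.
def decline(baseline_deaths: float, recent_deaths: float) -> float:
    return 100 * (baseline_deaths - recent_deaths) / baseline_deaths

recent = 0.2e6                  # hypothetical recent decadal death toll
print(decline(10e6, recent))    # vs a deadly 1920s-style baseline: 98%
print(decline(0.3e6, recent))   # vs a quiet 1910s-style baseline: ~33%
```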

However, far more blatant is the report’s manipulation of the data in Figure A, by removing the 50 deadliest disasters from the dataset and then claiming that disaster deaths show “a positive mortality trend” over the last century, as depicted in Figure B.

Such subterfuge is both dishonest and statistically flawed. Some disasters are more deadly, some less; the only way to present any trend honestly is to include all the data. A fundamental tenet of the scientific method is that you can’t ignore any piece of evidence that doesn’t fit your narrative, simply because it’s inconvenient. And statistically, a disaster trend is a disaster trend, regardless of the disaster magnitude. If anything, the deadliest disasters – not the least deadly, as plotted in Figure B – carry the most weight in illustrating any trend in deaths.

While CRED sheepishly admits that Figure B “does not necessarily mean that we now have firm evidence that disaster-related mortality is increasing,” it gives away its true motive in presenting the figure by musing whether the fictitious positive trend is “supported by other drivers, e.g., population growth in exposed areas and climate change.”
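
It is easy to demonstrate how trimming the deadliest entries manufactures a rising trend. The sketch below builds a synthetic series in which a handful of early mega-disasters sit on top of a slowly rising background (a stand-in for population growth). The full series trends sharply downward; dropping the 50 deadliest years, as Figure B drops the 50 deadliest disasters, flips the fitted slope positive. None of these numbers are EM-DAT data.

```python
# Synthetic series: slowly rising background plus early mega-disasters.
years = list(range(1900, 2021))
deaths = [3_000 + 40 * (y - 1900) for y in years]   # background rises 40 deaths/year
for y in (1920, 1928, 1931, 1942, 1959, 1965):      # early mega-disaster years
    deaths[years.index(y)] += 2_000_000

def slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return num / sum((x - mx) ** 2 for x in xs)

print(slope(years, deaths))   # about -1,500 deaths/year: a strong decline

# Drop the 50 deadliest years, as Figure B drops the 50 deadliest disasters.
keep = sorted(range(len(years)), key=deaths.__getitem__)[:-50]
print(slope([years[i] for i in keep],
            [deaths[i] for i in keep]))   # +40 deaths/year: a manufactured rise
```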

The report goes on to argue that the main trend observed in Figure A is a result of five drought-induced famines, which each caused more than one million deaths from the 1920s to the 1960s. This statement is also deceptive, as can be seen from the figure below. The figure is similar to CRED’s Figure A and based on the same EM-DAT database, but breaks down the number of people killed in each decade into disaster category and corrects for population increase over time; the same data uncorrected for population increase show exactly the same features.

You can see that deaths from drought were dominant in the 1900s, 1920s, 1940s, 1960s and 1980s, but not the 1910s, 1930s, 1950s and 1970s. So CRED’s argument that the strong downward trend in Figure A is due to a large number of drought-induced famine deaths between 1920 and 1970 is nonsense.

Another section of the CRED report presents disaster death data for 2022, which is summarized in the following figure from the report. CRED comments that “the total death toll of 30,704 in 2022 was three times higher than in 2021 but below the 2002-2021 average of 60,955 deaths,” both of which are correct statements. However, the report then goes on to claim that the relatively high 2002-2021 average is “influenced by a few mega-disasters” and that “a more useful comparison [is that] the 2022 toll is almost twice the 2002-2021 median of 16,011 deaths.”

Again, these are meaningless comparisons that demonstrate an ignorance of statistics. The individual yearly death totals are unrelated – independent events in the language of statistics – so assigning any statistical significance to the 30,704 deaths in 2022 being lower than the long-term average, or higher than the long-term median, is invalid. CRED’s attempt to fit its data to a narrative emphasizing “the importance of climate action” falls flat.
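
A toy example shows why both comparisons are uninformative when a record contains a few mega-disaster years: the mean is dragged far above the median, so an unremarkable year can sit below one and above the other at the same time. The numbers below are synthetic, not EM-DAT data.

```python
import statistics

# Ten synthetic yearly death tolls, two of them mega-disaster years.
yearly_tolls = [15_000, 14_000, 18_000, 12_000, 16_000,
                17_000, 13_000, 19_000, 230_000, 300_000]

mean = statistics.mean(yearly_tolls)       # 65,400: inflated by the mega-years
median = statistics.median(yearly_tolls)   # 16,500: the "typical" year

this_year = 30_000                         # an unremarkable toll
print(this_year < mean, this_year > median)   # True True, like CRED's 2022 toll
```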

The statistical inadequacies of CRED’s comparisons are also made clear by examining the recent trend in CRED’s EM-DAT data. The next figure shows the yearly number of climate-related disasters globally from 2000 through 2022 by major category. The disasters are those in the climatological (droughts, glacial lake outbursts and wildfires), meteorological (storms, extreme temperatures and fog), and hydrological (floods, landslides and wave action) categories.

As can be seen, the total number of climate-related disasters exhibits a slowly declining trend since 2000 (red line), falling by 4% over 23 years.

Next: Challenges to the CO2 Global Warming Hypothesis: (8) The Antarctic Centennial Oscillation as the Source of Global Warming

No Evidence That Cold Extremes Are Becoming Less Frequent

The IPCC (Intergovernmental Panel on Climate Change), whose assessment reports are the voice of authority for climate science, errs badly in its Sixth Assessment Report (AR6) by claiming that cold weather extremes have become less frequent and severe. While that may be expected in a warming world, observational evidence shows that in fact, cold extremes are on the rise and may actually have become more severe.

Cold extremes include abnormally low temperatures, prolonged cold spells, unusually heavy snowfalls and longer winter seasons. That cold extremes are indeed increasing has been chronicled in detail by environmental scientist Madhav Khandekar in several recent research papers (here, here and here). While the emphasis of Khandekar’s publications has been on harsh winters in North America, he has catalogued cold extremes in South America, Europe and Asia as well.

The figure below shows the locations of 4,145 daily low-temperature records broken or tied in the northeastern U.S. during the ice-cold February of 2015; that year tied with 1904 for the coldest January to March period in the northeast, in records extending back to 1895. Of the 4,145 records, 3,573 were new record lows and the other 572 tied previous records.

Examples of cold extremes in recent years abound (see here and here). During the 2020 southern winter and northern summer, the Australian island state of Tasmania recorded its most frigid winter minimum ever, undercutting the previous low of −13.0 degrees Celsius (8.6 degrees Fahrenheit) by 1.2 degrees Celsius (2.2 degrees Fahrenheit); Norway endured its chilliest July in 50 years; neighboring Sweden shivered through its coldest summer since 1962; and Russia was also bone-chilling cold.

In the northern autumn of 2020, bitterly cold temperatures afflicted many communities in the U.S. and Canada. The northern U.S. state of Minnesota experienced its largest early-season snowstorm in recorded history, going back about 140 years. And in late December, the subfreezing polar vortex began to expand out of the Arctic.

Earlier in 2020, massive snowstorms covered much of Patagonia in more than 150 cm (60 inches) of snow, and buried alive at least 100,000 sheep and 5,000 cattle. Snowfalls not seen for decades occurred in other parts of South America, and in South Africa, southeastern Australia and New Zealand.

A 2021 example of a cold extreme was the North American cold wave in February, which brought record-breaking subfreezing temperatures to much of the central U.S., as well as Canada and northern Mexico. Texas experienced its coldest February in 43 years; the frigid conditions lasted several days and resulted in widespread power outages and damage to infrastructure. Curiously, the Texan deep freeze was ascribed to global warming by a team of climate scientists, who linked it to stretching of the Arctic polar vortex.

Other exceptional cold extremes in 2021 included the lowest average UK minimum temperature for April since 1922; record low temperatures in both Switzerland and Slovenia the same month; the coldest winter on record at the South Pole; and an all-time high April snowfall in Belgrade, in record books dating back to 1888.

In 2022, Australia and South America saw some of their coldest weather in a century. In May, Australia experienced the heaviest early-season mountain snow in more than 50 years. In June, Brisbane in normally temperate Queensland had its coldest start to winter since 1904. And in December, the state of Victoria set its coldest summer temperature record ever.

South America also suffered icy conditions in 2022, after an historically cold winter in 2021 which decimated crops. The same Antarctic cold front that froze Australia in May brought bone-numbing cold to northern Argentina, Paraguay and southern Brazil; Brazil’s capital Brasilia logged its lowest temperature in recorded history.

In December 2022, the U.S. set 126 monthly low-temperature records, while century-old low-temperature records tumbled in neighboring Canada. This followed all-time record-breaking snow in Japan, extra-heavy snow in the Himalayas which thwarted mountain climbers there, and heavy snow across China and South Korea.

Clearly, cold extremes are not going away or becoming less severe. And frequent statements by the mainstream media linking cold extremes to global warming are absurd, although such statements may fit the popular belief that global warming causes weather extremes in general. As I have explained in numerous blog posts and reports, this belief is mistaken and there is no evidence that weather extremes are worsening because of climate change.

Extreme weather conditions are produced by natural patterns in the climate system, not global warming. Khandekar links cold extremes to the North Atlantic and Pacific Decadal Oscillations, and possibly to solar activity.

Next: Global Warming from Food Production and Consumption Grossly Overestimated