Global Warming of 1.5°C (1)

Can we limit global warming to 1.5°C? What would it require? Would there be real advantages compared to letting earth’s climate warm more than that? These are the questions that the Intergovernmental Panel on Climate Change (IPCC) Special Report 15, Global Warming of 1.5°C seeks to answer. IPCC is, of course, discussing human-induced global warming, not natural climate change. I will discuss their answer to the first question in this post, and the other two questions in the next post.

Let’s start by understanding what we can expect from this report.

Figure 1 shows an image of something. It appears to be something white. It is too far away and out of focus to see more. Figure 2 moves a little closer. Now it is possible to see that it is a white rectangle with some gray smudges on it.

Figure 3 moves a little closer. You can’t see the whole of the white rectangle, but the gray smudges can now be seen to be a word: “Titanic.” But the writing is still out of focus. Figure 4 moves a little closer still, and the writing is now in clear focus.

Over the years, the IPCC has issued a series of reports on global warming/climate change. Over that time, the basic understanding of global warming has not changed. But as we have gotten closer, it has come more clearly into focus, and it has become possible to make out details that we couldn’t see before. We still don’t have global warming in full focus; we’re not to Figure 4 yet. But it has become possible to ask specific questions and give answers that, while not yet fully specific and detailed, are getting there. So, Global Warming of 1.5°C doesn’t contain radical new understandings. Rather, it is more detailed, and that is useful.

By the way, I chose the word “Titanic” on purpose. That ship was not built to survive a catastrophic iceberg strike, substandard steel may have been used to construct her, and she didn’t have enough lifeboats for all of the passengers. The captain denied the risk and sailed through the night into an iceberg field. By the time the iceberg was spotted dead ahead in the middle of the night, it was too late to turn and too late to stop. By that point, nothing they could do could change their fate: the Titanic was going to hit that berg and sink, and more than 1,500 people were going to die.

Did I really write that? That’s really catastrophic, apocalyptic even! According to the IPCC report, we are very, very close to being like the Titanic. It may already be too late, but perhaps if we try really, really hard, it isn’t. Read on.

GMST 1850-Present

Figure 5. Global Mean Surface Temperature 1850-Present. Source: IPCC 2018.

Human activity has already caused our planet’s global mean surface air temperature (GMST) to warm approximately 1°C (1.8°F) since pre-industrial times, according to the report. GMST is increasing by about 0.2°C (0.36°F) per decade, and the rate of warming appears to be increasing. Figure 5 shows the temperature trend. The gray line shows the monthly temperatures in the datasets. The orange line shows the change forced by both humans and nature combined, while the yellow line shows the change forced by human activities alone (it is hard to see because it is embedded in the yellow band; look closely).

GMST is an average across the globe. Some regions have warmed more than others. For instance, the temperature over land has increased more than the temperature over water; 40-60% of the human population lives in regions that have already warmed 1.5°C (2.7°F) or more. Thus, a 1.5°C increase in GMST implies a larger than 1.5°C increase over land, with a smaller increase over the ocean.

Past emissions (through 2017) are probably not sufficient to cause GMST to increase more than 1.5°C. Therefore, warming limited to 1.5°C is theoretically possible if human emissions are immediately reduced. Two ways in which the 1.5°C limit could be achieved are discussed in the report. One reduces GHG emissions sufficiently quickly so that the 1.5°C limit is never exceeded. The other would allow a small overshoot of the limit, with temperature then being brought back within the limit by removing carbon dioxide from the atmosphere.

Reduction Pathways

Figure 6. GHG Emission Reduction Pathways. Source: IPCC 2018.

To limit the increase of GMST to 1.5°C with no overshoot would require GHG emissions of no more than 25-30 billion metric tons of CO2e per year in 2030 (compared to estimates that under business as usual they will be 50-58 billion metric tons per year). And GHG EMISSIONS WOULD NEED TO DECLINE TO NET ZERO BY 2050. That’s right – no net GHG emissions by 2050. Figure 6 shows the reductions over time in emissions of CO2, methane, black carbon (soot), and nitrous oxide consistent with a 1.5°C increase in GMST.
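To get a feel for what that timetable implies, one can interpolate linearly between the benchmarks. The sketch below does so in Python. The 2017 starting value of about 52 billion metric tons CO2e per year is my assumption for illustration; only the 2030 range and the 2050 net-zero date come from the report.

```python
def linear_pathway(start_year, start_emissions, end_year, end_emissions):
    """Interpolate annual emissions (Gt CO2e/yr) linearly between two years."""
    span = end_year - start_year
    return {
        y: start_emissions + (end_emissions - start_emissions) * (y - start_year) / span
        for y in range(start_year, end_year + 1)
    }

# Segment 1: from ~52 Gt in 2017 (assumed) down to 27.5 Gt, the midpoint
# of the report's 25-30 Gt range for 2030.
to_2030 = linear_pathway(2017, 52.0, 2030, 27.5)
# Segment 2: from 2030 down to net zero by 2050.
to_2050 = linear_pathway(2030, 27.5, 2050, 0.0)

print(round(to_2030[2030], 1))  # 27.5
print(round(to_2050[2050], 1))  # 0.0
```

Even this crude straight-line version requires cutting emissions by roughly 2 Gt every year, without pause, for over three decades.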

The no-net-emissions requirement could be met by two strategies. The first would involve reducing emissions themselves; reductions of this magnitude would require near-total transformations of our energy, transportation, and agricultural systems. The second would involve widely deploying carbon dioxide removal mechanisms. The only currently proven mechanism for removing carbon dioxide from the atmosphere is revegetation, especially reforestation. Attempts to add carbon capture and sequestration to power plants have not yet proven viable.

The limits agreed to in the Paris Climate Agreement are not sufficient to limit the increase in GMST to 1.5°C.

In the next post, I will look at what the report has to say about strategies to meet the limit, and what the costs and benefits might be.


Intergovernmental Panel on Climate Change. 2018. Global Warming of 1.5°C (Draft). Downloaded 11/24/2018 from

Back in the Saddle

The last original post I made on this blog was September 27. I want to thank my readers for being patient with me while I struggled with my wife’s sudden illness and death. It has been difficult, and there has been an unbelievable amount to do. Hopefully I can now pick the blog back up and resume my once-weekly schedule of posts on Thursday mornings.

A couple of very important reports have come out since October 1. The first was the IPCC Special Report 15, Global Warming of 1.5°C. This report had been published online a few months previously, but had not completed the review process by the involved nations. It now has, and it is available for distribution and quotation. It still has some editing and layout revisions to undergo before the final copy is available, but the content has been approved.

This document is different from the IPCC assessment reports in that it seeks to answer a few basic questions: is it possible to limit the increase in global mean surface air temperature (GMST) to 1.5°C? What would be required to keep the increase within that limit? And what would be the benefits and costs of doing so compared to a larger increase? It’s an important document, and the news is not encouraging.

The second is the Fourth National Climate Assessment published by the U.S. Global Change Research Program. This is the official U.S. government report on climate change, published every two years. This year, the authors felt that their understanding of how climate change would play out has increased sufficiently to allow them to address its economic effect.

I’m not in love with economic analyses of environmental issues. Sometimes it is necessary to do uneconomic things. If all the values and services provided by the environment could be given an economic value that reflected their true worth, such an analysis might be possible. But we haven’t achieved that level of wisdom yet, so such analyses can be misleading. Still, this is an official government publication, and it is used by policymakers. Cost vs. benefit is an important consideration in making national policy, so it is important to have estimates of just how much climate change is going to impact the economy if we let it rampage unchecked. Again, the news is not good.

I will start with the IPCC report in my next post, and then move on to the National Climate Assessment.

Thanks for sticking with me.

The Election of 2016

I’m unable to write new posts for a couple of weeks. This is a rerun of one I published on 11/11/16, just after the election.

I expect Donald Trump to be a poor president. But it isn’t the end of the world, at least not yet.

It is time I addressed the elections of 2016 in this blog. Politics is far outside my focus here, yet the election has been a momentous event, and I don’t think I should ignore it completely.

I did not support Mr. Trump, nor did I support the slate of Republicans who ran the table in Missouri’s statewide elections. The environment is not a high priority for them, and some are overtly hostile to environmental concerns. I fear that their election is a mistake, and we will pay for it for many decades. Yet, the future is not set in stone, and we don’t yet know whether their actions in office will match their terrible rhetoric.

With that said, do not forget that this country has weathered many storms. The situation is not yet as dire as it was in 1812, when a foreign power invaded and burned our capital. Nor is it as dire as in 1860, when the South seceded from the Union. It is not as dire as it was in 1933, when unemployment was about 25%. It is not as dire as it was in the 1940s, when World War II broke out. “Keep calm and carry on.” It was good advice then, and it is good advice now.

We need to accept that Mr. Trump will be our next president, then we need to start moving into the future. There are a couple of things we need to do. We need to make our voices heard, and we need to be loud and clear about what is acceptable and what is not.

  • Racism and bigotry are not okay, NEVER, EVER.
  • Attacks on free speech are not okay, NEVER, EVER.
  • Attacks on facts and science as the basis of knowledge are not okay, NEVER, EVER.
  • Bullying and abuse are not okay, NEVER, EVER.
  • Ignoring or being hostile to the environment is not okay, NEVER, EVER.
  • (Today I would add that attacking the rule of law and the institutions that uphold it is not okay, NEVER, EVER.)

At the same time, the election results suggest that the status quo is not working for many people. When I was young, it was a fine thing to be liberal and progressive. But today in Missouri, the word “liberal” has become an epithet, and political ads hurl the word at opponents like mud. Did liberals earn that scorn? How? I don’t see much future in liberal policies until we are willing to look at what we have done that has offended so many.

I don’t have much hope that Mr. Trump is going to be a good president, but perhaps he will surprise us. Some who were supposed to become great presidents didn’t (Herbert Hoover), and some who were supposed to be lousy presidents became great (Abraham Lincoln). We’ll have to wait and see. Until then, here in St. Louis the sun is shining and it is a spectacular fall day. The world hasn’t failed yet. Keep calm and carry on.

Since publishing the above post, Mr. Trump has, in my opinion, proved himself to be one of the worst presidents in American history. He has had some policy successes, but in the process he has done huge damage to our country, both at home and abroad. It appears to me that he is intentionally trying to destroy the Environmental Protection Agency, or to make it a puppet of the corporations it is supposed to regulate. Worst of all, he has attacked and deliberately weakened the institutions that uphold the rule of law in America. This is damage that will be hard to repair, and we will pay for it for decades to come.

The Opiate of Public Opinion

I’m unable to write new posts for a couple of weeks. This is a rerun of one I published 11/6/2016.


On 11/2/2016, the New York Times published an article by Farhad Manjoo on How the Internet Is Loosening Our Grip on the Truth. Manjoo wrote that the Internet, instead of delivering us into a “marketplace of ideas,” has led us into echo chambers dominated by preconceptions and biases. In those echo chambers, we hear only our preexisting beliefs endlessly repeated. Facts get evaluated through the lens of belief, and if they disagree with belief, they get ignored or denied. While not a new problem, he believes that the Internet is magnifying it. If he is right, it represents a serious problem for our democracy, which relies on the judgement of an informed public.

Psychologists and sociologists have known for a long time that we tend to see the world in ways that confirm preexisting beliefs, and they call it “confirmation bias.” There are many theories about why. One of my favorite explanations comes from a study by Drew Westen and his colleagues during the 2004 presidential election. He took equal numbers of Republicans and Democrats and showed them self-contradictory statements by both Bush and Kerry. It took the subjects a moment to process what they had seen, but to nobody’s surprise, the Republicans explained away Bush’s contradiction and criticized Kerry for his. The Democrats did the inverse: they explained away Kerry’s contradiction but criticized Bush for his. Thus, the subjects seemed to twist what they had seen, almost as if in a kaleidoscope, until it matched their preexisting beliefs.

So far, nothing new, just one of many demonstrations of confirmation bias. However, Westen added a wrinkle: while all this was going on, he had his subjects’ heads in a functional magnetic resonance imaging machine (fMRI). An fMRI measures blood flow in regions of the brain. Blood flow to a region increases when that region is being used. Thus, an fMRI provides a picture of which regions of the brain are active. The regions “light up” with color on the fMRI display.

Westen expected that while his subjects were busy processing, brain regions associated with thinking would light up. And when they achieved a resolution and spoke their opinion, then the brain would go quiet, at rest. But that is not what he found. He found that while his subjects were thinking, the regions of the brain that lit up were the regions associated with emotional pain. And when they spoke their opinion, the brain didn’t go quiet. Instead, the pleasure centers of the brain lit up. These are the regions of the brain that light up when a person takes a dose of a narcotic.

Thus, Westen’s conclusion was that we experience facts that contradict our preconceived ideas as pain. And when we twist reality to conform to our ideas, the pain goes away, and we get a “hit” of pleasure like taking a narcotic.

No wonder we do it. And those echo chambers that Manjoo mentioned? That’s where we go, like opium dens, to get hit after hit of our favorite narcotic.

That is why Mogreenstats focuses on large-scale studies. I diverge into other stuff from time to time, but mostly I focus on statistics about the environment. I see it as an antidote to the propaganda one hears about the environment, whichever echo chamber it comes from. No, these studies aren’t perfect. But I see them as being as close as one can get to actual facts. If we don’t base our public policy on facts, it is not likely to be effective.


Westen, Drew, Pavel Blagov, Keith Harenski, Clint Kilts, and Stephan Hamann. 2006. “Neural Bases of Motivated Reasoning: An fMRI Study of Emotional Constraints on Partisan Political Judgement in the 2004 U.S. Presidential Election.” Journal of Cognitive Neuroscience 18(11), pp. 1947-1958. Viewed online 11/2/2016.

Wikipedia. Drew Westen. Viewed online 11/2/2016 at

Vehicle Miles Driven at a New All-Time High

From time to time, I report the number of vehicle miles traveled in the USA. I do it because vehicle travel is one of the largest contributors to air pollution and greenhouse gas emissions. You can try to make vehicles cleaner, but the bottom line is that they require energy to operate, and that energy mostly comes from fossil fuel. The more miles traveled, the more sulfur dioxide, nitrogen oxides, particulate matter, ozone, and carbon dioxide get emitted into the atmosphere.

Figure 1. Data source: Department of Transportation.

For road planning purposes, the Department of Transportation estimates the number of vehicle miles traveled in the USA, and breaks it down by state. Figure 1 shows the total vehicle miles traveled in the United States from 1980 to 2014, the most recent year for which data is available. The blue portion of the columns represents rural miles traveled, while the red portion represents urban miles traveled.

Total miles traveled increased rather steadily from 1980 until 2007, when the Great Recession began. They bottomed in 2009, and have resumed increasing unsteadily since then. They reached a new all-time high in 2014 of 3,025,656,000,000 miles. That’s 3 trillion miles. No wonder we create a lot of pollution and carbon dioxide!

More than 2/3 of the miles are driven in urban areas. The slope of the increase since 2009 does not appear to be as steep as before 2007; however, the country had not fully recovered from the recession in 2014, and the rate may have increased since then. We cannot know from this data.

Figure 2. Data source: Department of Transportation.

Figure 2 shows the data for Missouri. The data series only goes back to 1995, and data for 1996 is missing. The miles driven in Missouri increased steadily until 2004, several years before the Great Recession. Since then it has slowly meandered up and down, peaking slightly higher at 41,901,000 miles in 2014. About 59% of the miles are driven in urban areas.

Of course, the population of the country is steadily growing, and so is the economy. More people driving and more economic activity naturally translate into more driving, so how are vehicle miles driven changing on a per capita or per dollar of gross domestic product basis?



Figure 3. Data source: Department of Transportation, Census Bureau.

Figure 3 shows the per capita data. The red columns represent the USA, and the blue columns represent Missouri. The data goes back to 1995, but 1996 is missing. For both the USA and Missouri, vehicle miles driven have mostly meandered sideways. The number of miles driven in 2014 is about 3% higher than in 1995 for the USA as a whole, and 5% higher for Missouri.






Figure 4. Data source: Department of Transportation, Bureau of Economic Analysis, Federal Reserve Bank of St. Louis.

Figure 4 shows the number of vehicle miles traveled per dollar of gross domestic product. Red columns represent the USA and blue columns represent Missouri. Data for the USA extends back to 1995 (with 1996 missing), but data for Missouri only extends back to 1997. In both cases there is a clear trend: the number of vehicle miles traveled per dollar of gross domestic product has decreased. Nationwide, it has fallen from about 0.32 miles per dollar to about 0.17. In Missouri, it has fallen from about 0.39 miles per dollar to about 0.25.

Most economists view increases in vehicle miles traveled as a sign of economic growth, and they welcome it. But in addition, it is a cause of increased pollution. I have discussed over and over in this blog the problems created by air pollution and climate change. We simply must find a way to create economic well-being without sickening the planet, and ourselves along with it. We have not yet figured that out.


For VMT: Vehicle Miles Traveled in the USA, 1980-2011. Office of Highway Policy Information, Federal Highway Administration.

For population: as I often find when using data from the Census Bureau, to get time series for the dates and locations I want, I often have to mash up data from several different publications. For this post I used:

American FactFinder, U.S. Census Bureau.

Estimated Missouri Population, 1960-1990. Census Bureau, Historical Data.

Monthly Estimates of the United States Population: April 1, 1980 to July 1, 1999, with Short-Term Projections to November 1, 2000.

Table 1. Intercensal Estimates of the Resident Population for the United States, Regions, States, and Puerto Rico: April 1, 2000 to July 1, 2010.

Table 1. Annual Estimates of the Population for the United States, Regions, States, and Puerto Rico: April 1, 2010 to July 1, 2011.

For Missouri GDP data: Total Gross Domestic Product by State for Missouri. Economic Research, Federal Reserve Bank of St. Louis.

For United States GDP: National Income and Product Accounts Tables, Bureau of Economic Analysis. This is a data portal. Choose “Section 1 – Domestic Product and Income.” From the list that pops up, choose

Hurricane Climatology

Recent weeks have been very active for tropical storms. At one time in the Atlantic, there were 3 hurricanes (Florence, Helene, and Isaac) and 2 tropical storms (Gordon and Joyce). In the Eastern Pacific, August saw 4 tropical cyclones active at the same time (Hector, Kristy, John, and Ileana), and in September, Hurricane Hector and a new storm, Hurricane Olivia, impacted Hawaii. In the Western Pacific there have been 28 named storms. One of them, Super-Typhoon Mangkhut, caused extensive damage in the Philippines.

Is this normal, or are tropical storms getting worse?

Tropical cyclones are rotating, organized storm systems that originate over tropical or subtropical waters. They have a center of low pressure around which they rotate that can develop into an “eye” if the storm is sufficiently intense and well organized. Their thunderstorms tend to be organized into bands that spiral out from the center. Even outside the thunderstorms, however, they have high sustained winds. In the Northern Hemisphere, they rotate counter-clockwise. In the Southern Hemisphere, they rotate clockwise.

Tropical cyclones are classified by the speed of their winds:

  • Tropical Depressions have maximum sustained winds of 38 mph or less.
  • Tropical Storms have maximum sustained winds of 39 to 73 mph.
  • Hurricanes have maximum sustained winds of 74 mph or higher. In the Western Pacific, hurricanes are called typhoons. In the Indian Ocean and Southern Pacific, they are called cyclones.
  • Major Hurricanes have maximum sustained winds of 111 mph or higher.

Hurricanes are further classified according to the Saffir-Simpson Hurricane Wind Scale:

  • Category 1: winds 74-95 mph, capable of causing damage even to well-constructed wood frame homes.
  • Category 2: winds 96-110 mph, capable of causing damage to roofs and siding, blowing shallowly rooted trees over, and causing power loss.
  • Category 3 (major hurricane): winds 111-129 mph, capable of causing structural damage to even well-built homes, snapping or uprooting many trees, and causing power outages that last for days.
  • Category 4 (major hurricane): winds 130-156 mph, capable of causing catastrophic damage, ripping whole roofs off houses or blowing down walls. Regions impacted may be uninhabitable for weeks or months.
  • Category 5 (major hurricane): winds 157 mph or higher, capable of destroying most houses, blowing down most trees, cutting off access to whole regions, and making whole regions uninhabitable for weeks or months.

Figure 1. Source: National Oceanic and Atmospheric Administration

Tropical cyclones generally originate near the equator. Figure 1 shows the major regions where tropical cyclones tend to form, and their typical paths. I’m not sure what sends so many of them up the U.S. coast, rather than coming ashore. Perhaps it is the jet stream, or the Gulf Stream, or the Mid-Atlantic High.





Figure 2: Cyclone Formation by Time of Year. Source: National Oceanic and Atmospheric Administration.

Figure 2 divides the year into 10-day intervals, and counts the number of tropical storms, hurricanes, and major hurricanes that form in the Atlantic Basin per 100 years during each interval. August and September are “hurricane season,” and the 10 days from September 10-19 are the peak. This chart would seem to indicate that during that interval, between 90 and 100 tropical storms originate every 100 years in the Atlantic Basin. That averages out to about 0.9-1.0 per year. Thus, it would appear that the last several weeks have been unusually active.



Figure 3. Data source: National Oceanic and Atmospheric Administration

Figure 3 shows the number of tropical storms, hurricanes, and major hurricanes that have formed in the Atlantic Basin annually since 1851. The blue area shows the number of tropical storms, the red area shows the number of hurricanes, and the green area shows the number of major hurricanes. To each series I have fitted a polynomial regression line.

First, the variation between years is large for all of the series. Second, there are more tropical storms than hurricanes, and more hurricanes than major hurricanes. Third, all three series show an increasing trend over time. There are more tropical storms than there used to be, more hurricanes, and more major hurricanes. HOWEVER, in viewing these trends, one must keep in mind that today we have weather satellites, air travel, and a good deal more shipping density across the Atlantic Ocean. It is quite possible that some storms went undetected or unmeasured in the past, but that is no longer the case. Thus, the observed change could easily be due to better observations, not a real increase in the number of storms. I don’t have the ability to make that correction, but the Fifth Assessment Report of the Intergovernmental Panel on Climate Change concluded that evidence for an increase in the number of tropical cyclones is not robust.

Figure 4. Data source: Wikipedia Contributors, 9/14/18.

Tropical cyclones form in several basins around the world, and one might ask which produces the most intense storms. Figure 4 shows the data. All of the basins have produced storms with sustained winds above 150 mph, but the highest ever recorded was 215 mph, in Hurricane Patricia in the Eastern Pacific in 2015. The lowest central pressure ever recorded was in Typhoon Tip in the Western Pacific in 1979.

One might also ask which basin produces intense storms most often. Record keeping began in a different year in each basin; even so, there appears to be a clear answer: the Western Pacific. Counting only storms with a minimum central pressure below 970 hPa, this basin has produced more than twice as many as any other basin.

Large cyclonic storms most often form in the tropics during hurricane season, but they don’t have to. For instance, the so-called “Perfect Storm” (they made a movie about it starring George Clooney) was a 1991 storm that formed in the Atlantic off the coast of Canada on October 29. It developed into a Category 1 hurricane, with a well-defined eye, and did not dissipate until after November 2. Similarly, the remnants of Tropical Storm Rina (2017) travelled north across the Atlantic, crossed the British Isles, and then crossed Central Europe. Entering the Mediterranean Sea, it re-strengthened into a tropical storm, now called Numa, which developed an eye and other characteristics typical of a hurricane. Its strength peaked on November 18, with maximum sustained winds of 63 mph, not quite hurricane strength, but close.


Hartmann, D.L., A.M.G. Klein Tank, M. Rusticucci, L.V. Alexander, S. Brönnimann, Y. Charabi, F.J. Dentener, E.J. Dlugokencky, D.R. Easterling, A. Kaplan, B.J. Soden, P.W. Thorne, M. Wild and P.M. Zhai, 2013: Observations: Atmosphere and Surface. In: Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change [Stocker, T.F., D. Qin, G.-K. Plattner, M. Tignor, S.K. Allen, J. Boschung, A. Nauels, Y. Xia, V. Bex and P.M. Midgley (eds.)]. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA.
NASA Space Place. 2018. How Do Hurricanes Form. Downloaded 9/18/2018 from

National Oceanic and Atmospheric Administration. 2018. Tropical Cyclone Climatology. Viewed online 9/18/2018 at

Wikipedia contributors. (2018, July 21). Cyclone Numa. In Wikipedia, The Free Encyclopedia. Retrieved 19:26, September 18, 2018, from

Wikipedia contributors. (2018, September 14). List of the most intense tropical cyclones. In Wikipedia, The Free Encyclopedia. Retrieved 22:23, September 18, 2018, from

Very Dry vs. Very Wet Months in the United States, 2018 Update

I’ve reported on drought in the American West many times in this blog. What about the country as a whole?

One way of looking at this question is by asking each month how much of the country has been very dry, and how much has been very wet. By very dry, I mean that the amount of precipitation for that month falls in the lowest 10% for that month in the historical record. By very wet, I mean that the amount of precipitation for that month falls in the highest 10% for that month.
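In other words, each month is judged against its own historical distribution. Here is a minimal sketch of that classification in Python, using NumPy’s percentile function and made-up precipitation values rather than actual NOAA records:

```python
import numpy as np

def classify_month(history, value):
    """Label a month's precipitation total against that calendar month's
    historical record: 'very dry' if in the lowest 10%, 'very wet' if in
    the highest 10%, otherwise 'normal'."""
    low = np.percentile(history, 10)    # 10th percentile cutoff
    high = np.percentile(history, 90)   # 90th percentile cutoff
    if value <= low:
        return "very dry"
    if value >= high:
        return "very wet"
    return "normal"

# Illustrative January totals (inches), not real NOAA data:
januaries = [2.1, 2.4, 2.8, 3.0, 3.1, 3.3, 3.6, 3.9, 4.2, 5.0]
print(classify_month(januaries, 1.8))  # very dry
print(classify_month(januaries, 5.5))  # very wet
```

NOAA does this for each area of the country, then reports what percentage of the country fell into each category in a given month.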

The National Oceanic and Atmospheric Administration keeps this data. They measure the precipitation in every county in the country, and calculate what percent of the country was very dry, and what percent was very wet. They have data for every month going back to January of 1895.

Figure 1. Data source: National Centers for Environmental Information.

Figure 1 shows the monthly data for every month all the way back to January, 1895. Blue bars represent the percentage of the country that is very wet. Red bars represent the percentage that is very dry. (To keep the blue and red bars from obscuring each other, I multiplied the dry percentage by -1, thereby inverting it on the chart.) I dropped trend lines on both data series. As you can see, there is considerable variation from year to year. There is a slight trend – hardly noticeable – towards more very wet months and fewer very dry months. But it is small, and the yearly variation is much greater than the trend.



Figure 2. Data source: National Centers for Environmental Information.

Figure 2 shows the same data, but it begins in January, 1994. I constructed this chart to see whether the most recent 25 years look different from the record as a whole. Again, blue bars represent very wet months, and the red bars represent very dry ones. I dropped linear trend lines on both data series, as before. The yearly variation is again larger than the trends. There appears to be virtually no trend in the number of very dry months. There is a small trend towards an increasing number of very wet months. It appears a bit larger than the one for the whole time period, but even so, it is tiny compared to the yearly variation.



Figure 3. Data source: National Centers for Environmental Information.

It’s a bit hard to read the two data series on opposite sides of the zero line, so I constructed Figure 3. For each month it shows the percentage of the country that was very dry minus the percentage that was very wet. By doing my subtraction that way, numbers above zero mean that more of the country was very dry than very wet, and numbers below zero mean that more of the country was very wet. I dropped a linear trend on the data (red), and I also dropped a 15-year moving average on it. The chart shows that, as we saw in Figure 1, there is a slight trend towards fewer very dry months and more very wet ones. The variation is much larger than the trend, whether one looks at the monthly data, or the yearly.
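The construction of Figure 3’s series can be sketched in a few lines of pandas. The percentages below are synthetic stand-ins for the real NOAA data; note that a 15-year moving average over monthly data is a 180-observation window.

```python
import pandas as pd

# Synthetic monthly percentages standing in for the NOAA series:
months = pd.date_range("1895-01", periods=360, freq="MS")
very_dry = pd.Series(10.0, index=months)  # % of country very dry
very_wet = pd.Series(8.0, index=months)   # % of country very wet

# Positive values = more of the country very dry than very wet;
# negative values = more of the country very wet.
net_dry = very_dry - very_wet

# 15-year moving average = 180 monthly observations.
smoothed = net_dry.rolling(window=180).mean()
print(net_dry.iloc[0])  # 2.0
```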

This data differs from other drought data I report. Those reports focus on the Palmer Drought Severity Index, an index intended to represent soil moisture. Soil can dry out because there is little overall precipitation, because there are longer periods between precipitation events, or because the temperature is warmer. This data would tend to indicate that the portion of the country receiving very little precipitation may be shrinking, very slightly and very slowly, while the portion receiving very heavy precipitation may be growing. That would be consistent with consensus predictions regarding climate change, in which overall precipitation is not expected to change, but the number of heavy precipitation events is expected to increase.


National Centers for Environmental Information, National Oceanographic and Atmospheric Administration. U.S. Percentage Areas (Very Warm/Cold, Very Wet/Dry). Downloaded 9/1/2018 from

Air Pollution Is a Killer

Air pollution is a killer. It is responsible for more deaths than better-known risk factors such as alcohol use, physical inactivity, or unsafe sex.

Risk factors don’t usually kill you directly. Almost nobody steps off an airplane in Delhi or Beijing and dies from inhaling a breath of polluted air. Instead, risk factors make it more likely that you will get a disease, and that disease will either kill you or disable you. Using this logic, it is possible to say that all deaths are “caused” by some combination of risk factors, which lead to the specific diseases or events that kill the individual.

Figure 1. Number of deaths worldwide attributable to 17 risk factors. Source: Institute for Health Metrics and Evaluation, 2018.

Public health officials estimate the number of deaths that result from (are caused by) the various risk factors. For instance, if a person has high blood pressure and high blood glucose, and that person dies at age 68 instead of age 78, which was the person’s life expectancy, then that person lost 10 years of expected life. Public health officials try to figure out how many of those 10 lost years were attributable to the high blood pressure, and how many to the high blood glucose. They then assemble that data into a statistic that represents how many deaths per year were caused by each.
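As a toy illustration of the attribution arithmetic described above: the real Global Burden of Disease methodology is far more sophisticated, but the simple proportional split below, with invented weights, captures the idea of dividing lost years among risk factors.

```python
# Toy version of risk-factor attribution. The proportional split and
# the weights here are invented for illustration; actual public health
# estimates use much more elaborate methods.

def attribute_lost_years(years_lost, weights):
    """Split years of life lost across risk factors in proportion to
    hypothetical attributable weights."""
    total = sum(weights.values())
    return {risk: years_lost * w / total for risk, w in weights.items()}

# A person who died 10 years early, with two risk factors weighted 3:2.
split = attribute_lost_years(10, {"high blood pressure": 3, "high blood glucose": 2})
# {'high blood pressure': 6.0, 'high blood glucose': 4.0}
```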

Figure 1 shows the number of global deaths per 100,000 in population that are attributable to the most important risk factors. Air pollution is 4th, behind high blood pressure, dietary risk (unhealthy food), and tobacco use. The total number of deaths attributed to air pollution in 2016 was 6.1 million, or 9.6% of all deaths from all risk factors.

The primary diseases to which air pollution contributed were heart disease, stroke, chronic lung disease (including asthma), and respiratory infections. Air pollution was responsible for more deaths than many better-known risk factors, such as high blood glucose, high cholesterol, alcohol and drug use, malnutrition, and unsafe sex (HIV/AIDS, etc.). In fact, despite all the publicity that unsafe sex gets, only 1.2 million deaths were attributed to it worldwide in 2016. Don’t get me wrong, 1.2 million deaths are a terrible thing, but air pollution kills more than 5 times as many.

Figure 2. Disability-adjusted life years (DALYs) attributable worldwide to 17 risk factors, 2016. Source: Institute for Health Metrics and Evaluation.

Risk factors don’t have to kill you; they can also cause disability. A person with a disability may live for many years before dying, coping with that disability every day of every year. Thus, in public health terms, a disability incurred early in life has somewhat different implications than one incurred late in life. The Global Burden of Disease study not only estimates the number of people with a disability, but multiplies it by the length of time they will have to live with it. The resulting statistic is called disability-adjusted life years (DALYs). Air pollution is the 5th most important risk factor for DALYs, with malnutrition having vaulted into the lead position. (Figure 2)
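The arithmetic behind this can be sketched as follows. Note that full DALYs also count years of life lost to premature death; this sketch shows only the disability component, and the case count and severity weight are invented for illustration.

```python
# Sketch of the disability side of the DALY calculation: people affected,
# times years lived with the disability, times a 0-1 severity weight.
# (Full DALYs also add years of life lost to early death.)

def years_lived_with_disability(cases, years_with_disability, disability_weight):
    """YLD component of DALYs; all inputs here are illustrative."""
    return cases * years_with_disability * disability_weight

# The same disability, acquired at 30 vs. 60 and carried to age 70,
# contributes four times the disability-adjusted burden.
early = years_lived_with_disability(1000, 40, 0.2)  # 8000.0
late = years_lived_with_disability(1000, 10, 0.2)   # 2000.0
```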





Figure 3. Deaths per 100,000 attributable to air pollution, by country, 2016. Source: Institute for Health Metrics and Evaluation.

Figure 3 shows a map of the world onto which the number of deaths per 100,000 from air pollution has been charted. North Korea loses more of its population to air pollution than any other nation, followed by the Central African Republic, Georgia (the country, not the state), and Afghanistan. This may surprise many readers: we often think of air pollution as a function of industrial emissions in large cities, but in many developing nations, this is not the case. Readers of this blog know that particulate matter is the most dangerous of the 6 criteria pollutants. In developing countries, people often use fires inside the home for cooking and warmth. The fires are smoky, and the homes are poorly ventilated, resulting in high levels of particulate air pollution. In addition, windblown fine mineral particles play an important role in some desert countries.

The United States has a death burden from air pollution of 32.6 per 100,000: low, but not one of the lowest in the world.

The above data looks at the number of premature deaths caused by air pollution. Another way to look at it is to ask how much air pollution shortens an average person’s life. Just such a study recently appeared (Apte, et al., 2018). Supplementary data associated with that article estimated the average life span in the United States to be 78.8 years, with PM2.5 taking about 4-1/2 months off of the average life expectancy. That was the 22nd best in the world. Sweden had the lowest loss of life expectancy from PM2.5, about 1/3 that of the USA, while Bangladesh had the highest, almost 5 times that of the USA.

So, what diseases has air pollution been implicated in? We know from the above that it is known to cause disability and contribute to early death. We know that it contributes to the development of heart attack, stroke, chronic lung disease (including asthma), and respiratory infection. These relationships have been well documented, and are strong. But air pollution has also been implicated in diseases you wouldn’t expect. It has been implicated in a host of neuropsychological conditions, from increased signs of inflammation in the brain, to increased rates of Parkinson’s disease, to reduced IQ, to increased risk of ADHD, to increased rates of autism spectrum disorders, to reduced motor functioning. It has been implicated in hastening cognitive decline late in life. It has been implicated in the development of obesity and type 2 diabetes.

My impression from the studies of air pollution’s relationship to mental functioning, obesity, and diabetes is that their conclusions should not be heavily relied upon, as confounding variables undercut the comparisons the authors try to make. Even when their findings hold up, air pollution seems to play only a small role in most of these diseases. Many of the studies enrolled large numbers of subjects, making it possible to find statistically significant results with small differences of questionable importance. This is sometimes hidden from view by reliance on the relative risk statistic. Relative risk compares the risk in one condition with the risk in another. For instance, suppose 2 people out of a million develop a disease. If people are exposed to air pollution, however, then 3 people out of a million develop it. The relative risk is 3/2 = 1.5, or 50% higher. That sounds really significant. But only one case per million people has been added, and in total only 3 people out of a million get the disease. Looked at that way, it doesn’t seem so important. Investigators can make some pretty insignificant results sound mighty important by reporting relative risk and not reporting other statistics. Thus, air pollution may play a role in these conditions, but I think the jury is still out, and we will have to await further study to be sure of how important a role.
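The relative-risk arithmetic in the example above can be made concrete:

```python
# The example from the text: 2 cases per million unexposed people,
# 3 per million exposed.

def relative_risk(risk_exposed, risk_unexposed):
    return risk_exposed / risk_unexposed

def absolute_risk_increase(risk_exposed, risk_unexposed):
    return risk_exposed - risk_unexposed

unexposed = 2 / 1_000_000
exposed = 3 / 1_000_000

rr = relative_risk(exposed, unexposed)            # 1.5, i.e. "50% higher risk"
ari = absolute_risk_increase(exposed, unexposed)  # one extra case per million
```

The same data yields a headline-friendly 50% relative increase and a very small absolute increase, which is exactly the gap the paragraph above describes.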

Don’t let the fact that air pollution may play rather minor roles in causing diseases such as Parkinson’s, Alzheimer’s, autism, or diabetes confuse you. It is strongly linked to heart attack, stroke, chronic lung disease, and asthma, and it is a significant risk factor worldwide.

This brings me to the end of this update on the Air Quality Index data for 2017. Missouri has made large strides in improving air quality. It is one of the few good news stories I get to report on. It is important that we continue to make progress, however, as air pollution is an important risk factor that causes or contributes to a great deal of death and disability around the planet.


Alderete, Tanya L., Rima Habre, Claudia M. Toledo-Corral, Kiros Berhane, Zhanghua Chen, Frederick W. Lurmann, Marc J. Weigensberg, Michael I Goran, and Frank D. Gilliland. “Longitudinal Associations Between Ambient Air Pollution With Insulin Sensitivity, ß-Cell Function, and Adiposity in Los Angeles Latino Children.” Diabetes, 66, (7), pp. 1789-1796.

Apte, Joshua S., Michael Brauer, Aaron J. Cohen, Majid Ezzati, and C. Arden Pope, III. 2018. “Ambient PM2.5 Reduces Global and Regional Life Expectancy.” Environmental Science & Technology Letters. Article ASAP. DOI:10.1021/acs.estlett.8b00360. Data downloaded 8/27/2018 from

Berhane, Kiros, Chih-Chieh Chang, Rob McConnell, James Gauderman, Edward Avol, Ed Rapapport, Robert Urman, Fred Lurman, and Frank Gilliland. “Association of Changes in Air Quality With Bronchitic Symptoms in Children in California, 1993-2012.” Journal of the American Medical Association. 315. (14), pp. 1491-1501.

Caiazzo, Fabio, Aksay Ashok, Ian A. Waitz, Steve H.L. Yim, Steven R.H. Barrett. 2013. “Air Pollution and Early Deaths in the United States. Part 1: Quantifying the Impact of Major Sectors in 2005.” Atmospheric Environment. 79 pp. 198-208.

Costa, Lucio G., Toby B. Cole, Jacki Coburn, Yu-Chi Chang, Khoi Dao, and Pamela J. Roque. “Neurotoxicity of Traffic-Related Air Pollution.” Neurotoxicology, 59, pp. 133-139.

Dendup, Tashi, Xiaoqi Feng, Stephanie Clingan, and Thomas Astell-Burt. 2018. “Environmental Risk Factors for Developing Type 2 Diabetes Mellitus: A Systematic Review.” International Journal of Environmental Research and Public Health. 15 (78); doi:10.3390/ijerph15010078.

Di, Qian, Lingzhen Dai, Yun Wang, Antonella Zanobetti, Christine Choirat, Joel D. Schwartz, and Francesca Dominici. 2017. “Association of Short-Term Exposure to Air Pollution With Mortality in Older Adults.” Journal of the American Medical Association. 318, (24), pp. 2446-2456.

Dockery, Douglas W., Arden Pope III, Xiping Xu, John D. Spengler, James H. Ware, Martha E. Fay, Benjamin G. Ferris, and Frank E. Speizer. 1993. “An Association Between Air Pollution and Mortality in Six U.S. Cities.” New England Journal of Medicine, 329 (24), pp. 1753-1759.

Guxens, Monica, and Jordi Sunyer. 2012. “A Review of Epidemiological Studies on Neuropsychological Effects of Air Pollution.” Swiss Medical Weekly. 141: w13322.

Health Effects Institute. 2018. State of Global Air 2018. Special Report. Boston, MA: Health Effects Institute.

Institute for Health Metrics and Evaluation. GBD Compare/Viz Hub.

Jerrett, Michael, Rob McConnell, C.C. Roger Chang, Jennifer Wolch, Kim Reynolds, Frederick Lurmann, Frank Gilliland, and Kiros Berhane. 2010. “Automobile Traffic Around the Home and Attained Body Mass Index: A Longitudinal Cohort Study of Children Aged 10-18 Years.” Preventive Medicine. 50 (Suppl. 1), pp. S50-S58.

Jerrett, Michael, Rob McConnell, Jennifer Wolch, Roger Chang, Claudia Lam, Genevieve Dunton, Frank Gilliland, Fred Lurmann, Talat Islam, and Kiros Berhane. 2014. “Traffic-Related Air Pollution and Obesity Formation in Children: A Longitudinal, Multilevel Analysis.” Environmental Health. 13, 49.

Miller, Kristin A., David S. Siscovick, Lianne Sheppard, Kristen Shepherd, Jeffrey H. Sullivan, Garnet L. Anderson, and Joel D. Kaufman. 2007. “Long-Term Exposure to Air Pollution and Incidence of Cardiovascular Events in Women.” New England Journal of Medicine. 356 (5), pp. 447-458.

Oudin A, Forsberg B, Nordin Adolfsson A, Lind N, Modig L, Nordin M, Nordin S, Adolfsson R, Nilsson LG. 2016. “Traffic-related air pollution and dementia incidence in northern Sweden: a longitudinal study.” Environ Health Perspectives. 124:306–312; http://dx.doi. org/10.1289/ehp.1408322.

Power, Melinda C., Sara D. Adar, Jeff D. Yanosky, and Jennifer Weuve. 2016. “Exposure to Air Pollution as a Potential Contributor to Cognitive Function, Cognitive Decline, Brain Imaging, and Dementia: A Systematic Review of Epidemiologic Research.” Neurotoxicology. 56, pp. 235-253.

Ritz, Beate, Pei-Chen Lee, Johnni Hansen, Christina Funch Lassen, Mattias Ketzel, Mette Sorensen, and Ole Raaschou-Nielsen. “Traffic-Related Air Pollution and Parkinson’s Disease in Denmark: A Case-Control Study.” Environmental Health Perspectives. 124 (3), pp. 351-356.

Samet, Jonathan M., Francesca Dominici, Frank C. Curriero, Ivan Coursac, and Scott L. Zeger. 2000. “Fine Particulate Air Pollution and Mortality in 20 U.S. Cities, 1987-1994.” New England Journal of Medicine. 343, (23), pp. 1742-1749.

Schwartz, Joel, Marie-Abele Bind, and Petros Koutrakis. 2016. “Estimating Causal Effects of Local Air Pollution on Daily Deaths: Effect of Low Levels.” Environmental Health Perspectives. DOI:10.1289/EHP232.

Wellenius, Gregory A., Mary R. Burger, Brent A. Coull, Joel Schwartz, Helen H. Suh, Petros Koutrakis, Gottfried Schlaug, Diane R. Gold, and Murray A. Mittleman. 2012. “Ambient Air Pollution and the Risk of Acute Ischemic Stroke.” Archives of Internal Medicine. 172, (3), pp. 229-234.

Weuve, Jennifer, Robin C. Puett, Joel Schwartz, Jeff D. Yanosky, Francine Laden, and Francine Grodstein. 2012. “Exposure to Particulate Air Pollution and Cognitive Decline in Older Women.” Archives of Internal Medicine. 172, (3), pp. 219-227.

White, Laura F., Michael Jerrett, Jeffrey Yu, Julian D. Marshall, Lynn Rosenberg, and Patricia F. Coogan. 2016. “Ambient Air Pollution and 16-Year Weight Change in African-American Women.” American Journal of Preventive Medicine. 51, (4), e99-e105.

Ozone Was the Most Important Air Pollutant in Missouri in 2017

The Air Quality Index is a measure that combines the level of pollution from six criteria pollutants: ozone (O3), sulfur dioxide (SO2), nitrogen dioxide (NO2), carbon monoxide (CO), particulate matter smaller than 2.5 micrometers (PM2.5), and particulate matter between 2.5 and 10 micrometers (PM10). For a brief discussion of these pollutants, see Air Quality Update 2017.

Figure 1. Data source: Environmental Protection Agency.

Figure 1 shows the percentage of days for which each of the criterion pollutants was the most important one. The chart combines all 24 counties together. Since 2009 ozone has been the most important pollutant on more days than any of the other pollutants, and it extended its “lead” in 2017. PM2.5 was the most important pollutant on the second highest number of days. Since 2007, however, the percentage of days on which it was the most important pollutant has been trending lower. One or the other of these two pollutants was the most important on 77% of all days statewide.
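The tally behind Figure 1 can be sketched as follows. The pollutant names are real, but the daily sub-index values are invented for illustration; the EPA's actual AQI calculation converts each pollutant's concentration to a sub-index using published breakpoint tables.

```python
# Sketch of the Figure 1 tally: for each monitored day, find the
# pollutant with the highest AQI sub-index, then compute the share of
# days on which each pollutant was the most important.
from collections import Counter

def dominant_pollutant_shares(daily_readings):
    """daily_readings: list of dicts mapping pollutant -> AQI sub-index."""
    counts = Counter(max(day, key=day.get) for day in daily_readings)
    n = len(daily_readings)
    return {pollutant: count / n for pollutant, count in counts.items()}

# Four invented days of readings.
days = [
    {"O3": 60, "PM2.5": 40, "SO2": 10},
    {"O3": 35, "PM2.5": 55, "SO2": 20},
    {"O3": 70, "PM2.5": 30, "SO2": 15},
    {"O3": 45, "PM2.5": 20, "SO2": 10},
]
shares = dominant_pollutant_shares(days)  # {'O3': 0.75, 'PM2.5': 0.25}
```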


Thirty years ago, ozone was a much less important pollutant than it is now. In 1983, it was the most important pollutant on fewer than 30% of the days statewide, but in 2017 it was the most important pollutant on 54% of the days. While we need ozone in the upper atmosphere to shield us from ultraviolet radiation, at ground level it is a strongly corrosive gas that is harmful to plants and animals (including us humans). We don’t emit it directly into the air. Rather, it is created when nitrogen oxides and volatile organic compounds (vapor from gasoline and other similar liquids) react in the presence of sunlight. These pollutants are emitted into the atmosphere by industrial facilities, electric power plants, and motor vehicles.

The second most important pollutant was PM2.5 (23% of days in 2017, sharply reduced from 2016). These tiny particles were not recognized as dangerous until relatively recently, though now they are thought to be the most deadly form of air pollution. I can’t find anything that says so specifically, but I believe the zero readings in 1983 and 1993 mean that PM2.5 wasn’t being measured in Missouri, not that it wasn’t a significant pollutant back then. The EPA significantly tightened its regulations for PM2.5 in 2012. In 2015, no Missouri county was determined to be noncompliant with the new standards; however, data gaps from sensors just across the Mississippi River prevented a determination of whether pollution from Missouri was causing a violation of standards on the Illinois side of the metro area. Thus, the counties of Franklin, Jefferson, St. Charles, St. Louis, and St. Louis City were all called “unclassified.” Road vehicles, industrial emissions, power plants, and fires are important sources of PM2.5.

Sulfur dioxide used to be by far the most important pollutant. It has not been eliminated and was still the most important pollutant on some days, but good progress was made on reducing SO2 emissions: by 2011 it was the most important pollutant on only 6% of days. Since then, however, its relative importance has been increasing, and in 2017 it was the most important pollutant on 16% of days. For a discussion of the role of SO2 in background air pollution, see this post. For my most recent update on the concentration of sulfur dioxide in background pollution, see here.

Don’t forget that Figure 1 does not show the levels of the six pollutants; it shows the percentage of days on which each was the most important. As previous posts have clearly shown, air quality has improved. As we have reduced some types of air pollution, other types have apparently increased in relative importance.

Missouri has come a long way in improving its air quality. To a large extent, it did so in two ways: by kicking some of its coal habit (replacing coal with natural gas and oil as sources of energy), and by requiring large coal-burning power plants to install pollution control equipment. We have more work to do, especially with regard to O3 and PM2.5, but it has been a significant environmental success story.

In the next post, I will discuss the health effects of air pollution. Spoiler alert: air pollution isn’t good for you!


Environmental Protection Agency. Air Quality Index Report. This is a data portal operated by the EPA. Data downloaded on 7/31/2018 from

Missouri Department of Natural Resources. Missouri State Implementation Plan: Infrastructure Elements for the 2012 Annual PM2.5 Standard. Viewed online 3/30/2017 at

Few Unhealthy Air Days in Missouri Counties in 2017

It is one thing to ask whether a county’s air quality is good, and another to ask if it is so bad that it is unhealthy. In the previous post, I reported on the percentage of days during which air quality was in the good range in 24 Missouri counties. This post focuses on the percentage of days with unhealthy air quality.

I looked at data from the EPA’s Air Quality Index Report for 24 Missouri counties. The data covered the years 2003-2017, plus the years 1983 and 1993 for a longer term perspective. For a fuller discussion of air quality and the data used for this post, and a map of the 24 counties, see my post Air Quality Update, 2017.

Figure 1. Data source: Environmental Protection Agency.

The EPA data distinguishes 4 levels of unhealthy air: Unhealthy for Sensitive Individuals, Unhealthy, Very Unhealthy, and Hazardous. No Missouri county was reported to have Very Unhealthy or Hazardous air quality for any of the years I studied. Figure 1 shows the percentage of monitored days for which air quality was either Unhealthy for Sensitive Individuals, or Unhealthy. The top chart shows a group of counties along the Mississippi River north or south of St. Louis. The middle chart shows a group of counties in the Kansas City-St. Joseph region. The bottom chart shows a group of widely dispersed counties outside of the other two areas. For the locations of the counties, see here.


The percentage of unhealthy air days was 1% or below for all Missouri counties. There were no unhealthy air days at all in 13 of the 24 counties, and no county had more than 4 unhealthy AQI days. Compared to 2016, 4 counties showed very small increases and 9 had decreases. Compared to 1983, the total number of unhealthy air days across all counties decreased from 490 to 21, a 96% decline. St. Louis City, St. Louis County, Iron County, Jackson County, and Jackson County, in that order, were the counties with the highest number of unhealthy air days in 1983. By 2017, those five counties had decreased their number of unhealthy air days by 98%, 99%, 97%, 100%, and 100%, respectively.
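The decline figures above come from simple percent-change arithmetic, which can be sketched as:

```python
# The percent-decline arithmetic used for the statewide comparison.

def percent_decline(old, new):
    return (old - new) / old * 100

# Statewide unhealthy air days: 490 in 1983, down to 21 in 2017.
decline = percent_decline(490, 21)  # about 95.7, reported above as 96%
```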

Well done! We have more work to do before all Missourians can breathe truly good quality air every day, but the decrease in unhealthy days is amazing, just amazing. In the next post, I will discuss the most important air pollutants in Missouri. After that, I will discuss the health effects of air pollution, and you will understand why the reduction in unhealthy air days is such an important achievement.


Environmental Protection Agency. Air Quality Index Report. This is a data portal operated by the EPA. Data downloaded on 7/31/2018 from