Missouri Has A Big, Unused Wind Resource

Figure 1. Source: Department of Energy.

Missouri is blessed with a significant wind energy resource, less than 1% of which is being exploited, according to data from the United States Department of Energy. In all of the following data, the speed of the wind has been measured at 80 meters above the ground, the typical tower height of a wind turbine.

Figure 1 shows the potential wind capacity of the United States by state. Texas has more wind potential than any other state, at 1.3 million megawatts. Missouri ranks 16th, with 279,000 megawatts of potential.





Figure 2. The map shows the estimated mean annual wind speeds at an 80-m height (262 feet). Source: Department of Energy.

Figure 2 shows where Missouri’s wind resource is located. According to the Department of Energy, an average annual wind speed of 6.0 meters per second is required to constitute a viable wind resource. On the map, red represents average annual wind speeds of 7–7.5 meters/second (about 16 mph), while orange represents average annual wind speeds of 6.5–7 meters/second. Missouri’s wind resource lies in a broad arc across the northern and western portions of the state.







Figure 3. Source: Department of Energy.

Figure 3 shows installed wind power capacity by state. Texas again leads the way, with 24,899 megawatts of installed capacity. Iowa is second with 8,422 megawatts, and California is third with 5,885 megawatts. Missouri has 959 megawatts of installed capacity; that equals about 0.3% of our wind power potential.
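The 0.3% figure follows directly from the two numbers above; here is a quick sketch of the arithmetic, using the Department of Energy megawatt figures cited in this post:

```python
# Missouri's installed wind capacity as a share of its wind potential.
# Both figures (in megawatts) are from the Department of Energy data cited above.
potential_mw = 279_000   # Missouri's estimated wind potential
installed_mw = 959       # Missouri's installed capacity

share_pct = installed_mw / potential_mw * 100
print(f"Missouri has developed about {share_pct:.2f}% of its wind potential")  # about 0.34%
```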








Figure 4. Source: United States Geological Survey.

Figure 4 shows the location of wind turbines installed in Missouri. You can see that they are all located in the northwest quadrant of the state.











Figure 5. Source: Department of Energy.

Figure 5 shows the energy mix on the electrical grid in Missouri. About 73% of our electricity is generated by burning coal, about 13% in a nuclear plant, about 8% from natural gas, and only 3.58% from wind power. Wind already supplies more than 3% of our electricity while using only about 0.3% of our wind potential, so Missouri could generate much more of its electricity from wind.

Figure 5 also shows the amount of electricity generated by wind power in Missouri by year. Generation grew between 2008 and 2012, plateaued until 2016, and has grown again since then.

Climate change presents many challenges. Among the largest is transitioning to energy resources that don’t release carbon dioxide. The data above shows that Missouri has significant potential, but we have only begun to exploit it.


United States Department of Energy. 2019. Wind Energy in Missouri. Downloaded 4/22/2019 from

United States Geological Survey. 2018. The U.S. Wind Turbine Database. Downloaded 4/22/2019 from

Missouri Weather-Related Deaths, Injuries, and Damages in 2017

Figure 1. Data source: Office of Climate, Water, and Weather Services, National Weather Service.

Damage from severe weather in Missouri shows a different pattern than does damage nationwide. As Figure 1 shows, the cost of damage from hazardous weather events in Missouri spiked in 2007, then spiked even higher in 2011. Since then, it has returned to a comparatively low level. The bulk of the damage in 2011 was from 2 tornado outbreaks. One hit the St. Louis area, damaging Lambert Field. The second devastated Joplin, killing 158, injuring 1,150, and causing damage estimated at $2.8 billion. The damages in 2007 came primarily from two winter storms, one early in the year, one late. In both cases, hundreds of thousands were without power, and traffic accidents spiked.



Figure 2. Data source: Office of Climate, Water, and Weather Services, National Weather Service.

Figure 2 shows deaths and injuries in Missouri from hazardous weather. Deaths are in blue and should be read on the left vertical axis. Injuries are in red and should be read on the right vertical axis. The large numbers of injuries and deaths in 2011 came primarily from the Joplin tornado. In 2006 and 2007, injuries spiked, but fatalities did not; the injuries mostly represent non-fatal auto accidents from winter ice storms. The fatalities in 1999 resulted from a tornado outbreak.

I understand the trends in both figures this way: once in a while, Missouri has been struck with catastrophic weather events. They cause lots of deaths and a lot of damage, at a whole different scale from years with no catastrophic weather event. In years with no such event, weather-related deaths in Missouri have been around 40 or fewer, and injuries have been roughly 400 or fewer. Damages in such years have been about $150 million or less. In years with catastrophic weather events, the totals can be much higher.

2017 was a year in which Missouri saw no weather disasters that caused such high damages, or killed or injured so many people. That does not mean that Missouri was unaffected, however. The state was included in several billion-dollar weather disasters, the most costly of which was probably the flood of April 25-May 7. That was a historic flood for many of the communities that were affected.

The Missouri data covers fewer years than the national data discussed in my previous post. It also covers all hazardous weather, in contrast to the national data, which covered billion-dollar weather disasters only. In addition, for some reason the Missouri data for 2018 has not yet been posted.

While the national data shows a clear trend towards more big weather disasters, Missouri’s data does not. The Missouri data seems to reflect the kind of disaster and where it occurred. Tornadoes, if they hit developed areas, cause injuries, deaths, and lots of damage. Floods cause fewer injuries and deaths; damage can be significant, but it is limited to the floodplain of the river that flooded. Ice storms affect widespread areas; damages come mostly through loss of the electrical grid, which can cause widespread economic loss, and from car crashes, which cause many injuries but fewer deaths.


Office of Climate, Water, and Weather Services, National Weather Service. 2016. Natural Hazard Statistics. Data for Missouri downloaded at various dates from

CPI inflation Calculator. 2019. 2017 CPI and Inflation Rate for the United States. Data downloaded 4/6/2019 from
National Centers for Environmental Information. 2019. Billion-Dollar Weather and Climate Disasters: Table of Events. Viewed online 4/6/2019 at

Descriptions of specific weather events, if they are large and significant, can be found on the websites of the Federal Emergency Management Agency, the Missouri State Emergency Management Agency, and local weather forecast offices. However, in my experience, the best descriptions are often on Wikipedia.

Natural Disasters Down in USA in 2018

The number of natural disasters in the USA in 2018 declined from 2017, and the amount of damage they did also declined.

Figure 1. Source: National Centers for Environmental Information, 2019.

Since 1980, there have been 241 severe weather and climate events in the USA that each caused damages of at least $1 billion – the “billion-dollar disasters.” Together they have caused damages exceeding $1.6 trillion. Figure 1 shows the data. The columns represent the number of events, with each color representing a different type of event. The black line represents a moving 5-year average number of events. The gray line represents the cost of damages, in billions of dollars.

In 2018, there were 14 billion-dollar disasters, more than double the long-term average of about 6 per year. The 4 years with the most billion-dollar disasters have all occurred in the last 8 years, and the last 3 years have ranked #1 (tie), #3, and #4; they have been significantly higher than any other year except 2011. The increase comes primarily from severe storms – the category covering tornadoes, thunderstorms, hail, and similar events, as distinct from hurricanes and tropical cyclones, flooding, and winter storms.
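The “more than double the long-term average” claim can be checked from the numbers in this post; a quick sketch:

```python
# Long-term average of billion-dollar disasters, from the figures cited above:
# 241 events over the 39 years from 1980 through 2018.
total_events = 241
years = 2018 - 1980 + 1      # 39 years of record
events_2018 = 14

long_term_avg = total_events / years
print(f"Long-term average: {long_term_avg:.1f} events per year")  # about 6.2
print(events_2018 > 2 * long_term_avg)                            # True
```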

The estimated CPI-adjusted losses in 2018 were $91 billion. That’s quite a chunk of change, but it is much less than the costs in 2017 ($312.7 billion), 2005 ($220.8 billion), or 2012 ($128.6 billion). 2017 was a terrible year (see here): Hurricanes Harvey, Irma, and Maria struck that year, and not only was it far-and-away the most costly year, it also tied for the highest number of disasters. Hurricane Katrina struck in 2005, and Hurricane Sandy struck in 2012. Hurricane Katrina remains the single most damaging event, followed by Hurricanes Harvey and Maria. Last year, the most costly natural disaster was the series of wildfires in California, especially the Camp Fire. The fires caused an estimated $24 billion in damage.

Looking at Figure 1, starting somewhere around 2005, the number of billion-dollar disasters starts to trend upward. The amount of damage does, too, though the variation from year-to-year is much greater. The National Centers for Environmental Information, which keeps this data, factors the Consumer Price Index into it, so the change is probably not due to inflation.

Three factors probably account for most of the increase. First, climate change has caused an increase in the amount of energy in the atmosphere, energy that is available to power more and bigger storms. I once calculated that the increased radiative forcing from climate change was equal to the energy output of 1.6 million nuclear power plants. (See here) That’s a lot of energy available to power storms. Second, climate change has caused an increase in droughts throughout the western United States. Even in years like this one, when there has been an abundance of winter snow, the warm temperatures cause the snow to melt earlier in the spring, and they dry out the land faster during the summer. The result is a tinder box, perfect conditions for huge wildfires. And finally, we keep putting ourselves in harm’s way. Development has increased along the Atlantic and Gulf Coasts, where it is in the path of any hurricane that comes ashore. Development has also increased along the fringes of forests, where it is vulnerable to wildfire. And even in the middle of the country, sprawl has increased the built-up area, making tornadoes more likely to grind over it, as opposed to farmland.

Missouri was in the region damaged by some of the big weather events of 2018, so the next post will look at how we fared here in the Show Me State.


The National Centers for Environmental Information Billion-Dollar Weather and Climate Disasters portal has 5 pages available, and I used them all for this post: Overview, Mapping, Time Series, Summary Stats, and Table of Events. Downloaded and viewed online 4/3/2019 at

Americans Still Think the Environment Is Worth Saving

If you listen to the national media, especially the conservative media, you might think that interest in environmental protection has gone the way of the dinosaur, swept away by a national consensus to focus on economic growth at all costs.


Figure 1. Source: Gallup Inc.

Gallup Inc. is a global analytics and analysis organization that conducts what we all know as Gallup Polls. One of the questions they regularly include in their polls is “With which one of these statements do you most agree – protection of the environment should be given priority, even at the risk of curbing economic growth (or) economic growth should be given priority, even if the environment suffers to some extent?” The results are shown in Figure 1. It is a chart from Gallup showing the percent of respondents choosing the environment over the economy, and vice-versa. To it I have added a line showing Republican and Democratic presidencies (red and blue, respectively).

In 1990, people chose environmental protection over economic growth 71% to 19%. That is a huge majority! Support for the environment weakened starting in 2000. By 2010, respondents chose the economy over the environment 53% to 38%. Since then, support for the environment has rebounded, with the most recent figures showing respondents picking the environment over the economy by 57% to 35%. (The percentages don’t sum to 100% because some people answer that they don’t have a preference, or they decline to answer the question.)

Even 57% to 35% is a large majority – a difference of 22 percentage points: for every 10 people who chose the economy, about 16 chose the environment. That margin rivals the biggest presidential landslides. Warren Harding’s 1920 margin of 26% exceeds it (!), but other famous ones come in close to or below it: 23% for Nixon over McGovern in 1972 and for Johnson over Goldwater in 1964, and only 18% for the famous Reagan “landslide” of 1984 (Reagan over Mondale). The current Tweeter-in-Chief, due to a quirk in the electoral college system, was elected with a minority of the popular vote (a margin of -3%).
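The arithmetic behind that comparison, using the Gallup percentages above:

```python
# Margin and ratio implied by the 57%-to-35% Gallup result.
environment, economy = 57, 35

margin = environment - economy          # difference in percentage points
per_ten = environment / economy * 10    # environment picks per 10 economy picks

print(margin)             # 22
print(round(per_ten, 1))  # 16.3
```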

Figure 2. Source: Gallup, Inc.

Since 1998, Gallup has also asked respondents “Is the seriousness of global warming generally exaggerated, generally correct, or is it generally underestimated?” Figure 2 shows the results. The dark green line shows the percentage of people who think the seriousness of global warming is exaggerated. The black line shows the percentage who think its seriousness is portrayed about correctly. The gray line shows the percentage who think it is underestimated.

A minority of people think that the seriousness of global warming is correctly represented – that is constant across the whole time period. For much of the period, more people thought its seriousness was exaggerated than thought it was underestimated. In recent years, that has shifted, and now more people think it is underestimated (41%) than think it is exaggerated (33%).

Where would I fit on that last question? I think I would refuse to answer it. I feel that global warming is one of the most significant challenges facing humanity. But I feel that it is a slow-motion catastrophe. Just as a simple example: if you go to Miami Beach or Lower Manhattan in 100 years, you are likely to find they are very different places, struggling to cope with flooding, sometimes more severe, sometimes milder. However, that is a change that will unfold over the entire coming century, giving people lots of time to adapt and adjust. Thus, those who say the danger is fabricated are underestimating it. On the other hand, those who say an existential catastrophe is imminent are exaggerating.

What is true is that the carbon that goes into the atmosphere stays there for nearly a century. Thus, if we don’t act quickly, we most likely doom ourselves to a change that will unfold over decades, and which we will be impotent to prevent.

As the two figures above illustrate, environmental concerns have NOT been swept away by our current president. Rather, he is acting to prevent Americans from addressing problems that they feel are important, even if addressing them involves some economic sacrifice.


Gallup Inc. 2018. In Depth: Topics A to Z: Environment. Downloaded 3/27/2019 from

Trends Over Time

The last 2 posts have reported specifics on some of the toxic chemicals released into the environment in Missouri in 2017. This post will broaden the view and discuss trends in toxic releases over time.

Figure 1. Source: Environmental Protection Agency, 2019b.

To some extent it is difficult to follow trends over time because of changes in the TRI program. Figure 1 shows the national trend in chemicals reported to the EPA over time, and it labels the years in which major changes occurred in the program. The light blue and sandy yellow show the total amount of toxic chemicals reported; the dark blue and orange show the amount disposed of and released. The total amount of toxic chemicals reported to the EPA peaked in 2000 and decreased until 2009, the bottom of a recession. Over that period, the amount decreased by more than 40%. Since 2009, however, it has increased by about 50%. The chart shows that most toxic chemicals reported to the EPA are used in manufacturing (light blue).


It is beyond the scope of this blog to explore why toxic chemicals reported to EPA should drop so significantly, then rebound so significantly. If you know the answer, please comment on this post and let us all know.

The increase concerns me. This series of posts started with a review of several catastrophic releases of toxic chemicals that killed people and poisoned the land, in some cases permanently. Preventing the release of toxic chemicals means that no mistakes can ever be made, and that is simply not within human capability. I view toxic chemicals as similar to time bombs. Sooner or later, one will go off.

Figure 2. Source: Environmental Protection Agency, 2018.

Of course, the amount of chemicals reported to the EPA is not the same as the amount released: what about releases? Figure 2 shows on-site toxic releases in Missouri over time. Because on-site releases represent about 93% of toxic releases in Missouri, I will let this chart represent the trend in total releases. You can see that releases peaked in 2004-2005 and have been trending downward since then. In 2017 they were about 47% of the 2005 level, slightly less than half.


Figure 3. Source: Environmental Protection Agency, 2019b.

Figure 3 shows the trend nationwide. The chart I have goes back only to 2007. Since then, the total amount of releases has decreased from 4.13 billion lb. to 3.85 billion lb., a decrease of about 7%.


Because Missouri has been losing manufacturing over time, the possibility exists that the decline in toxic releases comes from the decline in manufacturing. To look at this possibility, one would want to plot toxic releases against the amount of manufacturing in the state. But what is “the total amount of manufacturing” in a state? Is it the tonnage of goods produced? The economic value produced? The number of factories, or their total square footage? The number of people employed in manufacturing? I can find statistics for manufacturing employment in Missouri, so I will use it.

Figure 4. Data sources: Environmental Protection Agency, 2018; Federal Reserve Bank of St. Louis, 2019.

Figure 4 shows the trend over time for toxic releases in Missouri (the blue line, which should be read against the left vertical axis) and manufacturing employment (the red line, which should be read against the right vertical axis). The chart considers toxic releases, not toxics managed. The chart shows that toxic releases and manufacturing employment follow similar trajectories. The correlation between the 2 data series is 0.75, which is fairly high as correlations go.

Correlation, of course, proves nothing. There is a wonderful website dedicated to spurious and absurd correlations; for instance, per capita cheese consumption correlates at 0.95 with the number of people who die by becoming tangled in their bedsheets. But it seems logical that the decline in toxic releases could be at least partially related to the amount of manufacturing. It would be a wonderful study for some enterprising student.

As I noted in the first post in this series, interpreting data in the TRI is complex. The most serious exposures to toxic chemicals probably happen to people who work with them regularly. You can’t assume that releases translate to public exposure. But you probably can infer the inverse: unless there is a release, the public can’t be exposed. These are poisonous chemicals. Lead, the most released chemical in Missouri, persists and accumulates in the environment and in the human body. I’m thankful that toxic releases have declined in Missouri, and I hope they continue to do so.


Environmental Protection Agency. 2018. 2017 TRI Factsheet: State – Missouri. Downloaded 3/7/2019 from

Environmental Protection Agency. 2019a. Factors to Consider When Using Toxics Release Inventory Data. Downloaded 3/20/2019 from

Environmental Protection Agency. 2019b. Toxic Release Inventory: TRI National Analysis 2017. Downloaded 3/20/2019 from

Federal Reserve Bank of St. Louis. 2019. FRED Economic Data: Manufacturing Employment in Missouri. FRED is a data portal accessed 3/7/2019 at[1][id]=MOMFGN.

Most Released Chemical

The previous post discussed the EPA’s Toxics Release Inventory. In this post, I will update toxic waste data for Missouri for 2017, the most recent year for which data is available.

Figure 1. Source: Environmental Protection Agency, 2018.

Toxics can be managed by (from most desirable to least) reducing their creation at the source, recycling, energy recovery, treatment, or disposal and release. Figure 1 shows how Missouri managed its toxic wastes in 2017. The largest amount was used to create energy (often by burning in a boiler). The next largest amount was recycled for reuse.





Figure 2. Source: Environmental Protection Agency, 2018.

Toxic releases can occur either onsite at the industrial facility that uses them, or offsite at some toxic materials treatment and storage facility. Materials can be released into the air, they can be discharged into surface water, and they can be dumped on the land. Figure 2 shows Missouri toxic releases by release category in 2017. By far the largest percentage, 80%, is dumped on the land.





Figure 3. Data source: Environmental Protection Agency, 2019.

The industries responsible for releasing the largest amounts of toxic chemicals are shown in Figure 3. Metal mining was the largest, accounting for almost half of all toxic releases, and electric utilities were second. Releases from the food industry continue to be larger than those from the petroleum, chemicals, or plastics & rubber industries, and that continues to blow my mind! The 5 facilities that released the most toxic chemicals in 2017 were the Buick Mine/Mill, the Brushy Creek Mine/Mill, the Fletcher Mine/Mill, the Buick Resource Recycling Facility, and the Sweetwater Mine/Mill.





The chemicals most released in Missouri are shown in Figure 4. Compounds of lead, zinc, barium, and copper are the most released. The majority of these releases are probably associated with the metal mining industry, and are released on the land. The top 5 chemicals released to air and water are shown in Figure 5. Hydrogen fluoride comes mostly from coal-burning power plants, while nitrate compounds come mostly from the food industry (think waste from animal feeding operations).

Figure 4. Data source: Environmental Protection Agency, 2019.

Figure 5. Source: Environmental Protection Agency, 2018.










Chemicals that are persistent, bioaccumulative, and toxic (PBT) are of most concern to the EPA. These are chemicals that remain in the environment and in the body. They build up over time, meaning that repeated small releases can lead to big trouble. Lead accounts for more than 99% of PBT releases in Missouri.

Mercury compounds are also a PBT of concern. Coal-burning power plants are the largest source of mercury emissions in the United States. In 2017, mercury emissions in Missouri fell to 4,423 pounds, from 9,850 pounds in 2013.
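In percentage terms, using the pounds just cited:

```python
# Decline in Missouri mercury emissions between 2013 and 2017 (pounds, from TRI).
lb_2013, lb_2017 = 9_850, 4_423

decline_pct = (lb_2013 - lb_2017) / lb_2013 * 100
print(f"Mercury emissions fell about {decline_pct:.0f}%")  # about 55%
```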

Two other classes of PBTs are polycyclic aromatics and dioxin/dioxin-like compounds. In 2017, 2,924 pounds of polycyclic aromatics were released in Missouri, along with 0.08 pounds of dioxin/dioxin-like compounds. That may not seem like much dioxin, but this chemical is toxic even in very, very small amounts. (EPA 2015)

In the next post I will look at some toxic release trends over time.


Environmental Protection Agency. 2018. 2017 TRI Factsheet: State – Missouri. Downloaded 3/7/2019 from

Environmental Protection Agency. 2015. Persistent Bioaccumulative Toxic (PBT) Chemicals Covered by the TRI Program. Viewed online 12/2/15 at

Environmental Protection Agency. 2019. TRI Explorer. This is a data portal that allows you to retrieve data regarding toxic releases by state, county, and municipality. It contains information by chemical, by releasing facility, by geography, and by industry. It also allows you to retrieve the data in a time series over several years. Viewed online 3/13/2019 at

Toxic Chemical Releases in Missouri – 2017

This post updates information on toxic chemical releases in Missouri and nationwide. The most recent data is through 2017.

Many industrial processes require the use of toxic substances. They must be properly handled to prevent harm to people, land, and water. During the 1970s and early 1980s concerns grew about how toxic substances were being handled. For instance, tons of toxic waste were discovered dumped in the Love Canal neighborhood of Niagara Falls. Oil containing dioxin was sprayed on the streets of Times Beach, Missouri, turning it into a ghost town; people can’t live there to this day. In 1984, a malfunction at a chemical plant in Bhopal, India released a cloud of poisonous gas that killed more than 3,000 people overnight, and 15,000 – 20,000 eventually (5-7 times as many as were killed in the 9/11 attacks). Shortly thereafter, a serious release of toxic gas occurred in Institute, West Virginia.

Figure 1. Cement Creek, Colorado, location of a toxic release in August 2015. Photo by John May.

These concerns are hardly a thing of the past, however. The same plant in Institute, West Virginia exploded in 2008, killing 2 and injuring 8. In 2015, an accident at the Gold King Mine in Colorado released 3 million gallons of water contaminated with toxics such as cadmium, lead, and arsenic into Cement Creek (see Figure 1). Cement Creek flows into the Animas River, the only water source for several cities in Colorado and New Mexico.

Congress passed the Emergency Planning and Community Right-to-Know Act in 1986, and the Pollution Prevention Act in 1990. These laws require facilities to report releases, transfers, and waste management activities of toxic materials.

The Toxics Release Inventory (TRI) program of the EPA gathers this information and makes it available to the public on their website. In addition, they publish an annual report covering the whole country, plus fact sheets for each of the 50 states. The TRI data does not cover all toxic materials and all facilities, but it does cover an important set of them.

After being used, toxic substances can be managed or released into the environment. In decreasing order of preference, managing them can mean improving industrial processes to use less toxic material to start with, recycling them, burning them to generate electricity, or treating them to make them less toxic. Where toxic materials are not managed, they can be injected into wells, stored, landfilled, emitted into the air, discharged into surface water, or spread over the land. They can be handled either on-site or off-site. Determining whether any of these activities represent a potential hazard to people, land, or water is complex. One cannot simply assume, for instance, that on-site means safe. On the other hand, one cannot assume that emission or discharge of the substance means that there is toxic exposure. The statistics in the TRI are only a starting point, and many factors must be taken into consideration when analyzing TRI data.

Figure 2. Source: Environmental Protection Agency, 2018.

In 2017, 502 facilities in Missouri were covered by the Toxic Release Inventory. That’s down from 521 in 2013. Nationwide, 21,456 facilities were covered, down from 21,707 in 2013. Figure 2 maps the number of sites within each county in Missouri. On the map at the TRI website, clicking on the green circle for a county gives access to more detailed information for that county. Unfortunately, the TRI website does not seem to offer this map for download in a form that labels the counties. The counties with the most sites are Jackson County (42, down from 45 in 2013), Greene County (30, up from 27 in 2013), and Franklin and Jasper Counties (each with 21).

Having the most TRI sites does not necessarily mean the most toxic releases, in part because by far most toxic waste is managed rather than released. Figures 3 and 4 show the data for Missouri and for the United States. About 88% of Missouri’s toxics were managed in 2017; only 12% were released. For the United States as a whole, a slightly higher percentage (89%) was managed, but the percentages are similar. Even though only 12% of Missouri’s toxic materials were released, that still amounts to 53 million lb.
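The total amount of toxics handled can be backed out from those two numbers; a back-of-envelope sketch using the figures in this paragraph:

```python
# If 53 million lb is the 12% of Missouri's toxics that was released,
# the total handled and the amount managed follow directly.
released_lb = 53e6
released_share = 0.12

total_lb = released_lb / released_share
managed_lb = total_lb - released_lb

print(f"Total handled: about {total_lb / 1e6:.0f} million lb")  # about 442
print(f"Managed: about {managed_lb / 1e6:.0f} million lb")      # about 389
```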

Figure 3. Data Source: Environmental Protection Agency, 2017.

Figure 4. Data Source: Environmental Protection Agency, 2017.









In the following posts I’ll look into the releases in more detail.


Environmental Protection Agency. 2018. 2017 TRI Factsheet: State – Missouri. Downloaded 3/7/2019 from

PFAS and Public Water Contamination

A recent article by Julie Turkewitz in the New York Times reports that a group of chemicals called per- and polyfluoroalkyl substances (PFAS) have leached into at least 55 drinking water systems at military bases around the globe. The contaminated water may be causing illness in those drinking it, including tumors, thyroid problems, and debilitating fatigue.

The problem is not confined to military bases. As many as 10 million people could be drinking water laced with PFAS, according to the article.

Per- and polyfluoroalkyl substances (PFAS) are a group of man-made chemicals that are used for a large number of purposes around the globe, and they are present in many common consumer items. “There is evidence that exposure to PFAS can lead to adverse human health effects,” as the Environmental Protection Agency puts it. (Environmental Protection Agency, 2018) They are not on the list of toxic chemicals monitored by the EPA, however. After an outcry and concerted campaign by public health scientists, the EPA has agreed to do a comprehensive study of the human health effects of exposure to the chemicals, and to survey exposure levels across the country. (Davenport, 2019)

On the other hand, PFAS represent $19.7 billion in sales to chemical manufacturers, according to another article. (Lipton and Abrams, 2015) DuPont, one of the primary manufacturers, maintains that years of study and experience have proven that the chemicals are safe for their intended use. (Davenport, 2019)

Of course, leaching into the public water supply is not one of their intended uses.

Well, this blog is not about looking into the controversy around this group of chemicals. I thought I’d look and see if I could find any databases that document exposure to them. Lo and behold, I did. The EPA’s report on The Third Unregulated Contaminant Monitoring Rule (UCMR 3): Data Summary, January 2017 contains some information. The National Environmental Public Health Tracking Network contains more.

The EPA report includes PFAS results only from public water systems serving fewer than 10,000 people. Thus, it represents a small fraction of all public water systems, and an even smaller fraction of the people served by public water systems. On the other hand, the data mapped by the National Environmental Public Health Tracking Network includes many of the largest public water systems and covers a substantial portion of the population, at least in Missouri.

In both sets of data, the chemicals are listed individually, and it is not clear how much the water systems contaminated by one overlap those contaminated by another. Finally, for several of the chemicals no reference level has been determined. Think of a reference level as something similar to a maximum safe exposure level. Some of these chemicals have not yet been studied that way; that is why the EPA is undertaking the study mentioned above.

Table 1 lists the chemicals and the number of public water systems in which they were detected from the EPA report:

Table 1. Public Water Systems Contaminated with PFAS:

Chemical    Number of Water Systems in Which It Was Detected
PFOA        117
PFHxS       55
PFHpA       86

Data Source: Environmental Protection Agency, 2017.

Figures 1-3 map the locations of water systems where PFAS were detected, from the National Environmental Public Health Tracking Network. Each map is labelled with the specific chemical being mapped. An orange dot marks a single contaminated water system. Where the map’s scale is too small to show nearby systems separately, they are combined into a grey circle, with the number inside showing how many water systems in that locale had detections.

Figure 1. PFHpA and PFBS Contamination. Source: Centers for Disease Control, 2019.

Figure 2. PFOS and PFOA Contamination. Source: Centers for Disease Control, 2019.

Figure 3. PFNA and PFHxS Contamination. Source: Centers for Disease Control, 2019.

Although 87 public water systems in Missouri were sampled, none reported detectable PFAS. The sample covered public water systems serving 3,802,254 people, and included the City of St. Louis Public Water System, the Missouri American St. Louis-St. Charles County Water System, the Kansas City Public Water System, two Jackson County public water systems, and the Springfield Public Water System, as well as many other large public water systems. The sampling also included the Ft. Leonard Wood water system; no contamination was detected there, either.

For now, this data seems to suggest that Missouri’s public drinking water may not be contaminated with these chemicals. I wouldn’t call this the last word, however. The EPA study will hopefully give us a more comprehensive analysis of what the health effects of these chemicals are, how much exposure of what kind is safe, and how much contamination is out there.


Centers for Disease Control. 2019. National Environmental Public Health Tracking Network. Viewed online 2/22/2019 at

Davenport, Coral. 2019. “E.P.A. Will Study Limits on Cancer-Linked Chemicals. Critics Say the Plan Delays Action.” The New York Times, 2/14/2019. Viewed online 2/22/2019 at

Environmental Protection Agency. 2018. Basic Information on PFAS. Viewed online 2/22/2019 at

Environmental Protection Agency. 2017. The Third Unregulated Contaminant Monitoring Rule (UCMR 3): Data Summary, January 2017. Viewed online 2/22/2019 at

Lipton, Eric, and Rachel Abrams. 2015. “Commonly Used Chemicals Come Under New Scrutiny.” The New York Times, 5/1/2015. Viewed online 2/22/2019 at

Cold Winters and Phony Baloney (at least in Missouri)

This week I returned to St. Louis after being out of town for some time. I was greeted by a chorus of moans and groans about the horrible winter. Such kvetching! Of course, it is easy for somebody who has been in warmer climes to pooh-pooh the harshness of the winter back home. So, I decided to look and see what the statistics say, and since that is the focus of this blog, to do a post on what I found. I’m going to look at the winter in St. Louis and in Kansas City. For weather statistics, winter begins December 1st and ends February 28th (or 29th in leap years). I’m writing on February 21, so the data for this winter extends only through 2/20/2019. One final note: for grammatical reasons, in what follows, “normal” means historical average (mean).

The weather service office in each location keeps its data in slightly different formats, so I will do one, then the other.

Winter 2018-2019 in St. Louis

First, let’s ask whether it has been excessively cold in St. Louis this winter. According to the National Weather Service, the record low temperature in St. Louis is -22°F, set on 1/5/1884. The observed low this winter was -6°F, on 1/20/19: cold, but nowhere near the record. Of the 82 days from 12/1/18 through 2/20/19, 57 have a record low for the date of -5°F or colder. This year’s low temperatures have been nothing like that.

Figure 1. Data source: NOAA, National Weather Service, St. Louis Forecast Office.

Well, you may say, perhaps the low temperature has not set records, but on most days it has been lower than normal. Figure 1 shows the daily observed low temperature compared to the normal low temperature for that date. The blue line shows the observed temperature for 2018-19, and the red line shows the normal low temperature for that date. The chart suggests that for much of the winter, the low temperature in St. Louis has actually been above normal. There have been a few cold outbreaks, but not record cold. The observed low temperatures over the period this winter have averaged 27°F. The normal low temperatures over the period have averaged 26°F. So guess what? The average low temperature this year has been about a degree above normal.
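The averaging behind that comparison is simple to sketch in a few lines of Python. The numbers below are invented for illustration; they are not the actual St. Louis observations:

```python
# Hypothetical daily low temperatures (°F) for a stretch of winter,
# paired with the normal (historical mean) low for each date.
observed_lows = [30, 12, 25, 33, 28, 19, 35]
normal_lows = [26, 24, 25, 27, 24, 25, 24]

avg_observed = sum(observed_lows) / len(observed_lows)
avg_normal = sum(normal_lows) / len(normal_lows)

# A positive anomaly means the period averaged warmer than normal.
anomaly = avg_observed - avg_normal
print(f"Observed {avg_observed:.1f}°F vs normal {avg_normal:.1f}°F: anomaly {anomaly:+.1f}°F")
```

With these made-up numbers the observed lows average 26°F against a 25°F normal, an anomaly of +1°F: the same kind of result the St. Louis data shows.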

Figure 2. Data source: NOAA, National Weather Service, St. Louis Forecast Office.

Well, you may say, perhaps the low temperature has not been excessively low, but the daily high temperature has been colder than usual. It’s not the deep lows of the night that have gotten to us, it’s the fact that it hasn’t warmed during the day. Figure 2 shows the daily observed high temperatures for 2018-19 (blue line), and the normal high temperature for those dates (red line). The chart shows that during the cold outbreaks noted above, the high temperature has, indeed, been cooler than normal. But much of the winter has also had highs above normal. Over the period, the observed highs this winter have averaged 43°F, while the normal highs averaged 42°F.

Winter 2018-19 in Kansas City

The National Weather Service Office in Kansas City does not seem to publish a data series that contains information similar to the one published by the office in St. Louis. I have used, instead, data from the Climate-at-a-Glance data portal. This data does not include daily values, only monthly averages. Plus, it only extends through the end of January. January 19 was the coldest day of this winter, however, so it is included. Data collection began in 1972-73.

Figure 3. Source: Climate-at-a-Glance.

Figure 3 shows the data, with the blue line representing the observed values, and the gray line representing the average. The average temperature in Kansas City this winter was 2.5°F above normal.

The month of February to date can be included by using heating degree days instead of temperatures. Heating degree days are a measure of how much the interior of buildings will require heating. To calculate them for a day, average the day’s high and low temperatures, then subtract the result from 65; that is the day’s heating degree days. To measure a period of time, simply sum the heating degree days for each day in the period.
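As a sketch of that calculation (the temperatures below are made up, not Kansas City observations), and noting that by convention a warm day whose mean tops 65°F contributes zero rather than a negative number:

```python
def heating_degree_days(high_f, low_f):
    """One day's heating degree days: 65 minus the mean of the high and low.
    Floored at zero, so warm days don't subtract from the total."""
    mean_temp = (high_f + low_f) / 2
    return max(0.0, 65 - mean_temp)

# A hypothetical week of (high, low) temperatures in °F.
week = [(40, 22), (35, 18), (50, 31), (28, 10), (44, 30), (60, 45), (66, 50)]

# Summing over the days gives the period total -- the figure the
# climate summaries report for a month or a season.
total_hdd = sum(heating_degree_days(hi, lo) for hi, lo in week)
print(f"Total heating degree days for the week: {total_hdd}")
```

Note that the last day, with a high of 66°F and a low of 50°F, has a mean of 58°F and so still contributes 7 heating degree days; only a mean above 65°F contributes nothing.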

The problem here is that the data in the climate summaries, where the heating degree data is published, use a different period to determine normal than does the data above. The data above uses values that run from when record keeping started to the current date. The climate summaries use data from 1981-2010. It was around 1980 that the effects of climate change really kicked in. This results in different estimates of “normal,” with the climate summary referencing only recent (warmer) history, and the other data referencing much longer (cooler) periods of time.

That said, it is the only way I can think of to include February for Kansas City in this discussion, so this is what the data shows:

                      Observed Heating   Normal Heating
                      Degree Days        Degree Days      Difference
December 2018              928               1040            -112
January 2019              1135               1114             +21
February 1-20, 2019        731                657             +74


Looked at this way, December produced about 11% fewer heating degree days than normal, but January and February (to date) have produced about 2% and 11% more, respectively. If you sum the differences for the 3 months, the winter to date has produced 17 fewer heating degree days than normal, a trivial amount: in terms of heating degree days, Kansas City’s winter in 2018-19 should be understood to be roughly normal.
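Those percentages and the season total can be checked directly from the table’s values:

```python
# (observed, normal) heating degree days, copied from the table above.
months = {
    "December 2018": (928, 1040),
    "January 2019": (1135, 1114),
    "February 1-20 2019": (731, 657),
}

for name, (observed, normal) in months.items():
    diff = observed - normal
    print(f"{name}: {diff:+d} HDD, {100 * diff / normal:+.1f}% vs normal")

# A negative season total means fewer heating degree days than normal --
# that is, a slightly warmer-than-normal winter to date.
season_total = sum(obs - norm for obs, norm in months.values())
print(f"Season to date: {season_total:+d} HDD vs normal")
```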

Now, none of this speaks to snow or blizzards. I understand that the winter storm at the end of January was a terrible event. In a similar fashion, I was in Hawaii when the winter cyclone came ashore in early February. I saw whole fields of banana trees leveled, just snapped off mid-trunk. On the top of Mauna Kea, the wind was recorded at 190+ mph. None of that changes the fact, however, that Hawaii has a lovely climate, and it was a wonderful place to visit (although too crowded these days, I’d say). The same is true in St. Louis. This blog is more concerned with statistical trends than individual events, and none of the statistics suggest that this has been, on average, a freakishly cold winter.

I read that people who believe in climate change are being peppered with the question “If the Earth is warming so much, how come it is so cold?” Nobody ever said that climate change would banish all cold, and the predictions are for more intense storms, just like the ones referenced above. But the real answer seems to be that it isn’t actually so cold, at least not here in Missouri. The whole question is nothing but phony baloney.


NOAA National Centers for Environmental information, Climate at a Glance: U.S. Time Series, retrieved on February 21, 2019 from

NOAA, National Weather Service, Kansas City/Pleasant Hill Forecast Office. 2/21/2019. Daily Climate Report. For this post, I used reports for 12/31/2018, 1/31/2019, and 2/20/2019. Viewed online 2/21/2019

NOAA, National Weather Service, St. Louis Forecast Office. 2/21/2019. Climate Graphs. Data retrieved on 2/21/2019 from

2018 Was Wetter Than Usual in Missouri

2018 was the 3rd wettest year on record across the contiguous USA.

Figure 1. Source: NOAA Climate-at-a-Glance

So says data from Climate-At-A-Glance, the data portal operated by the National Oceanic and Atmospheric Administration (NOAA). Figure 1 shows the data, with the green line representing actual yearly precipitation, and the blue line representing the trend across time. The left vertical scale shows inches of precipitation, while the right shows millimeters. In 2018, the average precipitation across the contiguous USA was 34.62 inches, the 3rd highest amount in the record. Over time, precipitation seems to be increasing at about 0.18 inches per decade. The trend towards more precipitation is present in the Eastern Climate Region (+0.30 inches per decade), the Southern Climate Region (+0.24 inches per decade), and the Central Climate Region (+0.23 inches per decade). It is almost absent in the Western Climate Region, however (+0.02 inches per decade). In fact, 2018 was a below-average precipitation year in the West. (Except where noted, data is from the Climate-at-a-Glance data portal.)
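A trend quoted in inches per decade is just the slope of a least-squares line through the yearly values, scaled by ten. A minimal sketch with synthetic data (not the actual NOAA series):

```python
import numpy as np

# Synthetic yearly precipitation series: a long-term mean of about
# 29 inches, a built-in upward trend of 0.02 inches/year, plus noise.
rng = np.random.default_rng(seed=42)
years = np.arange(1895, 2019)
precip = 29.0 + 0.02 * (years - years[0]) + rng.normal(0.0, 2.0, years.size)

# Fit a straight line; for degree 1, polyfit returns (slope, intercept).
slope_per_year, intercept = np.polyfit(years, precip, 1)
trend_per_decade = 10 * slope_per_year  # Climate-at-a-Glance quotes trends per decade
print(f"Estimated trend: {trend_per_decade:+.2f} inches per decade")
```

With 124 years of data the fit recovers something close to the built-in +0.20 inches per decade; the year-to-year noise is much larger than the trend, which is exactly the situation described above for Missouri.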


Figure 2. Source: NOAA Climate-at-a-Glance.

In Missouri, 2018 was the 41st wettest year on record, with 43.04 inches of precipitation. (Figure 2) This puts the year 2.54 inches above the long-term average. As expected, the variation from year to year is much larger than the change in precipitation over time, but since 1895 Missouri has trended towards about 0.24 inches more precipitation per decade.

Unlike 2016 and 2017, 2018 did not bring epic flooding to Missouri. Perhaps the most notable thing about Missouri precipitation in 2018 was a pair of nearly out-of-season snow events, one over the Easter weekend in April and one in mid-November. The latter heralded what has been a very snowy winter so far in 2019 for Missouri and much of the Midwest.

Figure 3. Source: NOAA Climate-at-a-Glance.

The Northern Rockies and Plains are where most of the water that flows into the Missouri River originates, and the Missouri River provides water to more Missourians than any other source. This region saw 24.83 inches of precipitation in 2018, some 5.82 inches above average. (Figure 3) As expected, the variation between years is much larger than the change over time, but here, too, precipitation has been increasing, though the change has only been +0.07 inches per decade.

What to watch for in Missouri, then, does not appear to be a decrease in average yearly precipitation, but two other issues. First, demand for water has been increasing. Will it grow to outstrip the supply? Second, this winter notwithstanding, climate change is causing precipitation that once fell as snow to fall as rain. This changes the timing of when the Missouri River receives the runoff. Will that affect the ability of the river to supply water to meet the various demands? So far, these answers are not known. (For a more extended discussion, see here.)

Figure 4. Source: NOAA Climate-at-a-Glance.

The water situation in California is more serious than it is in the Northern Rockies and Plains, Missouri, or the contiguous USA as a whole. California’s precipitation is strongly seasonal, falling mostly in winter, and some regions receive a great deal of precipitation while others receive little, if any. Consequently, the state relies on snowfall during the winter, which runs off during the spring and early summer, and is collected into reservoirs. This water is then distributed around the state. Thus, the amount of water contained in the snowpack on April 1, which is when it historically started melting in earnest, has been seen as crucial to California’s water status.

After a big water year in 2017, 2018 returned to below-average precipitation. It was the 34th driest year on record, with precipitation 4.54 inches below average. (Figure 4)

As I reported previously, the California snow season started slowly this winter. It has been catching up, and is now nearly average for this date. In the Colorado River Basin above Lake Powell, the other major source of California’s water, the snowpack is above average, at 110% of the average for this date. (Natural Resources Conservation Service, 2/14/2019)


California Data Exchange Center, Department of Water Resources. Current Year Regional Snow Sensor Water Content Chart (PDF). Downloaded 1/22/2018 from

Mammoth Mountain Ski Area. 2018. Snow Conditions and Weather: Snow History. Viewed online 1/15/2018 at

Natural Resources Conservation Service, U.S. Department of Agriculture. Upper Colorado River Basin SNOTEL Snowpack Update Report. Viewed online 1/28/2018 at

NOAA National Centers for Environmental information, Climate at a Glance: U.S. Time Series, published January 2018, retrieved on January 15, 2018 from
