The number of natural disasters in the USA in 2018 declined from 2017, and the amount of damage they did also declined.
Since 1980, there have been 241 severe weather and climate events in the USA that have caused damages exceeding $1.6 trillion. The most important of these are the “billion-dollar disasters,” events that caused damages in excess of $1 billion. Figure 1 shows the data. The columns represent the number of events, with each color representing a different type of event. The black line represents a moving 5-year average number of events. The gray line represents the cost of damages, in billions of dollars.
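A moving average like the black line in Figure 1 smooths out year-to-year swings. As a rough sketch (using made-up yearly counts, not NCEI's actual data, and a trailing rather than centered window), a 5-year moving average can be computed like this:

```python
def moving_average(values, window=5):
    """Trailing moving average; returns None until a full window is available."""
    out = []
    for i in range(len(values)):
        if i + 1 < window:
            out.append(None)  # not enough history yet
        else:
            out.append(sum(values[i + 1 - window : i + 1]) / window)
    return out

# Hypothetical yearly counts of billion-dollar disasters (illustrative only)
counts = [3, 5, 4, 8, 6, 11, 14]
print(moving_average(counts))  # → [None, None, None, None, 5.2, 6.8, 8.6]
```

Each point averages the current year with the 4 before it, which is why the smoothed line lags sharp jumps in the raw counts.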
In 2018, there were 14 billion-dollar disasters. That is more than double the long-term average. The 4 years with the most billion-dollar disasters have all occurred in the last 8 years, and the last 3 years have ranked #1 (tie), #3, and #4. The last 3 years have been significantly higher than any other year except 2011. The increase comes primarily from severe storms, a category that excludes hurricanes, tropical cyclones, flooding, and winter storms; it covers tornadoes, thunderstorms, hail, and similar kinds of storms.
The estimated CPI-adjusted losses in 2018 were $91 billion. That’s quite a chunk of change, but it is much less than the costs in 2017 ($312.7 billion), 2005 ($220.8 billion), or 2012 ($128.6 billion). 2017 was a terrible year (see here): Hurricanes Harvey, Irma, and Maria struck that year, and not only was it far and away the most costly year, it also tied for the highest number of disasters. Hurricane Katrina struck in 2005, and Hurricane Sandy struck in 2012. Hurricane Katrina is still the single most damaging event, followed by Hurricanes Harvey and Maria. Last year, the most costly natural disaster was the series of wildfires in California, especially the Camp Fire. The fires caused an estimated $24 billion in damage.
Looking at Figure 1, starting somewhere around 2005, the number of billion-dollar disasters starts to trend upward. The amount of damage does, too, though the variation from year to year is much greater. The National Centers for Environmental Information, which maintains this data, adjusts the damage estimates using the Consumer Price Index, so the increase is probably not due to inflation.
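For readers who want the mechanics: a CPI adjustment simply rescales a past dollar amount by the ratio of the price index now to the price index then. A minimal sketch, using hypothetical index values rather than actual BLS figures:

```python
def cpi_adjust(nominal_dollars, cpi_then, cpi_now):
    """Convert a nominal dollar amount to current dollars using the CPI ratio."""
    return nominal_dollars * (cpi_now / cpi_then)

# Hypothetical CPI index values (illustrative, not actual BLS data):
# if the index rose from 200 to 250, $100 of old damage is $125 in today's dollars.
print(cpi_adjust(100.0, 200.0, 250.0))  # → 125.0
```

Because the damage figures are all expressed in the same (current) dollars, any remaining upward trend reflects something other than inflation.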
Three factors probably account for most of the increase. First, climate change has caused an increase in the amount of energy in the atmosphere, energy that is available to power more and bigger storms. I once calculated that the increased radiative forcing from climate change was equal to the energy output of 1.6 million nuclear power plants. (See here) That’s a lot of energy available to power storms. Second, climate change has caused an increase in droughts throughout the western United States. Even in years like this one, when there has been an abundance of winter snow, the warm temperatures cause the snow to melt earlier in the spring, and they dry out the land faster during the summer. The result is a tinder box, perfect conditions for huge wildfires. And finally, we keep putting ourselves in harm’s way. Development has increased along the Atlantic and Gulf Coasts, where it is in the path of any hurricane that comes ashore. Development has also increased along the fringes of forests, where it is vulnerable to wildfire. And even in the middle of the country, sprawl has increased the built-up area, making it more likely that a tornado will grind over development rather than farmland.
Missouri was in the region damaged by some of the big weather events of 2018, so the next post will look at how we fared here in the Show Me State.
The National Centers for Environmental Information Billion-Dollar Weather and Climate Disasters portal has 5 pages available, and I used them all for this post: Overview, Mapping, Time Series, Summary Stats, and Table of Events. Downloaded and viewed online 4/3/2019 at https://www.ncdc.noaa.gov/billions.
If you listen to the national media, especially the conservative media, you might think that interest in environmental protection has gone the way of the dinosaur, swept away by a national consensus to focus on economic growth at all costs.
Gallup Inc. is a global analytics organization that conducts what we all know as Gallup Polls. One of the questions it regularly includes in its polls is “With which one of these statements do you most agree – protection of the environment should be given priority, even at the risk of curbing economic growth (or) economic growth should be given priority, even if the environment suffers to some extent?” The results are shown in Figure 1. It is a chart from Gallup showing the percent of respondents choosing the environment over the economy, and vice-versa. To it I have added a line showing Republican and Democratic presidencies (red and blue, respectively).
In 1990, people chose environmental protection over economic growth 71% to 19%. That is a huge majority! Support for the environment weakened starting in 2000. By 2010, respondents chose the economy over the environment 53% to 38%. Since then, support for the environment has rebounded, with the most recent figures showing respondents picking the environment over the economy by 57% to 35%. (The percentages don’t sum to 100% because some people answer that they don’t have a preference, or they decline to answer the question.)
Even 57% to 35% is a large majority – a difference of 22 percentage points: for every 10 people who chose the economy, about 16 chose the environment. That margin rivals the largest in any American presidential election. Warren Harding’s 1920 victory had a margin of 26%, and some famous “landslides” involved margins of about 23% (Nixon over McGovern, 1972, and Johnson over Goldwater, 1964). Even the famous Reagan “landslide” of 1984 (Reagan over Mondale) was only 18%. The current Tweeter-in-Chief, due to a quirk in the electoral college system, was elected with a minority of the popular vote (a margin of about −2%).
Since 1998, Gallup has also asked respondents “Is the seriousness of global warming generally exaggerated, generally correct, or is it generally underestimated?” Figure 2 shows the results. The dark green line shows the percentage of people who think the seriousness of global warming is exaggerated. The black line shows the percentage who think its seriousness is portrayed about correctly. The gray line shows the percentage who think it is underestimated.
A minority of people think that the seriousness of global warming is correctly represented – that is constant across the whole time period. For much of the period, more people thought its seriousness was exaggerated than thought it was underestimated. In recent years, that has shifted, and now more people think it is underestimated (41%) than think it is exaggerated (33%).
Where would I fit on that last question? I think I would refuse to answer it. I feel that global warming is one of the most significant challenges facing humanity, but that it is a slow-motion catastrophe. Just as a simple example: if you go to Miami Beach or Lower Manhattan in 100 years, you are likely to find they are very different places, struggling to cope with flooding, sometimes severe, sometimes milder. However, that is a change that will unfold over the entire coming century, giving people lots of time to adapt and adjust. Thus, those who say the danger is fabricated are underestimating it. On the other hand, those who say an existential catastrophe is imminent are exaggerating it.
What is true is that the carbon that goes into the atmosphere stays there for nearly a century. Thus, if we don’t act quickly, we most likely doom ourselves to a change that will unfold over decades, and which we will be impotent to prevent.
As the two figures above illustrate, environmental concerns have NOT been swept away by our current president. Rather, he is acting to prevent Americans from addressing problems that they feel are important, even if addressing them involves some economic sacrifice.
Gallup Inc. 2018. In Depth: Topics A to Z: Environment. Downloaded 3/27/2019 from https://news.gallup.com/poll/1615/environment.aspx.
The last 2 posts have reported specifics on some of the toxic chemicals released into the environment in Missouri in 2017. This post will broaden the view and discuss trends in toxic releases over time.
To some extent it is difficult to follow trends over time because of changes in the TRI program. Figure 1 shows the national trend in chemicals reported to the EPA over time, and it labels the years in which major changes occurred in the program. The light blue and sandy yellow show the total amount of toxic chemicals reported; the dark blue and orange show the amount disposed of or released. The total amount of toxic chemicals reported to the EPA peaked in 2000 and decreased until 2009, the bottom of a recession. Over that period, the amount decreased by more than 40%. Since 2009, however, the amount has increased by about 50%. The chart shows that most toxic chemicals reported to the EPA are used in manufacturing (light blue).
It is beyond the scope of this blog to explore why toxic chemicals reported to EPA should drop so significantly, then rebound so significantly. If you know the answer, please comment on this post and let us all know.
The increase concerns me. This series of posts started with a review of several catastrophic releases of toxic chemicals that killed people and poisoned the land, in some cases permanently. Preventing the release of toxic chemicals means that no mistakes can ever be made, and that is simply not within human capability. I view toxic chemicals as similar to time bombs. Sooner or later, one will go off.
Of course, the amount of toxic chemicals reported to the EPA is not the same as the amount released: what about releases? Figure 2 shows on-site toxic releases in Missouri over time. Because on-site releases represent about 93% of toxic releases in Missouri, I will let this chart represent the trend in total releases. You can see that releases peaked in 2004-2005 and have been trending downward since then. In 2017 they were about 47% of those in 2005, slightly less than half.
Figure 3 shows the trend nationwide. The chart I have goes back only to 2007. Since then, the total amount of releases has decreased from 4.13 billion lb. to 3.85 billion lb., a decrease of about 7%.
Because Missouri has been losing manufacturing over time, the possibility exists that the decline in toxic releases comes from the decline in manufacturing. To look at this possibility, one would want to plot toxic releases against the amount of manufacturing in the state. But what is “the total amount of manufacturing” in a state? Is it the tonnage of goods produced? The economic value produced? The number of factories, or their total square footage? The number of people employed in manufacturing? I can find statistics for manufacturing employment in Missouri, so I will use it.
Figure 4 shows the trend over time for toxic releases in Missouri (the blue line, which should be read against the left vertical axis) and manufacturing employment (the red line, which should be read against the right vertical axis). The chart considers toxic releases, not toxics managed. The chart shows that toxic releases and manufacturing employment follow similar trajectories. The correlation between the 2 data series is 0.75, which is fairly high as correlations go.
Correlation, of course, proves nothing. There is a wonderful website dedicated to spurious and absurd correlations: for instance, per capita cheese consumption correlates with the number of people who die by becoming tangled in their bedsheets at 0.95 (check out http://www.tylervigen.com/spurious-correlations). But it seems logical that the decline in toxic releases could be at least partially related to the amount of manufacturing. It would be a wonderful study for some enterprising student.
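For anyone who wants to reproduce this kind of number, the Pearson correlation coefficient can be computed directly from two equal-length series. The values below are illustrative placeholders, not the actual TRI or FRED data:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))  # co-movement
    sx = sqrt(sum((x - mx) ** 2 for x in xs))               # spread of xs
    sy = sqrt(sum((y - my) ** 2 for y in ys))               # spread of ys
    return cov / (sx * sy)

# Hypothetical series: toxic releases and manufacturing employment (illustrative)
releases = [100, 95, 80, 70, 60]
employment = [300, 290, 270, 265, 250]
print(round(pearson_r(releases, employment), 2))
```

A value near +1 means the two series rise and fall together; it says nothing about which one, if either, drives the other.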
As I noted in the first post in this series, interpreting data in the TRI is complex. The most serious exposures to toxic chemicals probably happen to people who work with them regularly. You can’t assume that releases translate to public exposure. But you probably can infer the inverse: unless there is a release, the public can’t be exposed. These are poisonous chemicals. Lead, the most released chemical in Missouri, persists and accumulates in the environment and in the human body. I’m thankful that toxic releases have declined in Missouri, and I hope they continue to do so.
Environmental Protection Agency. 2018. 2017 TRI Factsheet: State – Missouri. Downloaded 3/7/2019 from https://iaspub.epa.gov/triexplorer/tri_factsheet.factsheet_forstate&pstate=MO&pyear=2017&pParent=TRI&pDataSet=TRIQ1pZip=&pCity=&pCounty=&pState=MO&pYear=2013&pDataSet=TRIQ2&pParent=NAT&pPrint=1.
Environmental Protection Agency. 2019a. Factors to Consider When Using Toxics Release Inventory Data. Downloaded 3/20/2019 from https://www.epa.gov/sites/production/files/2019-03/documents/factors_to_consider_march_2019.pdf.
Environmental Protection Agency. 2019b. Toxic Release Inventory: TRI National Analysis 2017. Downloaded 3/20/2019 from https://www.epa.gov/trinationalanalysis/report-sections-tri-national-analysis.
Federal Reserve Bank of St. Louis. FRED Economic Data: Manufacturing Employment in Missouri. FRED is a data portal accessed 3/7/2019 at http://research.stlouisfed.org/fred2/graph/?s[id]=MOMFGN.
The previous post discussed the EPA’s Toxics Release Inventory. In this post, I will update toxic waste data for Missouri for 2017, the most recent year for which data is available.
Toxics can be managed by (from most desirable to least) reducing their creation at the source, by recycling, by energy recovery, by treatment, or by disposing of and/or releasing them. Figure 1 shows how Missouri managed its toxic wastes in 2017. The largest amount was used to create energy (often this involves burning in a boiler). The next largest amount was recycled for reuse.
Toxic releases can occur either onsite at the industrial facility that uses them, or offsite at some toxic materials treatment and storage facility. Materials can be released into the air, they can be discharged into surface water, and they can be dumped on the land. Figure 2 shows Missouri toxic releases by release category in 2017. By far the largest percentage, 80%, is dumped on the land.
The industries responsible for releasing the largest amounts of toxic chemicals are shown in Figure 3. Metal mining was the largest, accounting for almost half of all toxic releases, and electric utilities were second. Releases from the food industry continue to be larger than those from the petroleum, chemicals, or plastics & rubber industries, and it continues to blow my mind! The 5 facilities that released the most toxic chemicals in 2017 were the Buick Mine/Mill, the Brushy Creek Mine/Mill, the Fletcher Mine/Mill, the Buick Resource Recycling Facility, and the Sweetwater Mine/Mill.
The chemicals that are most released in Missouri are shown in Figure 4. Compounds of lead, zinc, barium, and copper are the most released. The majority of these releases are probably associated with the metal mining industry and are released on the land. The top 5 chemicals released to air and water are shown in Figure 5. Hydrogen fluoride comes mostly from coal-burning power plants, while nitrate compounds come mostly from the food industry (think waste from animal feeding operations).
Chemicals that are persistent, bioaccumulative, and toxic (PBT) are of most concern to the EPA. These are chemicals that remain in the environment and in the body. They build up over time, meaning that repeated small releases can lead to big trouble. Lead accounts for more than 99% of PBT releases in Missouri.
Mercury compounds are also a PBT of concern. Coal-burning power plants are the largest source of mercury emissions in the United States. In 2017, mercury emissions in Missouri fell to 4,423 pounds, down from 9,850 pounds in 2013.
Two other classes of PBTs are polycyclic aromatics and dioxin/dioxin-like compounds. In 2017, 2,924 pounds of polycyclic aromatics were released in Missouri, along with 0.08 pounds of dioxin/dioxin-like compounds. That may not seem like much dioxin, but this chemical is toxic even in very, very small amounts (EPA 2015).
In the next post I will look at some toxic release trends over time.
Environmental Protection Agency. 2018. 2017 TRI Factsheet: State – Missouri. Downloaded 3/7/2019 from https://iaspub.epa.gov/triexplorer/tri_factsheet.factsheet_forstate&pstate=MO&pyear=2017&pParent=TRI&pDataSet=TRIQ1pZip=&pCity=&pCounty=&pState=MO&pYear=2013&pDataSet=TRIQ2&pParent=NAT&pPrint=1.
Environmental Protection Agency. 2015. Persistent Bioaccumulative Toxic (PBT) Chemicals Covered by the TRI Program. Viewed online 12/2/15 at http://www2.epa.gov/toxics-release-inventory-tri-program/persistent-bioaccumulative-toxic-pbt-chemicals-covered-tri.
Environmental Protection Agency. 2019. TRI Explorer. This is a data portal that allows you to retrieve data regarding toxic releases by state, county, and municipality. It contains information by chemical, by releasing facility, by geography, and by industry. It also allows you to retrieve the data in a time series over several years. Viewed online 3/13/2019 at https://iaspub.epa.gov/triexplorer/tri_release.chemical.
This post updates information on toxic chemical releases in Missouri and nationwide. The most recent data is through 2017.
Many industrial processes require the use of toxic substances. They must be properly handled to prevent harm to people, land, and water. During the 1970s and early 1980s, concerns grew about how toxic substances were being handled. For instance, tons of toxic waste were discovered dumped in the Love Canal neighborhood of Niagara Falls. Oil containing dioxin was sprayed on the streets of Times Beach, Missouri, turning it into a ghost town; people can’t live there to this day. In 1984, a malfunction at a chemical plant in Bhopal, India, released a cloud of poisonous gas that killed more than 3,000 people overnight, and 15,000–20,000 eventually (5-7 times as many as were killed in the 9/11 attacks). Shortly thereafter, a serious release of toxic gas occurred in Institute, West Virginia.
These concerns are hardly a thing of the past, however. The same plant in Institute, West Virginia exploded in 2008, killing 2 and injuring 8. In 2015, an accident at the Gold King Mine in Colorado released 3 million gallons of water contaminated with toxics such as cadmium, lead, and arsenic into Cement Creek (see Figure 1). Cement Creek flows into the Animas River, the only water source for several cities in Colorado and New Mexico.
Congress passed the Emergency Planning and Community Right-to-Know Act in 1986, and the Pollution Prevention Act in 1990. These laws require facilities to report releases, transfers, and waste management activities of toxic materials.
The Toxics Release Inventory (TRI) program of the EPA gathers this information and makes it available to the public on their website. In addition, they publish an annual report covering the whole country, plus fact sheets for each of the 50 states. The TRI data does not cover all toxic materials and all facilities, but it does cover an important set of them.
After being used, toxic substances can be managed or released into the environment. In decreasing order of preference, managing them can mean improving industrial processes to use less toxic material to start with, recycling them, burning them to generate electricity, or treating them to make them less toxic. Where toxic materials are not managed, they can be injected into wells, stored, landfilled, emitted into the air, discharged into surface water, or spread over the land. They can be handled either on-site or off-site. Determining whether any of these activities represent a potential hazard to people, land, or water is complex. One cannot simply assume, for instance, that on-site means safe. On the other hand, one cannot assume that emission or discharge of the substance means that there is toxic exposure. The statistics in the TRI are only a starting point, and many factors must be taken into consideration when analyzing TRI data.
In 2017, 502 facilities in Missouri were covered by the Toxic Release Inventory. That’s down from 521 in 2013. Nationwide, 21,456 facilities were covered by the Toxic Release Inventory. That’s down from 21,707 in 2013. Figure 2 maps the number of sites within each county in Missouri. On the map at the TRI website, clicking on the green circle will allow you to access more detailed information for that county. Unfortunately, the TRI website does not seem to have this map available for download in a form that labels the counties. The counties with the most sites are Jackson County (42, down from 45 in 2013), Greene County (30, up from 27 in 2013), and Franklin and Jasper Counties (each with 21).
Having the most TRI sites does not necessarily mean the most toxic releases. One reason is that by far the most toxic waste is managed. Figures 3 and 4 show the data for Missouri and for the United States. About 88% of Missouri’s toxics were managed in 2017; only 12% were released. For the United States as a whole, a slightly higher percentage is managed (89%), but really, the percentages are similar. Even though only 12% of toxic materials are released in Missouri, that still amounts to 53 million lb.
In the following posts I’ll look into the releases in more detail.
Environmental Protection Agency. 2018. 2017 TRI Factsheet: State – Missouri. Downloaded 3/7/2019 from https://iaspub.epa.gov/triexplorer/tri_factsheet.factsheet_forstate?&pstate=MO&pyear=2017&pParent=TRI&pDataSet=TRIQ1pZip=&pCity=&pCounty=&pState=MO&pYear=2013&pDataSet=TRIQ2&pParent=NAT&pPrint=1.
A recent article by Julie Turkewitz in the New York Times reports that a group of chemicals called per- and polyfluoroalkyl substances (PFAS) have leached into at least 55 drinking water systems at military bases around the globe. The contaminated water may be causing illness in those drinking it, including tumors, thyroid problems, and debilitating fatigue.
The problem is not confined to military bases. As many as 10 million people could be drinking water laced with PFAS, according to the article.
Per- and polyfluoroalkyl substances (PFAS) are a group of man-made chemicals that are used for a large number of purposes around the globe, and they are present in many common consumer items. “There is evidence that exposure to PFAS can lead to adverse human health effects,” as the Environmental Protection Agency puts it. (Environmental Protection Agency, 2018) They are not on the list of toxic chemicals monitored by the EPA, however. After an outcry and concerted campaign by public health scientists, the EPA has agreed to do a comprehensive study of the human health effects of exposure to the chemicals, and to survey exposure levels across the country. (Davenport, 2019)
On the other hand, PFAS represent $19.7 billion in sales to chemical manufacturers, according to another article. (Lipton and Abrams, 2015) DuPont, one of the primary manufacturers, maintains that years of study and experience have proven that the chemicals are safe for their intended use. (Davenport, 2019)
Of course, leaching into the public water supply is not one of their intended uses.
Well, this blog is not about looking into the controversy around this group of chemicals. I thought I’d look and see if I could find any databases that document exposure to them. Lo and behold, I did. The EPA’s report on The Third Unregulated Contaminant Monitoring Rule (UCMR 3): Data Summary, January 2017 contains some information. The National Environmental Public Health Tracking Network contains more.
The EPA report includes PFAS results only from public water systems serving fewer than 10,000 people. Thus, it represents a small fraction of all public water systems, and an even smaller fraction of the number of people served by public water systems. The data mapped by the National Environmental Public Health Tracking Network, on the other hand, includes many of the largest public water systems and covers a substantial portion of the population, at least in Missouri.
In both sets of data, the chemicals in the group are listed individually, and it is not clear how much the water systems contaminated by one overlap those contaminated by another. Finally, for several of the chemicals no reference level has been determined. Think of a reference level as something similar to a maximum safe exposure level. Some of these chemicals have not yet been studied that way; that’s why the EPA is undertaking the study mentioned above.
Table 1 lists the chemicals and the number of public water systems in which they were detected from the EPA report:
Table 1. Public Water Systems Contaminated with PFAS:
| Chemical | Number of Water Systems in Which It Was Detected |
| --- | --- |
Data Source: Environmental Protection Agency, 2017.
Figures 1-3 map the locations of water systems where PFAS were detected, from the National Environmental Public Health Tracking Network. Each map is labelled with the specific chemical being mapped. The orange dots each show a single system; the gray dots with numbers show locations where more than one water system had PFAS, but the scale of the map is too small to show them separately. The number inside the gray circle shows the number of water systems in that locale where the chemical was detected.
Although 87 public water systems in Missouri were sampled, none reported detectable PFAS. The sample covered public water systems that served 3,802,254 people, and included the City of St. Louis Public Water System, the Missouri American St. Louis-St. Charles County Water System, the Kansas City Public Water System, 2 Jackson County Public Water Systems, the Springfield Public Water System, as well as many other large public water systems. The sampling included the Ft. Leonard Wood water system, and no contamination was detected.
For right now, this data seems to suggest that the public drinking water in Missouri may not be contaminated with these chemicals. I wouldn’t say this is the last word, however. The EPA study will hopefully give us a more comprehensive analysis of what the health effects of these chemicals are, how much exposure of what kind is safe, and how much contamination is out there.
Centers for Disease Control. 2019. National Environmental Public Health Tracking Network. Viewed online 2/22/2019 at https://ephtracking.cdc.gov/DataExplorer/#/.
Davenport, Coral. 2019. “E.P.A. Will Study Limits on Cancer-Linked Chemicals. Critics Say the Plan Delays Action.” The New York Times, 2/14/2019. Viewed online 2/22/2019 at https://www.nytimes.com/2019/02/14/climate/epa-chemical-plan-pfas.html?module=inline.
Environmental Protection Agency. 2018. Basic Information on PFAS. Viewed online 2/22/2019 at https://www.epa.gov/pfas/basic-information-pfas.
Environmental Protection Agency. 2017. The Third Unregulated Contaminant Monitoring Rule (UCMR 3): Data Summary, January 2017. Viewed online 2/22/2019 at https://www.epa.gov/sites/production/files/2017-02/documents/ucmr3-data-summary-january-2017.pdf.
Lipton, Eric, and Rachel Abrams. 2015. “Commonly Used Chemicals Come Under New Scrutiny.” The New York Times, 5/1/2015. Viewed online 2/22/2019 at https://www.nytimes.com/2015/05/01/business/commonly-used-chemicals-come-under-new-scrutiny.html?module=inline.
This week I returned to St. Louis after being out of town for some time. I was greeted by a chorus of moans and groans about the horrible winter. Such kvetching! Of course, it is easy for somebody who has been in warmer climes to pooh-pooh the harshness of the winter back home. So, I decided to look and see what the statistics say, and since that is the focus of this blog, to do a post on what I found. I’m going to look at the winter in St. Louis and in Kansas City. For weather statistics, winter begins December 1st and ends February 28th (or 29th in leap years). I’m writing on February 21, so the data for this winter extends only through 2/20/2019. One final note: for grammatical reasons, in what follows, “normal” means historical average (mean).
The weather service office in each location keeps its data in slightly different formats, so I will do one, then the other.
Winter 2018-2019 in St. Louis
First, let’s ask whether it has been excessively cold in St. Louis this winter. According to the National Weather Service, the record low temperature in St. Louis is -22°F, set on 1/5/1884. The observed low this winter was -6°F, on 1/20/19: cold, but nowhere near the record. Of the 82 days from 12/1/18 through 2/20/19, 57 have a record low for the date of -5°F or colder. This year, the low temperature has come nowhere near those marks.
Well, you may say, perhaps the low temperature has not set records, but on most days it has been lower than normal. Figure 1 shows the daily observed low temperature compared to the normal low temperature for that date. The blue line shows the observed temperature for 2018-19, and the red line shows the normal low temperature for that date. The chart suggests that for much of the winter, the low temperature in St. Louis has actually been above normal. There have been a few cold outbreaks, but not record cold. The observed low temperatures over the period this winter have averaged 27°F. The normal low temperatures over the period have averaged 26°F. So guess what? The average low temperature this year has been about a degree above normal.
Well, you may say, perhaps the low temperature has not been excessively low, but the daily high temperature has been colder than usual. It’s not the deep lows of the night that have gotten to us, it’s the fact that it hasn’t warmed during the day. Figure 2 shows the daily observed high temperatures for 2018-19 (blue line) and the normal high temperatures for those dates (red line). The chart shows that during the cold outbreaks noted above, the high temperature has, indeed, been cooler than normal. But much of the winter has also had highs above normal. Over the period, the observed highs this winter have averaged 43°F, while normal highs have averaged 42°F.
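The comparisons above boil down to averaging each day's departure from normal over the period. A quick sketch, using a hypothetical week of daily lows rather than the actual NWS series:

```python
def mean_departure(observed, normal):
    """Average difference (degrees F) between observed and normal daily temperatures.

    Positive means warmer than normal on average; negative means colder."""
    assert len(observed) == len(normal)
    return sum(o - n for o, n in zip(observed, normal)) / len(observed)

# Hypothetical week of daily lows (illustrative, not actual NWS data)
observed_lows = [20, 25, 31, 18, 27, 33, 29]
normal_lows = [24, 24, 25, 25, 26, 26, 27]
print(round(mean_departure(observed_lows, normal_lows), 1))
```

A few sharply cold days can coexist with a period average that is at or above normal, which is exactly the pattern the St. Louis data shows.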
Winter 2018-19 in Kansas City
The National Weather Service Office in Kansas City does not seem to publish a data series that contains information similar to the one published by the office in St. Louis. I have used, instead, data from the Climate-at-a-Glance data portal. This data does not include daily values, only monthly averages. Plus, it only extends through the end of January. January 19 was the coldest day of this winter, however, so it is included. Data collection began in 1972-73.
Figure 3 shows the data, with the blue line representing the observed values, and the gray line representing the average. The average temperature in Kansas City this winter was 2.5°F above normal.
The month of February to date can be included by using heating degree days instead of temperatures. Heating degree days are a measure designed to indicate how much heating the interior of buildings will require. To calculate them for a day, average the day’s high and low temperatures, then subtract the result from 65; that is how many heating degree days there were on that day (a day whose mean is above 65 counts as zero). To measure a period of time, simply sum the heating degree days for each day in the period.
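The calculation described above can be sketched in a few lines of Python. (Official NWS tallies may round daily means differently, so treat this as an approximation.)

```python
def heating_degree_days(high_f, low_f, base=65.0):
    """HDD for one day: base temperature minus the day's mean; never negative."""
    return max(0.0, base - (high_f + low_f) / 2)

# A day with a high of 40°F and a low of 20°F has a mean of 30°F:
print(heating_degree_days(40, 20))  # 65 - 30 = 35.0 heating degree days

# For a period, sum the daily values ((high, low) pairs, illustrative):
days = [(40, 20), (50, 30), (70, 50)]
print(sum(heating_degree_days(h, l) for h, l in days))
```

Warm days contribute nothing (the mean can't push the total below zero), so a period total is driven entirely by its cold days.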
The problem here is that the data in the climate summaries, where the heating degree data is published, use a different period to determine normal than does the data above. The data above uses values that run from when record keeping started to the current date. The climate summaries use data from 1981-2010. It was around 1980 that the effects of climate change really kicked in. This results in different estimates of “normal,” with the climate summary referencing only recent (warmer) history, and the other data referencing much longer (cooler) periods of time.
That said, it is the only way I can think of to include February for Kansas City in this discussion, so this is what the data shows:
| Period | Observed Heating Degree Days | Normal Heating Degree Days | Difference |
|---|---|---|---|
| February 1-20, 2019 | 731 | 657 | +74 |
Looked at this way, it would appear that December created about 11% fewer degree days than normal, but January and February (to date) have created about 2% and 11% more, respectively. If you sum the differences for the 3 months together, then the winter to date has created 17 more heating degree days than normal, a trivial amount: in terms of heating degree days, Kansas City’s winter in 2018-19 should be understood to be roughly normal.
Now, none of this speaks to snow or blizzards. I understand that the winter storm at the end of January was a terrible event. In a similar fashion, I was in Hawaii when the winter cyclone came ashore in early February. I saw whole fields of banana trees leveled, just snapped off mid-trunk. On the top of Mauna Kea, the wind was recorded at 190+ mph. None of that changes the fact, however, that Hawaii has a lovely climate, and it was a wonderful place to visit (although too crowded these days, I’d say). Same in St. Louis. This blog is more concerned with statistical trends than individual events, and none of the statistics suggest that this has been, on average, a freakishly cold winter.
I read that people who believe in climate change are being peppered with the question “If the Earth is warming so much, how come it is so cold?” Nobody ever said that climate change would banish all cold, and the predictions are for more intense storms, just like the ones referenced above. But the real answer seems to be that it isn’t actually so cold, at least not here in Missouri. The whole question is nothing but phony baloney.
NOAA National Centers for Environmental Information, Climate at a Glance: U.S. Time Series, retrieved on February 21, 2019 from http://www.ncdc.noaa.gov/cag.
NOAA, National Weather Service, Kansas City/Pleasant Hill Forecast Office. 2/21/2019. Daily Climate Report. For this post, I used reports for 12/31/2018, 1/31/2019, and 2/20/2019. Viewed online 2/21/2019 https://w2.weather.gov/climate/index.php?wfo=eax.
NOAA, National Weather Service, St. Louis Forecast Office. 2/21/2019. Climate Graphs. Data retrieved on 2/21/2019 from https://www.weather.gov/lsx/cliplot.
2018 was the 3rd wettest year on record across the contiguous USA.
So says data from Climate-at-a-Glance, the data portal operated by the National Oceanic and Atmospheric Administration (NOAA). Figure 1 shows the data, with the green line representing actual yearly precipitation, and the blue line representing the trend across time. The left vertical scale shows inches of precipitation, while the right shows millimeters of precipitation. In 2018, the average precipitation across the contiguous USA was 34.62 inches, which was the 3rd highest amount in the record. Over time, precipitation seems to be increasing at about 0.18 inches per decade. The trend towards more precipitation is present in the Eastern Climate Region (+0.30 inches per decade), the Southern Climate Region (+0.24 inches per decade), and the Central Climate Region (+0.23 inches per decade). It is almost absent in the Western Climate Region, however (+0.02 inches per decade). In fact, 2018 was a below-average precipitation year in the West. (Except where noted, data is from the Climate-at-a-Glance data portal.)
(Click on figure for larger view.)
In Missouri, 2018 was the 41st wettest year on record, with 43.04 inches of precipitation. (Figure 2) This puts the year 2.54 inches above the long-term average. As expected, the variation from year-to-year is much larger than the change in precipitation over time, but since 1895 Missouri has trended towards about 0.24 inches more precipitation per decade.
Unlike 2016 and 2017, 2018 did not bring epic flooding to Missouri. Perhaps the most notable thing about Missouri precipitation in 2018 was a pair of almost out-of-season snow events – one over the Easter weekend in April, and one in mid-November. The latter heralded what has been a very snowy winter so far in 2019 for Missouri and much of the Midwest.
The Northern Rockies and Plains are where most of the water that flows into the Missouri River originates, and the Missouri River provides water to more Missourians than any other source. This region saw 24.83 inches of precipitation in 2018, some 5.82 inches above average. (Figure 3) As expected, the variation between years is much larger than the change over time, but here, too, precipitation has been increasing, though the change has only been +0.07 inches per decade.
What to watch for in Missouri, then, does not appear to be a decrease in average yearly precipitation, but two other issues. First, demand for water has been increasing. Will it grow to outstrip the supply? Second, this winter notwithstanding, climate change is causing precipitation that once fell as snow to fall as rain. This changes the timing of when the Missouri River receives the runoff. Will that affect the ability of the river to supply water to meet the various demands? So far, these answers are not known. (For a more extended discussion, see here.)
The water situation in California is more serious than it is in the Northern Rockies and Plains, Missouri, or the contiguous USA. California has a Mediterranean precipitation pattern, with most precipitation falling in winter, and it has regions that receive a great deal of precipitation while other regions receive little, if any. Consequently, the state relies on snowfall during the winter, which runs off during the spring and early summer, and is collected into reservoirs. This water is then distributed around the state. Thus, the amount of water contained in the snowpack on April 1, which is when it historically started melting in earnest, has been seen to be crucial to California’s water status.
After a big water year in 2017, 2018 returned to below-average precipitation. It was the 34th driest year on record, with precipitation 4.54 inches below average. (Figure 4)
As I reported previously, the California snow season started slowly this winter. It has been catching up, and is now nearly average for this date. The snowpack in the Colorado River Basin above Lake Powell, the other major source of California’s water, is above average, at 110% of the average for this date. (Natural Resources Conservation Service, 2/14/2019)
California Data Exchange Center, Department of Water Resources. Current Year Regional Snow Sensor Water Content Chart (PDF). Downloaded 1/22/2018 from https://cdec.water.ca.gov/water_cond.html.
Mammoth Mountain Ski Area. 2018. Snow Conditions and Weather: Snow History. Viewed online 1/15/2018.
Natural Resource Conservation Service, U.S. Department of Agriculture. Upper Colorado River Basin SNOTEL Snowpack Update Report. Viewed online 1/28/2018 at https://wcc.sc.egov.usda.gov/reports/UpdateReport.html?textReport.
NOAA National Centers for Environmental information, Climate at a Glance: U.S. Time Series, published January 2018, retrieved on January 15, 2018 from http://www.ncdc.noaa.gov/cag.
2018 was the 4th warmest year on record globally, and the 14th warmest for the contiguous USA.
Figure 1 shows the average annual temperature for the Earth for each year from 1880-2018. The chart shows the temperature as an anomaly. That means that they calculated the mean annual temperature for the whole series, and then presented the data as a deviation from that mean. Degrees Celsius are on the left vertical axis, and degrees Fahrenheit are on the right. Because the earth contains very hot regions near the equator and very cold polar regions, the actual mean temperature has relatively little meaning, and Climate-at-a-Glance does not include it in their chart. (All data is from NOAA, Climate at a Glance.) 2016 was the highest on record, and the 4 highest readings have all occurred within the last 4 years. You can see that the Earth appears to have been in a cooling trend until around 1910, then a warming trend until mid-century, then a cooling period until the late 1960s or early 1970s, and then a warming period since 1970. Over the whole series, the warming trend has been 0.07°C per decade, which equals 0.13°F per decade. Since 1970, however, the warming has accelerated to 0.17°C per decade (0.30°F).
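The anomaly calculation described above is simple to express in code. Here is a minimal Python sketch; the temperature series is a made-up illustration, not the actual NOAA data:

```python
def anomalies(series):
    """Express each value as its deviation from the mean of the whole series."""
    mean = sum(series) / len(series)
    return [round(x - mean, 2) for x in series]

# Hypothetical mean annual temperatures in °C; the series mean is 14.1
temps = [13.8, 13.9, 14.0, 14.3, 14.5]
print(anomalies(temps))  # [-0.3, -0.2, -0.1, 0.2, 0.4]
```

Positive values indicate years warmer than the series mean, negative values cooler, which is exactly how the chart in Figure 1 is read.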
(Click on chart for larger view.)
Figure 2 shows the average yearly temperature for the contiguous United States from 1895 to 2018. In this chart and those that follow, the vertical axes are swapped relative to Figure 1, with °F on the left vertical axis and °C on the right. The purple line shows the data, and the blue line shows the trend. 2018 was the 14th highest in the record at 54.58°F. The 4 highest readings have all come within the last decade. Over the whole series, the average temperature has increased 0.15°F per decade. Since 1970, however, the rate has increased substantially.
Figure 3 shows the average yearly temperature across Missouri. In 2018, the state saw its 35th warmest year on record, with an average temperature of 55.2°F. In Missouri, the warming trend from 1930-1950 was more marked than it was nationally; across the whole time period, the trend has been for a 0.1°F increase in temperature each decade. As was the case nationally, since 1970 the increase has accelerated.
Because conditions in the Northern Rockies and Plains affect how much water flows into the Missouri River, which provides more of Missouri’s water supply than any other source, I have also tracked climate statistics for that region. Figure 4 shows the data. Last year was slightly above average for this region. This region has been warming at a rate of 0.2°F per decade over the whole period, and, since 1970, the rate has accelerated substantially.
Because I have been concerned about the water supply in California, I also track the climate statistics for that state. Figure 5 shows the data. Last year was the 3rd warmest year in the record, with an average temperature of 60.2°F. California has been warming at a rate of 0.2°F each decade. Since 1970 the rate of increase has accelerated substantially.
In all 4 locations the average yearly temperature seems to have increased significantly for several decades, then paused during mid-century, and then resumed climbing, but at an accelerated rate. There seems to be little doubt that across the country it is warmer than it was. In Missouri, the average yearly temperature has been increasing, but at a rate that is somewhat less than in the other locations I looked at.
NOAA National Centers for Environmental Information, Climate at a Glance. Retrieved on February 1, 2019, from http://www.ncdc.noaa.gov/cag.
Followers of this blog know that I usually report on the snowpack conditions in California once or twice during the winter. I do this because I have family living in California, because California accounts for a large share of the nation’s economy, and because it provides a large share of the fruits and vegetables we eat.
California depends on its snowpack for about 30% of its water supply. Climate change is projected to reduce the California snowpack by as much as 40%, putting the state’s water supply at risk. The snowpack is projected to decline mostly because the increased temperature will cause precipitation to fall as rain instead of snow, and because it will cause increased melting during the winter months.
The snow year got off to a slow start in California this year. The January 24 measurements showed the snowpack at 56-63% of average for that date, depending on which hydrologic region was measured. (See Figure 1.)
(Click on figure for larger view.)
The snowpack is below average despite the fact that California has had significant precipitation. November and December were slow, but at Mammoth Mountain, a ski resort located in the Sierra Nevada, 93 inches of snow have fallen this January. That puts the total for the winter within 2 inches of average for this date, and the month still has 4 days to go as I write. (See Figure 2.)
Because California receives most of its water during the winter, but needs it most during the summer, the state operates many reservoirs. The water content of the most important is shown in Figure 3. The Shasta, Trinity, and Oroville reservoirs are the three largest, and thus the most important. In general, the reservoirs appear to be close to normal levels for this time of year. The one exception is the Oroville Reservoir, which is at 39% of capacity. This reservoir was severely damaged in 2017, when storms forced emergency releases that eroded away major portions of the dam’s spillways. Repairs were completed last fall, but the reservoir has yet to refill. Whether it will ever refill given California’s reduction in snowpack is an interesting question to which I don’t know the answer. Oroville is the major water source for the California State Water Project, and, thus, it is important.
The Colorado River is another important source of water for California, and Lake Mead is the principal reservoir at which its level is measured. It is at 1085.21 ft. above sea level, or 40% of its capacity. For this date the average is 1159 ft. above sea level, and the historical low was 1083.46 ft. above sea level, reached in 2016. Lake Mead can be recharged with water from Lake Powell, and that reservoir is also at 40% of capacity. It is usual for these reservoirs to be low during the late fall, and then recharge during the winter. However, Lake Mead continues to flirt with historical lows and with the level at which mandatory water restrictions go into effect.
The bottom line here is that California’s water supply continues to be below historical levels, though not quite as low as during the terrible drought a few years ago. As of right now, signs do not point to a severe water crisis this year, but the state continues to walk a rather fine line. Should drought recur, a severe crisis is likely.
Alexander, Kurtis. 2018. “Oroville Dam Fixed and Ready to Go, Officials Say – But at a Big Price.” San Francisco Chronicle. 10/31/2018.
California Data Exchange Center. California Snowpack Water Content, January 24, 2019, Percent of April 1 Level. Downloaded 1.27.2019 from https://cdec.water.ca.gov/reportapp/javareports?name=PLOT_SWC.pdf.
California Department of Water Resources. Current Reservoir Conditions. Downloaded 1/27/2019 from http://cdec.water.ca.gov/reportapp/javareports?name=rescond.pdf.
Lake Mead Water Database. Viewed online 1/27/2019 at http://lakemead.water-data.com.
Mammoth Mountain Ski Resort. Snow & Weather Report. Viewed online 1/27/2019 at https://www.mammothmountain.com/winter/mountain-information/mountain-information.