Lately, it feels like we’re hearing about wildfires erupting in the western United States more often. But how have wildfire occurrences changed over the decades?
Researchers with the NASA-funded Rehabilitation Capability Convergence for Ecosystem Recovery (RECOVER) project have analyzed more than 40,000 fires from Colorado to California between 1950 and 2017 to learn how wildfire frequency, size, location, and a few other traits have changed.
Here are six trends they have observed in the western United States:
1. There are more fires.
Over the past six decades, there has been a steady increase in the number of fires in the western U.S. In fact, the majority of western fires—61 percent—have occurred since 2000 (shown in the graph below).
Those fires are also burning more acres of land. The average annual number of acres burned has increased steadily since 1950. The number of megafires—fires that burn more than 100,000 acres (156 square miles)—has increased in the past two decades. In fact, no documented megafires occurred before 1970.
Source: NASA RECOVER / Keith Weber
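The megafire threshold is easy to sanity-check: at 640 acres per square mile, 100,000 acres works out to just over 156 square miles. A quick sketch (the conversion factor is the standard survey definition; the function name is ours):

```python
# Convert a burned area in acres to square miles.
# One square mile is exactly 640 acres (standard survey definition).
ACRES_PER_SQ_MILE = 640

def acres_to_sq_miles(acres):
    return acres / ACRES_PER_SQ_MILE

# The megafire threshold used in the analysis: more than 100,000 acres.
print(acres_to_sq_miles(100_000))  # 156.25, i.e. about 156 square miles
```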
2. Warming temperatures are likely fueling the increase.
The recent increase in fire frequency and size is likely related to several factors, including the rise in global temperatures since the start of the new millennium. Seventeen of the 18 warmest years on record have occurred since 2001.
Global temperatures can affect local fire conditions. Amber Soja, a wildfire expert at NASA’s Langley Research Center, said fire-weather conditions—high temperatures, low relative humidity, high wind speed, and low precipitation—can increase dryness and make vegetation in the west easier to burn. “Those fire conditions all fall under weather and climate,” said Soja. “The weather will change as Earth warms, and we’re seeing that happen.”
3. A small percentage of the West has burned.
Even though fire frequency and size have increased, only a small percentage of western lands—11 percent—has burned since 1950. In this map, wildfires are shown in orange. Private lands are shown in purple, while public lands are clear (no color). The location of wildfires was random; that is, there was no bias toward fires affecting private or public land.
Keith Weber, a professor at Idaho State University who led the analysis, was surprised at the 11 percent figure. There’s no clear reason yet why more of the region hasn’t burned. “Some of the 89% may not burn because it has low susceptibility—not dry enough or it has low fuel (vegetation),” said Weber. “Some areas may be really ripe for a fire, but they have not had an ignition source yet.”
4. The same areas keep burning.
How can only 11 percent of the West have burned while the annual number of acres burned and the frequency of fires increased? It turns out that many fires are occurring in areas that have already experienced fires, a phenomenon known as burn-on-burn effects. About 3 percent of western lands—almost a third of the burned land—has seen repeated fire activity.
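The arithmetic behind that "almost a third" is straightforward: 3 percent of the West out of the 11 percent that burned comes to a bit over a quarter of all burned land. A rough check, using the percentages from the article:

```python
# Fractions from the article: 11% of western lands have burned since 1950,
# and 3% of western lands have burned more than once.
burned_fraction = 0.11
reburned_fraction = 0.03

# Share of the burned land that has seen repeated fire activity.
share_of_burned = reburned_fraction / burned_fraction
print(round(share_of_burned, 2))  # 0.27 -- a bit over a quarter
```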
The map here shows the locations of repeated fire activity. While you can’t see it at this map’s resolution, some areas have experienced as many as 11 fires since 1950. In those areas, fires occurred about every seven years, said Weber, which is about the amount of time it takes for an ecosystem to build up enough vegetation to burn again.
5. Recent fires are burning more coniferous forests than other types of landscape.
Since 2000, wildfires have shifted from burning shrublands to burning conifers. The Southern Rocky Mountains Ponderosa Pine Woodland landscape has experienced the most acres burned—more than 3 million.
The reason might lie with the tree species. Ponderosa Pine is a fire-adapted species. With its thick, flaky bark, the tree can withstand low-intensity surface fires. It also drops its lower branches as it ages, which deters fire from climbing the tree and burning its green needles. “The fire will remove forest undergrowth, but will be just fine for the pines,” said Weber. “We are starting to see Ponderosa Pines thrive in those areas.”
Source: National Park Service
6. Wildfires are going to have a big impact on our future.
Research suggests that global warming will increase the number of very large fires (more than 50,000 acres) in the western United States by the middle of the century (2041-2070).
The map below shows the projected increase in the number of “very large fire weeks”—periods where conditions will be conducive to very large fires—by mid-century (2041-2070) compared to the recent past (1971-2000). The projections are based on scenarios where carbon dioxide emissions continue to increase.
Wildfires are expected to further stress our nation’s “aging and deteriorating infrastructure.”
Smoke from wildfires is expected to impair outdoor recreational activities.
Wildfires on rangelands are expected to disrupt U.S. agricultural productivity, challenging livestock health, reducing crop yields and quality, and threatening food security and price stability.
Increased wildfire activity is “expected to decrease the ability of U.S. forests to support economic activity, recreation, and subsistence activities.”
More about the source data:
Unless otherwise stated in the article, these data come from NASA’s Rehabilitation Capability Convergence for Ecosystem Recovery. RECOVER is an online mapping tool that pulls together data on 26 different variables useful for fire managers, such as burn severity, land slope, vegetation, soil type, and historical wildfires. In the past, fire managers might need several days or weeks to assemble and present such a large amount of information. RECOVER does so in five minutes, with the help of sophisticated server technologies that gather data from a multitude of sources. Funded by NASA’s Applied Sciences Program, RECOVER provides these data on specific fires to help fire managers start rehabilitation plans earlier and implement recovery efforts quickly.
In October 2018, the Intergovernmental Panel on Climate Change (IPCC) released yet another sobering report about the planetary disruption happening because of all the carbon human activity puts into the atmosphere.
Many parts of the world are already seeing rising sea levels, hotter temperatures, more extreme precipitation and droughts, more acidic oceans, and faster rates of extinctions, the scientists said. And without dramatic reductions in carbon emissions to keep warming below 1.5 degrees Celsius, the problems are going to get far worse.
In just one sentence, Gavin Schmidt, director of NASA’s Goddard Institute for Space Studies, captured the essence of the report. “The key thing to remember is that it’s clear that the best time to have reduced emissions was 25 years ago,” he said during an interview with PBS News Hour. “But the second best time to reduce emissions is right now.”
Like any good scientist, Schmidt is always quick to give credit where credit is due. In this case, he noted that what he said on television was a riff on something that renowned Kenyan marathoner Eliud Kipchoge once said. “The best time to plant a tree was 25 years ago. The second-best time to plant a tree is today,” The New York Times quoted the marathon world record holder as saying.
If you are wondering what the IPCC authors based their findings on, there is no shortage of information to explore. Each chapter of the report has a supplementary information section with dozens of references that detail the evidence the scientists used to draw their conclusions.
According to an ongoing temperature analysis conducted by scientists at NASA’s Goddard Institute for Space Studies, the average global temperature on Earth has increased by about 0.8° Celsius (1.4° Fahrenheit) since 1880. Two-thirds of the warming has occurred since 1975, at a rate of roughly 0.15-0.20°C per decade.
A posthumous plea from Sellers arrived in July 2018 in the form of an article in PNAS. The topic was one that he cared deeply about: building a better space-based system for observing and understanding the carbon cycle and its climate feedbacks.
As NASA’s Patrick Lynch reported, Sellers wrote the paper with colleagues at NASA’s Jet Propulsion Laboratory and the University of Oklahoma. Work on the paper began in 2015, and Sellers kept working with his collaborators until about six weeks before he died; they carried on the research and writing until the paper’s publication in July 2018.
The carbon cycle refers to the constant flow of carbon between rocks, water, the atmosphere, plants, soil, and fossil fuels. Climate change feedbacks—natural effects that may amplify or diminish the human emissions of greenhouse gases—are one of the most poorly understood aspects of climate science.
Here is how Sellers and colleagues characterized the current state of the carbon cycle in the PNAS article:
“It is quite remarkable and telling that human activity has significantly altered carbon cycling at the planetary scale. The atmospheric concentrations of carbon dioxide (CO2) and methane (CH4) have dramatically exceeded their envelope of the last several million years.”
They also explain in detail how we have altered the carbon cycle:
“The perturbation by humans occurs first and foremost through the transfer of carbon from geological reservoirs (fossil fuels) into the active land–atmosphere–ocean system and, secondarily, through the transfer of biotic carbon from forests, soils, and other terrestrial storage pools (e.g., industrial timber) into the atmosphere.”
Scientists understand the broad outlines of how this works relatively well. What worried Sellers was the potential curveballs the climate might throw at us through unanticipated feedbacks. He and his colleagues addressed some of the challenges in understanding how climate change might affect concentrations of carbon dioxide and methane through feedbacks.
For carbon dioxide:
“While experimental studies consistently show increases in plant growth rates under elevated CO2 (termed carbon dioxide fertilization), the extrapolation of even the largest-scale experiments to ecosystem carbon storage is problematic, and some ecologists have argued that the physiological response could be eliminated entirely by restrictions due to limitation by nutrients or micronutrients. However, there is recent evidence from the atmosphere that suggests increasing CO2 enhances terrestrial carbon storage, leading to the continued increase in land uptake paralleling CO2 concentrations.”
As we detailed in a separate story, the situation is even more complicated for methane. Sellers and his colleagues explained some of the challenges in understanding the feedbacks that affect that potent greenhouse gas this way:
“Atmospheric methane is currently at three times its preindustrial levels, which is clearly driven by anthropogenic emissions, but equally clearly, some of the change is because of carbon-cycle–climate feedbacks. Atmospheric CH4 rose by about 1 percent per year in the 1970s and 1980s, plateaued in the 1990s, and resumed a steady rise after 2006. Why did the plateau occur? These trends in atmospheric methane concentration are not understood. They may be due to changes in climate: increases in temperature, shifts in the precipitation patterns, changes to wetlands, or proliferation in the carbon availability to methane-producing bacteria.”
The consequences of the gaps in understanding could be significant.
“Terrestrial tropical ecosystem feedbacks from the El Nino drove an ∼2-PgC increase in global CO2 emissions in 2015. If emissions excursions such as this become more frequent or persistent in the future, agreed-upon mitigation commitments could become ineffective in meeting climate stabilization targets. Earth system models disagree wildly about the magnitude and frequency of carbon–climate feedback events, and data to this point have been astonishingly ineffective at reducing this uncertainty.”
NASA’s current missions and partnership missions in orbit. Credit: NASA
Sellers and his colleagues do offer a solution. It has much to do with satellites.
“Space-based observations provide the global coverage, spatial and temporal sampling, and suite of carbon cycle observations required to resolve net carbon fluxes into their component fluxes (photosynthesis, respiration, and biomass burning). These space-based data substantially reduce ambiguity about what is happening in the present and enable us to falsify models more effectively than previous datasets could, leading to more informed projections.”
Hydrologist Matt Rodell of NASA’s Goddard Space Flight Center has been living with first-of-its-kind data from the Gravity Recovery and Climate Experiment (GRACE) for 16 years. The data show big changes of mass in specific spots on Earth, primarily the result of the movement of water and ice, but they don’t reveal what causes those changes. That’s where Matt and the GRACE team come in, painstakingly connecting these observed changes to the loss of ice sheets, depleting aquifers, and climate change. It’s a problem they’re still working on, getting closer every day. Matt explains the years-long process in his own words.
Ominous beginning: Garbage data from a new satellite
Six months after GRACE launched in March 2002, we got our first look at the data fields. They had these big vertical, pole-to-pole stripes that obscured everything. We’re like, holy cow this is garbage. All this work and it’s going to be useless.
But it didn’t take the science team long to realize that they could use some pretty common data filters to remove the noise, and after that they were able to clean up the fields and we could see quite a bit more of the signal. We definitely breathed a sigh of relief. Steadily over the course of the mission, the science team became better and better at processing the data, removing errors, and some of the features came into focus. Then it became clear that we could do useful things with it.
And then trends emerged
It only took a couple of years. By 2004, 2005, the science team working on mass changes in the Arctic and Antarctic could see the ice sheet depletion of Greenland and Antarctica. We’d never been able before to get the total mass change of ice being lost. It was always the elevation changes – there’s this much ice, we guess – but this was like wow, this is the real number.
Not long after that, we started to see, maybe, that there were some trends on the land, although it’s a little harder there because terrestrial water storage—the groundwater, soil moisture, snow, and everything—has inter-annual variability. If you go from a drought one year to wet a couple years later, it will look like you’re gaining all this water, but really, it’s just natural variability.
By around 2006, there was a pretty clear trend over Northern India. At the GRACE science team meeting, it turned out another group had noticed that as well. We were friendly with them, so we decided to work on it separately. Our research ended up being published in 2009, a couple years after the trends had started to become apparent. By the time we looked at India, we knew that there were other trends around the world. Slowly not just our team but all sorts of teams, all different scientists around the world, were looking at different apparent trends and diagnosing them and trying to decide if they were real and what was causing them.
A world of big blobs of red and blue
I think the map, the global trends map, is the key. By 2010 we were getting the broad-brush outline, and I wanted to tell a story about what is happening in that map. For me the easiest way was to just look at the data around the continents and talk about the major blobs of red or blue that you see and explain each one of them and not worry about what country it’s in or placing it in a climate region or whatever. We can just draw an outline around these big blobs. Water is being gained or lost. The possible explanations are not that difficult to understand. It’s just trying to figure out which one is right.
Not everywhere you see as red or blue on the map is a real trend. It could be natural variability in part of the cycle where freshwater is increasing or decreasing. But some of the blobs were real trends. If it’s lined up in a place where we know that there’s a lot of agriculture, that they’re using a lot of water for irrigation, there’s a good chance it’s a decreasing trend that’s caused by human-induced groundwater depletion.
And then, there’s the question: are any of the changes related to climate change? There have been predictions of precipitation changes, that they’re going to get more precipitation in the high latitudes and more precipitation as rain as opposed to snow. Sometimes people say that the wet get wetter and the dry get dryer. That’s not always the case, but we’ve been looking for that sort of thing. These are large-scale features that are observed by a relatively new satellite system and we’re lucky enough to be some of the first to try and explain them.
What kept me up at night
The past couple years when I’d been working the most intensely on the map, the best parts of my time in the office were when I was working on it. Because I’m a lab chief, I spend about half my time on managerial and administrative things. But I love being able to do the science, and in particular this, looking at the GRACE data, trying to diagnose what’s happening, has been very enjoyable and fulfilling. We’ve been scrutinizing this map going on eight, nine years now, and I really do have a strong connection to it.
What kept me up at night was finding the right explanations and the evidence to support our hypotheses – or evidence to say that this hypothesis is wrong and we need to consider something else. In some cases, you have a strong feeling you know what’s happening but there’s no published paper or data that supports it. Or maybe there is anecdotal evidence or a map that corroborates what you think but is not enough to quantify it. So being able to come up with defendable explanations is what kept me up at night. I knew the reviewers, rightly, couldn’t let us just go and be completely speculative. We have to back up everything we say.
A tangled mix of answers
The world is a complicated place. I think it helped, in the end, that we categorized these changes as natural variability or as a direct human impact or a climate change related impact. But then there can be a mix of those – any of those three can be combined, and when they’re combined, that’s when it’s more difficult to disentangle them and say this one is dominant or whatever. It’s often not obvious. Because these are moving parts and particularly with the natural variability, you know it’s going to take another 15 years, probably the length of the GRACE Follow-On mission, before we become completely confident about some of these. So it’ll be interesting to return to this in 15 years and see which ones we got right and which ones we got wrong.
NASA Earth Observatory readers may recognize this image of a long trail of clouds — an atmospheric river — reaching across the Pacific Ocean toward California. It appeared first as an Image of the Day about how these moisture superhighways fueled a series of drought-busting rain and snow storms.
More recently, we were pleased to see that image on the cover of the Fourth National Climate Assessment—a major report issued by the U.S. Global Change Research Program. That image was one of many from Earth Observatory that appeared in the report. Since the authors did not give much background about the images, here is a quick rundown of how they were created and how they fit with some of the key points on our changing climate.
Hurricanes in the Atlantic
Found in Chapter 1: Our Globally Changing Climate
What the image shows:
Three hurricanes — Katia, Irma, and Jose — marching across the Atlantic Ocean on September 6, 2017.
What the report says about tropical cyclones and climate change:
The frequency of the most intense hurricanes is projected to increase in the Atlantic and the eastern North Pacific. Sea level rise will increase the frequency and extent of extreme flooding associated with coastal storms, such as hurricanes.
How the image was made:
The Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite collected the data. Earth Observatory staff combined several scenes, taken at different times, to create this composite. Original source of the image: Three Hurricanes in the Atlantic
The North Pole
Found in Chapter 2: Physical Drivers of Climate Change
What the image shows:
Clouds swirl over sea ice, glaciers, and green vegetation in the Northern Hemisphere, as seen on a spring day from an angle of 70 degrees North, 60 degrees East.
What the report says about climate change and the Arctic:
Over the past 50 years, near-surface air temperatures across Alaska and the Arctic have increased at a rate more than twice as fast as the global average. It is very likely that human activities have contributed to observed Arctic warming, sea ice loss, glacier mass loss, and a decline in snow extent in the Northern Hemisphere.
How it was made:
Ocean scientist Norman Kuring of NASA’s Goddard Space Flight Center pieced together this composite based on 15 satellite passes made by VIIRS/Suomi NPP on May 26, 2012. The spacecraft circles the Earth from pole to pole, so it took multiple passes to gather enough data to show an entire hemisphere without gaps. Original source of the image: The View from the Top
Found in Chapter 3: Detection and Attribution of Climate Change
What the image shows:
Columbia Glacier in Alaska, one of the most rapidly changing glaciers in the world.
What the report says about Alaskan glaciers and climate change:
The collective ice mass of all Arctic glaciers has decreased every year since 1984, with significant losses in Alaska.
How the image was made:
NASA Earth Observatory visualizers made this false-color image based on data collected in 1986 by the Thematic Mapper on Landsat 5. The image combines shortwave-infrared, near-infrared, and green portions of the electromagnetic spectrum. With this combination, snow and ice appear bright cyan, vegetation is green, clouds are white or light orange, and open water is dark blue. Exposed bedrock is brown, while rocky debris on the glacier’s surface is gray. Original source of the image: World of Change: Columbia Glacier
Found in: Intro to Chapter 4: Climate Models, Scenarios, and Projections
What the image shows:
Sea ice hugging the Russian coastline and cloud streets streaming over the Bering Sea.
What the report says about clouds and climate change:
Climate feedbacks are the largest source of uncertainty in quantifying climate sensitivity — that is, how much global temperatures will change in response to the addition of more greenhouse gases to the atmosphere.
How it was made:
The Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA’s Terra satellite captured this natural-color image on January 4, 2012. The LANCE/EOSDIS MODIS Rapid Response Team generated the image, and NASA Earth Observatory staff cropped and labeled it. Original source of the image: Cloud streets over the Bering Sea
Found in Intro to Chapter 5: Large-scale circulation and climate variability
What it shows:
Two extratropical cyclones, the cause of most winter storms, churned near each other off the coast of South Africa in 2009.
What the report says about extratropical storms and climate change:
There is uncertainty about future changes in winter extratropical cyclones. Activity is projected to change in complex ways, with increases in some regions and seasons and decreases in others. There has been a trend toward earlier snowmelt and a decrease in snowstorm frequency on the southern margins of snowy areas. Winter storm tracks have shifted northward since 1950 over the Northern Hemisphere.
How the image was made:
The Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA’s Terra satellite captured this natural-color image. The LANCE/EOSDIS MODIS Rapid Response Team generated the image and NASA Earth Observatory staff cropped and labeled it. Original source of the image: Cyclonic Clouds over the South Atlantic Ocean
Sea of Sand
Found in: Chapter 6: Temperature Changes in the United States
What the image shows: Large, linear sand dunes alternating with interdune salt flats in the Rub’ al Khali in the Sultanate of Oman.
What the report says about drought, dust storms, and climate change:
The human effect on droughts is complicated. There is little evidence for a human influence on precipitation deficits, but a lot of evidence for a human fingerprint on surface soil moisture deficits — starting with increased evapotranspiration caused by higher temperatures. Decreases in surface soil moisture over most of the United States are likely as the climate warms. Assuming no change to current water resources management, chronic hydrological drought is increasingly possible by the end of the 21st century. Changes in drought frequency or intensity will also play an important role in the strength and frequency of dust storms.
How it was made: An astronaut on the International Space Station took the photograph with a Nikon D3S digital camera using a 200 millimeter lens on May 16, 2011. Original source of the image: Ar Rub’ al Khali Sand Sea, Arabian Peninsula
Flooding on the Missouri River
Found in Chapter 7: Precipitation Change in the United States
What the image shows:
Sediment-rich flood water lingering on the Missouri River in July 2011.
What the report says about precipitation, floods, and climate change:
Detectable changes in flood frequency have occurred in parts of the United States, with a mix of increases and decreases in different regions. Extreme precipitation, one of the controlling factors in flood statistics, is observed to have generally increased and is projected to continue to do so. However, scientists have not yet established a significant connection between increased river flooding and human-induced climate change.
How the image was made:
The Advanced Land Imager (ALI) on NASA’s Earth Observing-1 (EO-1) satellite captured the data for this natural-color image. NASA Earth Observatory staff processed, cropped, and labeled the image. Original source of the image: Flooding near Hamburg, Iowa
Smoke and Fire
Found in Chapter 8: Droughts, Floods, and Wildfires
What the image shows: Smoke streaming from the Freeway fire in the Los Angeles metro area on November 16, 2008.
What the report says about wildfires and climate change: The incidence of large forest fires in the western United States and Alaska has increased since the early 1980s and is projected to further increase as the climate warms, with profound changes to certain ecosystems. However, other factors related to climate change — such as water scarcity or insect infestations — may act to stifle future forest fire activity by reducing growth or otherwise killing trees.
How it was made: The MODIS Rapid Response Team made this image based on data collected by NASA’s Aqua satellite. Original source of the image: Fires in California
The Colorado River and Grand Canyon
Found in Chapter 10: Changes in Land Cover and Terrestrial Biogeochemistry
What the image shows:
The Grand Canyon in northern Arizona.
What the report says about climate change and the Colorado River: The southwestern United States is projected to experience significant decreases in surface water availability, leading to runoff decreases in California, Nevada, Texas, and the Colorado River headwaters, even in the near term. Several studies focused on the Colorado River basin showed that annual runoff reductions in a warmer western U.S. climate occur through a combination of evapotranspiration increases and precipitation decreases, with the overall reduction in river flow exacerbated by human demands on the water supply.
How the image was made:
On July 14, 2011, the ASTER sensor on NASA’s Terra spacecraft collected the data used in this 3D image. NASA Earth Observatory staff made the image by draping an ASTER image over a digital elevation model produced from ASTER stereo data. Original source of the image: Grand New View of the Grand Canyon
Arctic Sea Ice
Found in Chapter 11: Arctic Changes and their Effects on Alaska and the Rest of the United States
What the image shows: A clear view of the Arctic in June 2010. Clouds swirl over sea ice, snow, and forests in the far north.
What the report says about sea ice and climate change: Since the early 1980s, annual average Arctic sea ice extent has decreased between 3.5 percent and 4.1 percent per decade, and the ice has become 4.3 to 7.5 feet (1.3 to 2.3 meters) thinner. The ice also melts for at least 15 more days each year. Arctic-wide ice loss is expected to continue through the 21st century, very likely resulting in nearly sea-ice-free late summers by the 2040s.
How it was made: Earth Observatory staff used data from several MODIS passes from NASA’s Aqua satellite to make this mosaic. All of the data were collected on June 28, 2010. Original source of the image: Sunny Skies Over the Arctic
Crack in the Larsen C Ice Shelf
Found in Chapter 12: Sea Level Rise
What the image shows:
This photograph shows a rift in the Larsen C Ice Shelf as observed from NASA’s DC-8 research aircraft. An iceberg the size of Delaware broke off from the ice shelf in 2017.
What the report says about ice shelves in Antarctica and climate change:
Floating ice shelves around Antarctica are losing mass at an accelerating rate. Mass loss from floating ice shelves does not directly affect global mean sea level — because that ice is already in the water — but it does lead to the faster flow of land ice into the ocean.
How it was made:
NASA scientist John Sonntag took the photo on November 10, 2016, during an Operation IceBridge flight. Original source of the image: Crack on Larsen C
The Gulf of Mexico
Found in Chapter 13: Ocean Acidification and Other Changes
What the image shows:
Suspended sediment in shallow coastal waters in the Gulf of Mexico near Louisiana.
What the report says about the Gulf of Mexico:
The western Gulf of Mexico and parts of the U.S. Atlantic Coast (south of New York) are currently experiencing significant sea level rise caused by the withdrawal of groundwater and fossil fuels. Continuation of these practices will further amplify sea level rise.
How the image was made:
The MODIS instrument on NASA’s Aqua satellite captured this natural-color image on November 10, 2009. Original source of the image: Sediment in the Gulf of Mexico
What the image shows: A fall scene showing farmland in the Page Valley of Virginia, between Shenandoah National Park and Massanutten Mountain.
What the report says about farming and climate change: Since 1901, the number of consecutive frost-free days and the length of the growing season have increased for the seven contiguous U.S. regions used in this assessment. There is important variability at smaller scales, however, with some locations actually showing decreases of a few days to as much as one to two weeks. Meanwhile, plant productivity has not increased, and the future consequences of the longer growing season are uncertain.
How the image was made: On October 21, 2013, the Operational Land Imager (OLI) on Landsat 8 captured a natural-color image of these neighboring ridges. The Landsat image has been draped over a digital elevation model based on data from the ASTER sensor on the Terra satellite. Original source of the image: Contrasting Ridges in Virginia
What the image shows: A tight arc of clouds stretching from Hawaii to California, which is a visible manifestation of an atmospheric river of moisture flowing into western states.
What the report says about atmospheric rivers and climate change:
The frequency and severity of land-falling atmospheric rivers on the U.S. West Coast will increase as a result of increasing evaporation and the higher atmospheric water vapor content that occurs with increasing temperature. Atmospheric rivers are narrow streams of moisture that account for 30 to 40 percent of the typical snow pack and annual precipitation along the Pacific Coast and are associated with severe flooding events.
How it was made: On February 20, 2017, the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite captured this natural-color image of conditions over the northeastern Pacific. NASA Earth Observatory data visualizers stitched together two scenes to make the image. Original source of the image: River in the Sky Keeps Flowing Over the West
A June snowstorm just topped off the already thick layer of white stuff atop the Sierra Nevada. California’s snow water equivalent rose to a heaping 170 percent of normal. But not so long ago, the state was in the midst of a deep drought; its mountains were bare and brown, and water levels plummeted in reservoirs.
Throughout, satellites were watching. Check out the California drought and its aftermath in a video from NASA Earth Observatory:
Haze over northeastern China on January 14, 2013. Image by NASA Earth Observatory, using Terra MODIS data from LANCE MODIS Rapid Response.
In the winter of 2013, thick haze enveloped northern China for several weeks. On January 12, 2013, the peak of that bad-air episode, the air quality index (AQI) rose to a staggering 775—off the U.S. Environmental Protection Agency scale—according to a U.S. air quality sensor in Beijing.
Extra pollution from cars, homes, and factories in the winter often sets the stage for outbreaks of air pollution in China. But a March 2017 study in Science Advances suggests that a loss of Arctic sea ice in 2012 and increased Eurasian snowfall the winter before may have helped fuel the extreme event.
Snow and ice cover can affect weather patterns because both affect albedo, the fraction of incoming solar radiation that a surface reflects. In September 2012, Arctic sea ice covered less area than at any other time since satellite records began in 1979. Meanwhile, Eurasia had unusually high snow cover in December 2012, the second-highest in a record dating back to 1967.
Normally, winds blow air pollution away from eastern China, which is home to Beijing and several other large cities. But in January 2013, winds died down to a whisper and air pollution piled up. By analyzing decades of data collected by ground-based weather stations, 15 years of satellite data on aerosols, and computer simulations of the atmosphere, the researchers concluded that unusual sea ice and snow conditions triggered a shift in China’s winter monsoon, stilling the winds that normally ventilate Beijing.
A press release from Georgia Tech explained the connection in more detail:
“The reductions in sea ice and increases in snowfall have the effect of damping the climatological pressure ridge structure over China,” explained Yuhang Wang. “That flattens the temperature and pressure gradients and moves the East Asian Winter Monsoon to the east, decreasing wind speeds and creating an atmospheric circulation that makes the air in China more stagnant.”
If correct, this might explain why efforts to reduce air pollution in recent years have not stopped extreme haze events from happening. “Emissions in China have been decreasing over the last four years, but the severe winter haze is not getting better,” said Wang. “Mostly, that’s because of a very rapid change in the high polar regions.”
This is not the first study that connects changes in the Arctic to severe haze in China. Research published in August 2015 in Atmospheric and Oceanic Science Letters argued that a decline in Arctic sea ice intensifies haze in eastern China. And a study published in Nature Climate Change in April 2017 came to a similar conclusion. The latter study projected a 50 percent increase in the frequency of extreme haze events and an 80 percent increase in their persistence in the near future.
In 2012, Arctic sea ice extent was unusually low in September. New research suggests that may have contributed to a bad haze outbreak in eastern China the next winter. (NASA Earth Observatory graph by Joshua Stevens, based on data from the National Snow and Ice Data Center.)
Decades or even centuries from now, when enough time has passed for historians to write definitive accounts of global warming and climate change, two names are likely to make it into history books: Syukuro Manabe and Richard Wetherald.
In the late 1960s, these two scientists from the Geophysical Fluid Dynamics Laboratory developed one of the very first climate models. In 1967, they published results showing that global temperatures would increase by 2.0 degrees Celsius (3.6 degrees Fahrenheit) if the carbon dioxide content of the atmosphere doubled.
The top map shows temperature changes in one of Manabe’s coupled atmosphere–ocean models. It depicts what would happen by the 70th year of a global warming experiment in which atmospheric concentrations of carbon dioxide are doubled. The model’s predictions match closely with levels of warming observed in the real world. The lower map shows the observed change between a 1961–1990 base period and 1991–2015, derived from the HadCRUT4 historical surface temperature dataset. Figure published in Stouffer & Manabe, 2017.
As Ethan Siegel pointed out in Forbes, carbon dioxide content has risen by roughly 50 percent since the pre-industrial era. Observed temperatures, meanwhile, have increased by 1°C (1.8°F). For a pair of scientists working in a time when computer instructions were compiled on printed punch cards and processing was thousands of times slower than today, they created a remarkably accurate model. That is not to say that Manabe claims his 1967 model is perfect. In fact, he is quick to point out that there are some aspects that his and other climate models still get wrong. In an interview with CarbonBrief, he put it this way:
Models have been very effective in predicting climate change, but have not been as effective in predicting its impact on ecosystem[s] and human society. The distinction between the two has not been stated clearly. For this reason, major effort should be made to monitor globally not only climate change, but also its impact on ecosystem[s] through remote sensing from satellites as well as in-situ observation.
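As a rough back-of-the-envelope check (my own, not from the article): if warming scales with the logarithm of CO2 concentration, then Manabe and Wetherald’s sensitivity of 2.0°C per doubling, applied to the roughly 50 percent rise since pre-industrial times, implies a bit over 1°C of warming, close to what has been observed. A minimal sketch of that arithmetic:

```python
import math

# Manabe & Wetherald (1967): ~2.0 degrees C of warming per doubling of CO2.
SENSITIVITY_PER_DOUBLING = 2.0

def warming_for_co2_increase(ratio, sensitivity=SENSITIVITY_PER_DOUBLING):
    """Implied warming (degrees C) for a given CO2 concentration ratio,
    assuming warming scales with the logarithm of CO2."""
    return sensitivity * math.log(ratio) / math.log(2)

# CO2 is up roughly 50 percent since the pre-industrial era (ratio = 1.5).
print(round(warming_for_co2_increase(1.5), 2))  # ~1.17 degrees C
```

The logarithmic scaling is a standard simplification of CO2 radiative forcing, not something the article itself asserts; the point is only that the 1967 sensitivity is broadly consistent with the observed 1°C.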
This line plot shows yearly temperature anomalies from 1880 to 2014 as recorded by NASA, NOAA, the Japan Meteorological Agency, and the Met Office Hadley Centre (United Kingdom). Though there are minor variations from year to year, all four records show peaks and valleys in sync with each other. All show rapid warming in the past few decades, and all show the last decade as the warmest. Read more about this chart.
At the time of his death on December 23, 2016, Piers Sellers was the deputy director of the Sciences and Exploration Directorate and the acting director of the Earth science division at NASA’s Goddard Space Flight Center. But he was a lot more than that to his colleagues and to the world. NASA science writer Patrick Lynch (and occasional Earth Observatory contributor) had the unenviable task of trying to capture the essence of Sellers:
“As an astronaut, he helped build the International Space Station. As a manager, he helped lead hundreds of scientists. And as a public figure he was an inspiration to many for his optimistic take on humanity’s ability to confront Earth’s changing climate. But his most lasting contributions will be in the field where he began his career: science.”
Piers came to NASA Goddard from Britain in 1982 as a research meteorologist and climate scientist. His focus was the interaction between the biosphere — the living, breathing plant-life of Earth — and the atmosphere. He helped develop models and wrote several papers that are still widely cited in the field. But he also had another lifelong dream: to become an astronaut. He applied to the astronaut training program in the 1990s, and worked through rigorous screening and training to go into space. He flew to the International Space Station in 2002, 2006, and 2010, participating in six spacewalks and helping with assembly of the station. Upon retiring from the astronaut corps, he came back to Goddard and resumed his place as a leader in Earth science, while also promoting conversations and collaborations with researchers studying planetary science and hunting for life beyond our solar system.
I did not have the chance to get to know Piers well. He was someone I mostly watched from afar and our interactions were sporadic, though always interesting, dignified, and thoughtful. I came to know him mostly through his words — to the media and to my fellow scientific and communications staff of NASA Goddard — and in the ways he inspired people. The more I read, the more I wish I had been able to spend more time with him.
In January 2016, one year ago this week, he wrote a poignant op-ed in The New York Times. The words were a compelling prelude to his final year with us.
I’m a climate scientist who has just been told I have Stage 4 pancreatic cancer. This diagnosis puts me in an interesting position. I’ve spent much of my professional life thinking about the science of climate change, which is best viewed through a multi-decadal lens. At some level I was sure that, even at my present age of 60, I would live to see the most critical part of the problem, and its possible solutions, play out in my lifetime. Now that my personal horizon has been steeply foreshortened, I was forced to decide how to spend my remaining time. Was continuing to think about climate change worth the bother?
In the summer of 2016, Sellers wrote another compelling piece, this time in The New Yorker. In “Space, Climate Change, and the Real Meaning of Theory,” he took on a very sensitive and fundamental facet of science: the accumulation of evidence and observation that leads to truth. Here is my favorite passage:
When we talk about why the climate has changed, and what the future climate is likely to be, scientists use analyses and predictions that rest heavily on results from computer models, which in turn rest on layers and layers of theory. And there’s the rub—a lot of the confusion about what is known and unknown about the changing climate can be traced to people’s understanding of the role of theory in science.
Fundamentally, a theory in science is not just a whim or an opinion; it is a logical construct of how we think something works, generally agreed upon by scientists and always in agreement with the available observations. A good example is Isaac Newton’s theory of gravitation, which says that every physical object in the universe exerts a gravity force field around itself, with the strength of that field depending on its mass. The theory—one simple equation—does a superb job of explaining our observations of how planets orbit around the sun, and was more than good enough to make the calculations we needed to send spacecraft to the moon and elsewhere. Einstein improved on Newton’s theory when it comes to large-scale astronomical phenomena, but, for everyday engineering use, Newton’s physics works perfectly well, even though it is more than three hundred years old.
…Engineering theory, based on Newton’s work, is so accepted and reliable that we can get it right the first time, almost every time. The theory of aerodynamics is another perfect example: the Boeing 747 jumbo-jet prototype flew the first time it took to the air—that’s confidence for you. So every time you get on a commercial aircraft, you are entrusting your life to a set of equations, albeit supported by a lot of experimental observations. A jetliner is just aluminum wrapped around a theory.
On camera, it is easy to pick up the energy, humor, and dignity of the man. In the past year, he was a frequent interview subject for the television and radio media. He also made an appearance in Leonardo DiCaprio’s documentary Before the Flood. Some of us think Piers stole the show.
But I am most fond of this simple video, posted last month to YouTube. It’s a conversation between Piers and Compton Tucker, one of his best friends, his next-door neighbor, and a fellow NASA scientist. So many people have stilted and distant impressions about scientists, and Hollywood caricatures don’t help. I like this video because it shows bright people having fun, being human, and savoring life, learning, and friendship.
At the conclusion of his January 2016 piece in The New York Times, Piers offers a thought that inspires us to keep up the good work.
As for me, I’ve no complaints. I’m very grateful for the experiences I’ve had on this planet. As an astronaut I spacewalked 220 miles above the Earth. Floating alongside the International Space Station, I watched hurricanes cartwheel across oceans, the Amazon snake its way to the sea through a brilliant green carpet of forest, and gigantic nighttime thunderstorms flash and flare for hundreds of miles along the Equator. From this God’s-eye-view, I saw how fragile and infinitely precious the Earth is. I’m hopeful for its future.
View of the dais during the High-Level Segment Ministerial Round Table ‘Towards an Agreement on the HFC Amendment under the Montreal Protocol – Part 2: Ensuring Benefits for All.’ October 14, 2016 (day 6). Photo by IISD/ENB | Kiara Worth
Now the scientists and negotiators behind the Protocol are taking on a new problem: the climate warming effects of chemicals that were supposed to be better for the ozone layer. NASA scientist Paul Newman attended the Montreal Protocol’s international meeting this October in Kigali, Rwanda, and he sat down with us to explain the new agreement, why it’s unique, and what it was like to participate in the meeting. Here are some of the main takeaways:
The Montreal Protocol is not just about ozone depleting substances; it’s also about their replacements.
“The Montreal Protocol is written so that we can control ozone-depleting substances and their replacements. Chlorofluorocarbons (CFCs) were initially replaced with hydrochlorofluorocarbons (HCFCs) and then hydrofluorocarbons (HFCs), making HFCs the so-called “grandchildren” of the Montreal Protocol.
HFCs very weakly affect the ozone layer. But the problem is that they are powerful greenhouse gases. One can of this keyboard cleaner [an aerosol can with HFC-134a] is equivalent to 1,360 cans of carbon dioxide. HFC-134a is much more powerful than carbon dioxide as a greenhouse gas, and that’s true of many HFCs, not just HFC-134a. And their use—particularly in refrigeration and air conditioning—has been going up fast.
It has been projected that by 2100, the effect of HFCs on temperature could be as high as 0.5 Kelvin (0.5 degrees Celsius, or 0.9 degrees Fahrenheit) if nothing were done. Because of the amendment, that number will be closer to 0.06 Kelvin (0.06 degrees Celsius, or 0.11 degrees Fahrenheit).
The point of the Kigali amendment was to control these greenhouse gases because they are a replacement for CFCs. They are adding to the climate problem, so the world’s nations wanted to do something about it. The Montreal Protocol has evolved from strictly an ozone treaty to an ozone and climate treaty.”
The amendment looks into the future because the accumulation and removal of HFCs in the atmosphere takes a while.
“HFCs go through a series of steps before they can begin accumulating in the atmosphere. The first is production, in which factories make tanks of the gas (like a brewery making big vats of beer). The next step is consumption, when HFCs are added to things like refrigerators and air conditioners (to follow the beer analogy, that’s when the brewer puts the beer in a bottle or keg). Finally, HFCs are emitted when people use those things (pop the cork on the bottle or tap the keg).
The important point is that there is a time lag. Consumption of HFCs is projected to peak in the late 2020s, but emissions don’t peak until about 2035. Once in the atmosphere, HFCs last a long time before being destroyed by chemical reactions. For example, if I vent my can of HFC-134a, 5 percent of it will still be in the atmosphere after 42 years. So HFCs continue to accumulate, peaking in the atmosphere by the mid-2050s.”
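The 42-year figure is consistent with simple first-order (exponential) decay. Assuming an atmospheric lifetime of about 14 years for HFC-134a (a value not stated in the interview), 42 years is three lifetimes, and e⁻³ is about 5 percent. A minimal sketch, under that assumption:

```python
import math

# Assumed atmospheric lifetime of HFC-134a (~14 years);
# the interview does not state this number.
LIFETIME_YEARS = 14.0

def fraction_remaining(years, lifetime=LIFETIME_YEARS):
    """Fraction of a released gas still airborne after `years`,
    assuming simple first-order (exponential) decay."""
    return math.exp(-years / lifetime)

# After 42 years (three lifetimes), about 5 percent remains.
print(round(fraction_remaining(42.0), 3))  # ~0.05
```

The same one-line model explains why atmospheric concentrations keep rising for two decades after emissions peak: removal is slow compared with the pace of new emissions.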
“Montreal Protocol meetings don’t have a schedule; they have an agenda. That means that we have a list of topics and we work until they are all addressed. For example, we worked all day Friday (October 14, 2016) but didn’t finish, so we reconvened Saturday at 1 a.m. and finished the amendment at 7 a.m. I was up for 27 straight hours, tired and hungry.”
… but uniquely effective.
“This agreement is a huge step forward because it is essentially the first real climate mitigation treaty that has bite to it. It has strict obligations for bringing down HFCs, and is forcing scientists and engineers to look for alternatives.
The Montreal Protocol is also technically the perfect mechanism for dealing with these issues. The technology people, economics people, science people, chemical manufacturers—they have all worked through the Montreal Protocol and are fully capable of dealing with refrigerants like HFCs and their alternatives.
The agreement wouldn’t go forward without a pretty good idea about what those replacements might be. Hydrofluoroolefins, for example, have a really tiny climate impact and lifetimes of only about 10 days, and they are already being used in some applications. The Montreal Protocol is pro-engineering, pro-technology, and very different from any other treaty. We can solve our environmental problems—that is the power of a technological society.”