
Earth Matters

The New UN Climate Report in One Sentence

October 19th, 2018 by Adam Voiland


In October 2018, the Intergovernmental Panel on Climate Change (IPCC) released yet another sobering report about the planetary disruption happening because of all the carbon human activity puts into the atmosphere.

Many parts of the world are already seeing rising sea levels, hotter temperatures, more extreme precipitation and droughts, more acidic oceans, and faster rates of extinctions, the scientists said. And without dramatic reductions in carbon emissions to keep warming below 1.5 degrees Celsius, the problems are going to get far worse.

In just one sentence, Gavin Schmidt, director of NASA’s Goddard Institute for Space Studies, captured the essence of the report. “The key thing to remember is that it’s clear that the best time to have reduced emissions was 25 years ago,” he said during an interview with PBS News Hour. “But the second best time to reduce emissions is right now.”

Like any good scientist, Schmidt is always quick to give credit where credit is due. In this case, he noted that what he said on television was a riff on something that renowned Kenyan marathoner Eliud Kipchoge once said. “The best time to plant a tree was 25 years ago. The second-best time to plant a tree is today,” The New York Times quoted the marathon world record holder as saying.

If you are wondering what the IPCC authors based their findings on, there is no shortage of information to explore. Each chapter of the report has a supplementary information section with dozens of references that detail the evidence the scientists used to draw their conclusions.

For more details, the IPCC also put together a succinct summary for policy makers, a press release, headline statements, and FAQs. Also, below are a few other good resources if you’re looking to understand some of the basics about climate change.

+NASA’s Global Climate Change Evidence Page
+NASA Earth Observatory’s Global Temperatures World of Change
+NOAA Climate Education Resources
+Skeptical Science Climate Change Myths


According to an ongoing temperature analysis conducted by scientists at NASA’s Goddard Institute for Space Studies, the average global temperature on Earth has increased by about 0.8° Celsius (1.4° Fahrenheit) since 1880. Two-thirds of the warming has occurred since 1975, at a rate of roughly 0.15-0.20°C per decade. Read more about this map here.


If any satellite deserves an award for longevity, it’s Terra. Designed for a mission of 6 years (or 30,000 orbits), the bus-sized spacecraft continues to cruise 705 kilometers (438 miles) above Earth’s surface nearly 19 years after launch. The spacecraft officially surpassed 100,000 orbits on October 6, 2018. To celebrate, here are ten things to know about the intrepid Earth-observing satellite. Click on each image to find out more.

1. Terra had to be designed from scratch. Unlike many of the smaller satellites that preceded it, Terra could not be based on an existing design.


2. The bus-sized spacecraft carries five scientific sensors — MODIS, MOPITT, MISR, ASTER, and CERES. All of them continue to send back useful data.


3. The MODIS sensor captures stunning images of hurricanes, wildfires, volcanoes, dust storms, oil spills, and other hazards.


4. Using MOPITT, atmospheric scientists have tracked global trends in carbon monoxide for nearly two decades. The good news: concentrations of the toxic air pollutant are declining.


5. Likewise, they have used the CERES sensor to measure whether Earth’s reflectivity—or albedo—has changed. Despite some fluctuations, there does not appear to be a trend.


6. The MISR sensor can detect the height of volcanic plumes, smoke plumes, dust plumes, and other aerosols. This is key to understanding where plumes will go and whether they will pose a threat to people on the ground.


7. Terra orbits 705 kilometers (438 miles) above the surface, about the distance between Boston, MA, and Washington, D.C.


8. A widely used global digital elevation map is based on data collected by ASTER.


9. Experts use Terra MODIS observations of NDVI — a measure of the greenness of plants — to help anticipate food shortages.


10. While in space, Terra has traveled the equivalent of 2.5 billion miles — nearly the distance to Neptune.

First Photons from ICESat-2

October 5th, 2018 by Kathryn Hansen

“Pho,” a cartoon character representing a photon of light from ICESat-2, illustrates how the science mission works. Watch the video here. Credit: NASA/Goddard/Savannah College of Art and Design et al.

The successful launch of a satellite is an exciting step for the scientists and engineers who have spent years dedicated to a mission. But there are still many more boxes to be checked, and anticipation builds as a satellite’s instruments are turned on and they produce what scientists call “first light” — the first time a satellite opens its “eyes” and delivers preliminary images or data.

After the successful launch of NASA’s Ice, Cloud, and land Elevation Satellite-2 (ICESat-2) on September 15, 2018, “first light” was not a natural-color image of Earth like those that come from satellites such as Terra, Aqua, or Landsat. Rather, the satellite’s sole instrument—the Advanced Topographic Laser Altimeter System (ATLAS)—acquired measurements of surface elevation. The laser fired for the first time on September 30, and returned its first height measurements from across the Antarctic ice sheet on October 3.

This visualization of ICESat-2 data shows the first set of height measurements from the satellite as it orbited over the Antarctic ice sheet. Credit: NASA’s Goddard Space Flight Center

This elevation measurement, acquired on October 3, shows the height of the Antarctic ice sheet along a path starting in East Antarctica, passing close to the South Pole, and ending in West Antarctica. As explained in a NASA story about the measurement:

“When scientists analyze the preliminary ICESat-2 data, they examine what is called a ‘photon cloud,’ or a plot of each photon that ATLAS detects. Many of the points on a photon cloud are from background photons — natural sunlight reflected off Earth in the exact same wavelength as the laser photons. But with the help of computer programs that analyze the data, scientists can extract the signal from the noise and identify the height of the ground below.”
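The signal-extraction idea in the quoted passage can be sketched in a few lines. This is an illustrative toy using simulated numbers, not the operational ATLAS processing: the surface shows up as a dense band of photon heights against a thin background of solar noise, so a simple histogram peak already finds it.

```python
import numpy as np

# Illustrative sketch (not the real ICESat-2 algorithm): separate signal
# photons from background noise in a simulated "photon cloud".
rng = np.random.default_rng(42)

# Background photons (sunlight at the laser wavelength) are spread uniformly
# over a 500 m window; surface echoes cluster tightly around 120 m.
noise = rng.uniform(0, 500, size=2000)
signal = rng.normal(120, 0.5, size=400)
heights = np.concatenate([noise, signal])

# Histogram the heights; the surface appears as the densest bin.
counts, edges = np.histogram(heights, bins=100)
peak = np.argmax(counts)
surface_height = 0.5 * (edges[peak] + edges[peak + 1])
print(surface_height)  # close to the simulated 120 m surface
```

The real processing is far more sophisticated (it tracks a moving surface along-track and handles clouds and slopes), but the noise-versus-signal density contrast it exploits is the same.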

These charts show the first photon returns from the instrument’s six beams as the satellite orbited over Antarctica. The green lines represent the number of photons detected. The X axis indicates the amount of time it took the photons to get from the satellite to the ground and back again. Credit: Megan Bock/NASA’s Goddard Space Flight Center.

Donya Douglas-Bradshaw, the project manager for the ATLAS instrument, said in the NASA story:

“We were all waiting with bated breath for the lasers to turn on and to see those first photons return. Seeing everything work together in concert is incredibly exciting. There are a lot of moving parts and this is the demonstration that it’s all working together.”

New Tools to Boost Access to NASA Earth Science Data

October 1st, 2018 by Mike Carlowicz

NASA has funded five new projects to develop tools and technology to make the agency’s massive Earth science datasets more accessible and user-friendly.

Wake up. Turn on laptop. Start processing airborne data of the Adirondack forests in New York. Make coffee. Eat breakfast. Fasten the open laptop’s seatbelt in the passenger seat as it continues to crunch numbers. Drive to work.

NASA Earth science datasets provide different perspectives and information on our planet, as seen here in this data visualization of observations of Hurricane Matthew in October 2016. Credits: NASA’s Scientific Visualization Studio

That used to be Sara Lubkin’s morning routine as an early career scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. Once at work, she would use her desktop computer, while her laptop diligently spent the next 12 hours processing airborne instrument data for the relevant information she needed to study invasive pests of hemlock trees.

“I’m not a computer scientist, I’m an Earth scientist,” said Lubkin, who now works as a program officer for NASA Earth Science Data Systems’ Advancing Collaborative Connections for Earth Systems Science, or ACCESS program. But her experience as a researcher is not unique.

Spending large chunks of time simply getting Earth science data into a usable form for analysis is a common situation for researchers working with the big datasets that come from NASA field, airborne and satellite missions. Downloading huge files, converting data formats, locating the same study areas in multiple datasets, writing code to distinguish different land types in a satellite image – these types of tasks eat into time scientists would rather be using to analyze the actual information in the data.

That’s where the ACCESS program comes in. Part of the Earth Science Data Systems division since 2005, ACCESS finds innovative ways to streamline that cumbersome processing time. The program funds two-year research projects to improve behind-the-scenes data management and provide ready-to-use datasets and services to scientists, Lubkin said.

Sara Lubkin worked with NASA’s big data sets studying invasive pests in Adirondack hemlock trees as part of NASA’s Applied Sciences DEVELOP program, which addresses environmental and public policy issues through interdisciplinary research projects that apply the lens of NASA Earth observations to community concerns around the globe. Credit: Sara Lubkin

In June, NASA selected five teams of NASA, university, and commercial computer science researchers from the 2017 round of submissions. Their projects will use machine learning, cloud computing, and advanced search capabilities to develop tools that improve the behind-the-scenes management of selected NASA datasets.

“We continually invest in development and evaluation of the newest technologies to improve science data systems,” said Kevin Murphy, program executive for NASA’s Earth Science Data Systems at NASA Headquarters in Washington. But more than that, they want to make sure that the tools and technology help real scientists address real problems.

Each ACCESS project has Earth scientists and computer scientists involved from beginning to end, Murphy said. “With the ACCESS program, we’re really trying to understand, for example, how ocean currents work, but we’re trying to do that now with data that’s so large that we need a team of experts who can work together to solve the big science and big data questions.”

The projects will complement data management, distribution and other services provided by the Earth Observing System Data and Information System (EOSDIS), which manages and stores NASA data collected from Earth-observing satellites, aircraft and field campaigns. EOSDIS has 12 interconnected data and archive centers located across the United States, which are organized by discipline. Currently, these centers host 26 petabytes of Earth datasets – that’s 26 million gigabytes, or enough data to need 52,000 computers each with 500 gigabytes of storage space. That number is expected to grow to 150 petabytes within five years with the launch of new satellites.
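The storage comparison above is simple arithmetic, shown here for concreteness (using decimal units, in which 1 petabyte is 1,000,000 gigabytes):

```python
# Reproducing the EOSDIS storage arithmetic quoted above.
petabytes = 26
gigabytes = petabytes * 1_000_000   # 1 PB = 1,000,000 GB in decimal units
computers = gigabytes // 500        # machines with 500 GB of storage each
print(gigabytes)   # 26000000
print(computers)   # 52000
```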

“Satellite data is big data,” said Jeff Walter, one of the ACCESS 2017 principal investigators and lead engineer for Science Data Services at the Atmospheric Science Data Center at NASA’s Langley Research Center in Hampton, Virginia. “It’s very complex and sometimes difficult to use, even for expert users. In addition to the volume, which makes it difficult for users to acquire, store and manage, there’s also the complexity of both the format and content. Users often have to spend a lot of time understanding how the data is organized and what the various parameters represent.”

Walter’s project is one of three that will use cloud computing to alleviate download and storage issues for users. Starting with two atmospheric datasets, his team will also be developing a way to convert satellite data formats into those that can be read by commercial geographic information system (GIS) software.

“Our project aims to lower the barrier to entry for a potential new user community who might find novel ways to use this data, and who are more familiar with GIS types of tools,” Walter said.

The two other cloud computing projects will be developing open source processing and analysis tools, including one designed for ocean datasets. A fourth project will use machine learning to detect changes over time in land observations, starting with the detection of landslides, floods and uplift caused by volcanic activity. The fifth project will develop an automated method for lining up datasets that observe the same location so researchers can combine more than one type of information about a place.

NASA has 26 Earth-observing satellites monitoring the vital signs of our home planet. Along with airborne and ground Earth science missions, their data is stored and managed by the Earth Observing System Data and Information System. Credit: NASA

Upon completion, the ACCESS researchers will work closely with EOSDIS teams to incorporate their advancements into the data centers’ day-to-day operations. Once those new tools are in place, that’s when the real power of open and freely available Earth science datasets can flourish, according to Murphy. Easy-to-use data means it gets into the hands of decision-makers, non-governmental organizations, scientists studying related applications and researchers in different fields that may have new uses for it.

“When you make these products open and accessible, you have a lot of unintended, good scientific consequences,” Murphy said, citing examples that include detecting groundwater movement from space, rapid wildfire detection and using night lights to study human energy use. “NASA has a lot of very valuable information, and the ACCESS program really tries to help scientists to not only address primary science questions but also help us understand our environment and plan for our future.”

To learn more about ACCESS, visit

To learn more about NASA’s Earth Science Data Systems, visit

Walking on Venezuela’s Last Glacier

September 27th, 2018 by Kathryn Hansen

The retreat of Humboldt Glacier—Venezuela’s last patch of perennial ice—means that the country could soon be glacier-free. We featured the glacier in August 2018 as an Image of the Day showing how it changed between 1988 and 2015.

Satellite images can tell you a lot about a glacier, but direct measurements by people on the ground provide a unique, important perspective, especially for glaciers as small as Humboldt. Carsten Braun, a scientist at Westfield State University, last surveyed the glacier in 2015. He talked about what it was like to stand on Venezuela’s last glacier.

The Operational Land Imager (OLI) on Landsat 8 acquired this natural-color image of the glacier on January 6, 2015.

These photographs show the ground-based view of Humboldt Glacier in 2015. Photos by Carsten Braun.

What things were you measuring during the 2015 survey?

This was a very ‘low-tech’ trip: just me and a guide. The approach to the glacier takes two days of very rough hiking with big packs. That gets you to base camp at Laguna Verde below the glacier.

To get to the glacier takes another three hours on rough terrain. The route is definitely popular with climbers, who cross the Humboldt Glacier and then summit Pico Humboldt.

I did the same thing in 2009 and 2011: I walked around the glacier right on the margin with a simple GPS receiver to make a map of it. That sounds a lot easier than it is. Walking right on the edge of a glacier can be difficult and dangerous. This was definitely both!

What was the ice like? Parts of it look like snow or slush.

This glacier is a little different from what you may have experienced. It is tiny and does not have an accumulation area. The surface is 100 percent ice everywhere, just covered in some parts by wet seasonal snow that will melt away. Basically, that means the Humboldt Glacier has no way to ‘add’ mass (‘eating’) and continuously loses mass everywhere (‘fasting’). Obviously, that’s not sustainable.


Photos by Carsten Braun.

Do you remember what you were thinking while hiking on the country’s last glacier?

I was definitely considering the impacts of losing this glacier. It has little ‘practical use’ today, as it is so small and pretty much irrelevant for water supply. Its disappearance would not impact water resources much, if at all. That’s much in contrast with countries like Peru and Bolivia, where glacier recession already creates huge problems for water resources, hydro-power, etc.

The impact in Venezuela is more at a spiritual level. The mountain chain was named Sierra Nevada de Mérida (snowy mountain range of Mérida) because of its glacier cover. Now it will soon be gone and may never come back. (Well, that’s up to us humans to decide.) And with that, the reality of these mountains will change. The lack of glaciers will be the ‘new normal.’ It’s a little bit like losing a species: once it’s gone, you never realize that it is missing.


September Puzzler

September 26th, 2018 by Kathryn Hansen

Every month on Earth Matters, we offer a puzzling satellite image. The September 2018 puzzler is above. Your challenge is to use the comments section to tell us what we are looking at and why this place is interesting.

How to answer. You can use a few words or several paragraphs. You might simply tell us the location. Or you can dig deeper and explain what satellite and instrument produced the image, what spectral bands were used to create it, or what is compelling about some obscure feature in the image. If you think something is interesting or noteworthy, tell us about it.

The prize. We can’t offer prize money or a trip to Mars, but we can promise you credit and glory. Well, maybe just credit. Roughly one week after a puzzler image appears on this blog, we will post an annotated and captioned version as our Image of the Day. After we post the answer, we will acknowledge the first person to correctly identify the image at the bottom of this blog post. We also may recognize readers who offer the most interesting tidbits of information about the geological, meteorological, or human processes that have shaped the landscape. Please include your preferred name or alias with your comment. If you work for or attend an institution that you would like to recognize, please mention that as well.

Recent winners. If you’ve won the puzzler in the past few months or if you work in geospatial imaging, please hold your answer for at least a day to give less experienced readers a chance to play.

Releasing Comments. Savvy readers have solved some puzzlers after a few minutes. To give more people a chance to play, we may wait 24 to 48 hours before posting comments.

Good luck!

Responding to Hurricane Florence with NASA Data

September 21st, 2018 by Kasha Patel

Early on September 12, 2018, astronaut Alex Gerst shot this photograph of Florence’s eye as viewed from the International Space Station. He tweeted: “Ever stared down the gaping eye of a category 4 hurricane? It’s chilling, even from space.” Credit: ISS Photograph by Alex Gerst, European Space Agency/NASA

When Hurricane Florence approached the Carolinas, the NASA Disasters Program began providing a suite of satellite data products to disaster responders, such as the Federal Emergency Management Agency (FEMA) and the National Guard. The goal was to provide the latest information for decision-making on everything from evacuations to supply routes to recovery estimates.

Andrew Molthan is a research meteorologist at NASA’s Marshall Space Flight Center who serves as a “disaster coordinator” for the disasters program. This week, he has been sitting at the FEMA National Response Coordination Center in Washington, D.C., to facilitate coordination of NASA data. We asked him a few questions to better understand the NASA Disaster Program’s role during Hurricane Florence.

What is your role at FEMA this week?

I am here at FEMA to better understand the agency’s geospatial needs during a major disaster, to help improve coordination, and to lend additional remote sensing and/or meteorological expertise where I can.

I am also helping with coordination and data exploitation for the Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) instrument aboard NASA’s C-20A, a piloted research aircraft. As a team, colleagues from NASA centers all over the country—Marshall, Headquarters, Jet Propulsion Laboratory, Armstrong Flight Research Center, Goddard Space Flight Center, and Langley Research Center—are working collaboratively to help target the UAVSAR instrument for daily radar imaging over the most critical rivers of interest to FEMA, the National Guard, and other partners. Scientists are assisting agencies in the interpretation of the UAVSAR imagery to inform immediate response efforts. They will also further process the data and use it as part of longer-term efforts to improve flood remote sensing and improve streamflow and inundation models.

Above: Civil Air Patrol photo taken on September 18, 2018 near Cheraw, SC. Credit: Civil Air Patrol

Above: UAVSAR polarimetric decomposition image taken on September 17, 2018 near Cheraw, SC (same area as Civil Air Patrol photo above). Pink denotes urban areas whereas red/orange denotes inundated forests. Dark blue or black are flooded open water; roads can be black even if not flooded. Green, yellow, and light blue color denote areas that are not flooded. Note: Red — Double bounce scattering (flooded forests and urban); Green – Volume scattering (unflooded forests); Blue – specular scatters (dry bare ground, open water). Credit: Yunling Lou/JPL, Bruce Chapman/JPL and Gerald Bawden/HQ

What NASA products are being shared with the National Guard and FEMA?

Most of our activities have focused on helping with the remote sensing of flooded areas following the heavy rains associated with Hurricane Florence. Many river basins in southern Virginia, central and eastern North Carolina, and northeastern South Carolina have experienced widespread river flooding and flash flooding that has affected citizens and needs to be monitored for response efforts.

Above: This GPM IMERG visualization shows storm-total accumulated rainfall for September 12–17, 2018 (left) vs. a sequence of 3-hour accumulations (right). Credit: NASA

NASA Marshall team members are producing products and assisting with event coordination including my spot here at FEMA supporting their geospatial team. Scientists with the Jet Propulsion Laboratory (the ARIA team) are routinely generating flood- and damage proxy maps. Goddard researchers are assisting with optical and radar flood detections. The Langley Research Center is assisting with data access and sharing via GIS platforms. NASA Headquarters is supporting overall agency coordination. Johnson Space Center is helping to acquire dramatic footage of the storm and aftermath from astronaut photography.

What instruments are being used?

The extensive cloud cover from the storm has blocked surface views from instruments operating at visible, near-infrared, and thermal wavelengths, so synthetic aperture radar (SAR) information has been critical. SAR has the ability to “see” through clouds, making it an all-weather instrument. Sources include images from the European Space Agency’s Sentinel-1A/1B platforms, as well as international and commercial partner assets, such as the Japan Aerospace Exploration Agency’s ALOS-2, the Canadian Space Agency’s Radarsat-2, and the German TerraSAR-X, all made available through government partnerships and the International Charter on Space and Major Disasters.

This flood proxy map shows the extent of flooding 36 hours after the hurricane’s landfall (September 15, 2018, at 18:57 local time). The map is derived from Synthetic Aperture Radar (SAR) data from the Copernicus Sentinel-1 satellites, operated by the European Space Agency (ESA).

As skies are now beginning to clear, we’ll also look for opportunities to use other NASA satellite remote sensing assets — including Terra/Aqua MODIS, Suomi-NPP VIIRS, Landsat 8 — and applications to identify water on the surface. We’ll also take a look at nighttime light imaging from Suomi-NPP VIIRS and the day-night band, using the NASA Black Marble and Black Marble HD products generated at Goddard.

Above: The VIIRS instrument on the joint NASA/NOAA Suomi NPP satellite observed Hurricane Florence as it developed in the Atlantic Ocean and made landfall in North Carolina on Sept. 14, 2018. Credits: NASA Worldview


Help Make a Better World Land Map with NASA App

September 13th, 2018 by Kathryn Hansen


Starting this month, you can be part of a project to create more detailed satellite-based global maps of land cover by sharing photos of the world around you in a new NASA citizen science project.

The project is a part of GLOBE Observer, a citizen science program that lets you contribute meaningful data to NASA and the science community. The GLOBE Observer app, introduced in 2016, includes a new “Land Cover: Adopt a Pixel” module that lets citizen scientists use their smartphones to photograph the landscape, identify the kinds of land cover they see (trees, grass, etc.), and then match their observations to satellite data. Users can also share their knowledge of the land and how it has changed.

“Adopt a Pixel” is designed to fill in details of the landscape that are too small for global land-mapping satellites to see.

“Even though land cover is familiar to everyone on the planet, the most detailed satellite-based maps of global land cover are still on the order of hundreds of meters per pixel. That means that a park in a city may be too small to show up on the global map,” says Peder Nelson, a land cover scientist at Oregon State University.

Holli Kohl, coordinator for the project, says: “Citizen scientists will be contributing photographs focused on a 50-meter area in each direction, adding observations of an area up to about the size of a soccer field. This information is important because land cover is critical to many different processes on Earth and contributes to a community’s vulnerability to disasters like fire, floods or landslides.”

To kickstart the data collection, GLOBE Observer is challenging citizen scientists to map as much land as possible between Sept. 22, Public Lands Day, and Oct. 1, NASA’s 60th anniversary. The 10 citizen scientists who map the most land in this period will be recognized on social media and will receive a certificate of appreciation from GLOBE Observer.

The free GLOBE Observer app is available from Google Play or the App Store. Once you download the app, register, and open the Land Cover module, an interactive tutorial will teach you how to make land cover observations.

“We created GLOBE Observer Land Cover to be easy to use,” says Kohl. “You can simply take photos with your smartphone, submit them, and be done, if you like. But if you want to take it a step further, you can also classify the landscape in your photo and match it to satellite data.”

Scientists like Nelson anticipate using the photographs and associated information to contribute to more detailed land cover maps of Earth.

“Some parts of the world do have high spatial resolution maps of land cover, but these maps do not exist for every place, and the maps are not always comparable. Efforts like GLOBE Observer Land Cover can fill in local gaps, and contribute to consistent global maps,” says Allison Leidner, GLOBE program manager at NASA Headquarters in Washington.

Changes in land cover matter because land cover can alter temperatures and rainfall patterns. Land cover influences the way water flows or is absorbed, potentially leading to floods or landslides. Some types of land cover absorb carbon from the atmosphere, and when subject to changes, such as a forest burned in a wildfire, result in more carbon entering the atmosphere. Improved land cover maps will provide a better baseline to study all of these factors at both global and local scales, particularly as scientists integrate improved land cover maps into global models.

The data aren’t just for scientists. “Everyone will have access to this data to understand local change,” says Nelson. Citizen scientists who participate will be creating their own local land cover map. The data will be available to anyone through the GLOBE web site.

To learn more, follow GLOBE Observer on Facebook @nasa.globeobserver or Twitter @NASAGO or visit the GLOBE Observer web site. Read the full story here.

August Puzzler

August 28th, 2018 by Kathryn Hansen

Every month on Earth Matters, we offer a puzzling satellite image. The August 2018 puzzler is above. Your challenge is to use the comments section to tell us what we are looking at and why this place is interesting.

How to answer. You can use a few words or several paragraphs. You might simply tell us the location. Or you can dig deeper and explain what satellite and instrument produced the image, what spectral bands were used to create it, or what is compelling about some obscure feature in the image. If you think something is interesting or noteworthy, tell us about it.

The prize. We can’t offer prize money or a trip to Mars, but we can promise you credit and glory. Well, maybe just credit. Roughly one week after a puzzler image appears on this blog, we will post an annotated and captioned version as our Image of the Day. After we post the answer, we will acknowledge the first person to correctly identify the image at the bottom of this blog post. We also may recognize readers who offer the most interesting tidbits of information about the geological, meteorological, or human processes that have shaped the landscape. Please include your preferred name or alias with your comment. If you work for or attend an institution that you would like to recognize, please mention that as well.

Recent winners. If you’ve won the puzzler in the past few months or if you work in geospatial imaging, please hold your answer for at least a day to give less experienced readers a chance to play.

Releasing Comments. Savvy readers have solved some puzzlers after a few minutes. To give more people a chance to play, we may wait 24 to 48 hours before posting comments.

Good luck!

Answer: About two billion years ago, an asteroid hit Earth southwest of what is now Johannesburg, South Africa, and formed Vredefort Crater—the world’s oldest and largest known impact structure. Layers of upturned rock eroded at different rates and produced the concentric pattern visible in the image above. Congratulations to Felix Bossert and Tom for correctly identifying the feature. Read more about the image in our September 1, 2018, Image of the Day.

Karenia brevis cells. Image credit: Mote Marine Laboratory

Put a sample of water from the Gulf of Mexico under a microscope, and you will often find cells of Karenia brevis swimming around. The microscopic algae—the species of phytoplankton responsible for Florida’s worst red tide outbreaks—produce brevetoxin, a compound that in high concentrations can kill wildlife and cause neurological, respiratory, and gastrointestinal issues for people.

Under normal conditions, water quality tests find, at most, a few hundred K. brevis cells per liter of water—not enough to cause problems. But in August 2018, in the midst of one of the most severe red tide outbreaks to hit Florida’s Gulf Coast in a decade, water samples regularly contained more than one million K. brevis cells per liter.
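The contrast between background counts and bloom counts lends itself to a simple threshold check. The sketch below is purely illustrative; the category names and cutoffs are assumptions for this example, not an official monitoring scale, and only the "a few hundred is harmless" and "over one million is severe" endpoints come from the text above.

```python
# Hypothetical classifier based on the K. brevis cell counts in the text.
# Intermediate thresholds are illustrative assumptions, not an official scale.
def red_tide_status(cells_per_liter: int) -> str:
    if cells_per_liter < 1_000:
        return "background"   # a few hundred cells: typically no problems
    elif cells_per_liter < 100_000:
        return "low"
    elif cells_per_liter < 1_000_000:
        return "medium"
    return "high"             # August 2018 samples regularly exceeded 1M

print(red_tide_status(300))        # background
print(red_tide_status(1_500_000))  # high
```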

Natural-color MODIS satellite image of algae staining the water off the coast of Fort Myers on August 19, 2018. Image from the University of South Florida Near Real-Time Integrated Red Tide Information System (IRIS).

That was enough to stain large swaths of coastal waters shades of green and brownish-red and leave beaches littered with rotting fish carcasses. Roughly 100 manatees, more than 200 sea turtles, and at least 12 dolphins have been killed by red tides, according to preliminary estimates. For much of August, the toxic bloom stretched about 130 miles (200 kilometers) along Florida’s Gulf Coast, from roughly Tampa to Fort Myers. Though the bloom has been active since October 2017, it intensified rapidly in July 2018. The damage grew so severe and widespread that Florida’s governor declared a state of emergency in mid-August.

One of the best ways to test for the presence of K. brevis is to analyze water samples collected from boats or beaches. State environmental agencies do this on a regular basis, but with ground sampling alone it is a challenge to understand the full extent and evolution of fast-changing blooms, or to predict where they will move.

Sampling red tide in 2018 (left). An aerial view of red tide in 2005 (right). Photo credits: Florida Fish and Wildlife Conservation Commission.

That’s why key red tide monitoring systems, such as the National Oceanic and Atmospheric Administration’s (NOAA’s) Harmful Algal Bloom Forecast System and the Near Real-Time Integrated Red Tide Information System (IRIS) from the University of South Florida, make use of satellite data from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensors on NASA’s Aqua and Terra satellites. These sensors pass over Florida’s Gulf Coast twice a day, acquiring data at several wavelengths that can be useful for identifying and mapping the spatial extent of algal blooms. Other satellite sensors, such as the Visible Infrared Imaging Radiometer Suite (VIIRS) on Suomi NPP and the Ocean and Land Colour Instrument (OLCI) on Sentinel-3, collect information that can be used to monitor red tides as well.

A screenshot from the University of South Florida’s Near Real-Time Integrated Red Tide Information System (IRIS). The image shows various types of data captured by the MODIS sensor on Terra on August 19, 2018. Solar-stimulated fluorescence data (NFLH) is particularly useful for locating algal blooms. Image Credit: USF/IRIS

Despite the utility of satellite observations, there are some significant challenges to interpreting satellite data of algal blooms in shallow, coastal waters, explained oceanographer Chuanmin Hu of the University of South Florida. Chief among them: it can be quite difficult to distinguish between algal blooms, suspended sediment, and colored dissolved organic matter (CDOM) that flows into coastal areas.

The Karenia brevis bloom expanded and intensified in late July. This fluorescence data comes from the OLCI sensor on Sentinel-3. Image courtesy of Rick Stumpf, NOAA.

To get around this problem and make satellites better at pinpointing algal blooms, Hu and colleagues at the University of South Florida have developed a red tide monitoring system that makes use of MODIS observations of fluorescence, which algal blooms emit in response to exposure to sunlight. “If we have fluorescence data to go along with a natural-color image from MODIS, we can say with a high degree of confidence where the algal blooms are and where the sensor is just detecting sediment or CDOM,” he said. When fluorescence data is available, the Florida Fish and Wildlife Conservation Commission pushes it out to the public as part of its red tide status updates (see the August 21 update below).
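In essence, the idea Hu describes is a per-pixel decision rule. The sketch below is purely illustrative; the threshold values, units, and function names are invented for the example, and real ocean-color algorithms are far more sophisticated:

```python
def looks_like_bloom(nflh, chlorophyll):
    """Toy per-pixel test. High apparent chlorophyll alone could be
    sediment or CDOM, but an elevated fluorescence line height (nFLH)
    suggests living phytoplankton. Thresholds are hypothetical.
    """
    HIGH_CHL = 1.0    # mg/m^3, made up for illustration
    HIGH_NFLH = 0.02  # mW/cm^2/um/sr, made up for illustration
    if chlorophyll > HIGH_CHL and nflh > HIGH_NFLH:
        return "likely algal bloom"
    if chlorophyll > HIGH_CHL:
        return "possibly sediment or CDOM"
    return "clear water"

print(looks_like_bloom(0.05, 2.0))   # → likely algal bloom
print(looks_like_bloom(0.001, 2.0))  # → possibly sediment or CDOM
```

The point is the cross-check: fluorescence only distinguishes living algae from other material when it can be paired with the color data.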

Likewise, NOAA has combined a fluorescence method with a long-standing technique that identifies recent increases in chlorophyll concentration. The combination improves the identification of likely K. brevis blooms, and that information then gets incorporated into NOAA’s HAB Forecast System, noted Richard Stumpf, an oceanographer with NOAA.
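The chlorophyll-anomaly idea can be shown with a tiny example: compare a pixel’s current chlorophyll value against its own recent baseline, so that water that is always turbid (near a river mouth, say) does not trigger a false alarm. The window length and numbers here are hypothetical:

```python
def chlorophyll_anomaly(current, history):
    """Toy anomaly test: today's chlorophyll at a pixel minus its
    running mean over previous satellite passes. A sharp positive
    anomaly flags a *new* increase rather than persistently murky
    water. Window and values are illustrative only.
    """
    baseline = sum(history) / len(history)
    return current - baseline

recent = [0.5, 0.6, 0.5, 0.7, 0.6]  # hypothetical prior values, mg/m^3
print(round(chlorophyll_anomaly(2.4, recent), 2))  # → 1.82
```

A pixel whose chlorophyll jumps well above its recent baseline is a candidate bloom; one that is always high is not.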

K. brevis cell abundance shown on an ocean color satellite image from the IRIS system. Warmer colors indicate higher levels of chlorophyll a, an indicator of algae. Cloudy areas are gray. Circles indicate locations where officials tested water samples on the ground. Image Credit: FFW/USF/IRIS.

However, significant problems remain. Only about ten percent of MODIS passes collect usable fluorescence data; the rest of the time, images are marred by sunglint or clouds. And the algorithm that scientists use to detect algal blooms with MODIS does not work well within one kilometer of the coast, the part that is of the greatest interest to beachgoers and boaters.

The Terra satellite. Image Credit: NASA

To help fill in the gaps, NASA’s Applied Sciences program is working with several partner institutions on a smartphone app called HABscope. The app, developed by Gulf of Mexico Coastal Ocean Observing System (GCOOS) researcher Robert Currier, makes it possible for trained water samplers (typically lifeguards who participate in Mote Marine Laboratory’s Beach Conditions Reporting System) to collect video of water using microscopes attached to their smartphones.

After recording, HABscope uploads videos to a cloud-based server for automatic analysis by computer software. The software rapidly counts the number of K. brevis cells in a water sample by using technology similar to that found in facial recognition apps. But rather than focusing on facial features, the software looks for a particular pattern in the movement of K. brevis cells.

K. brevis are vigorous swimmers, often using a pair of long, whip-like flagella to migrate vertically about 10 to 20 meters (33 to 66 feet) each day. They chart a zig-zagging, corkscrew-shaped path that allows the software to easily pick them out amidst the cast of other phytoplankton found in Gulf of Mexico water samples.
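That distinctive motion is, in principle, straightforward to measure once cells are tracked from frame to frame. Here is a toy illustration of one way tracked paths could be screened for corkscrew-like turning; the post does not describe HABscope’s actual algorithm, so everything below is a hypothetical sketch:

```python
import math

def mean_turning_angle(path):
    """Average change in heading along a tracked cell path (radians).
    A corkscrew swimmer like K. brevis turns steadily in one direction;
    a straight swimmer or passive drifter does not. Purely illustrative;
    HABscope's real classifier is not described in the post.
    """
    turns = []
    for (x0, y0), (x1, y1), (x2, y2) in zip(path, path[1:], path[2:]):
        h1 = math.atan2(y1 - y0, x1 - x0)  # heading of first step
        h2 = math.atan2(y2 - y1, x2 - x1)  # heading of next step
        # wrap the heading change into (-pi, pi]
        turns.append(math.atan2(math.sin(h2 - h1), math.cos(h2 - h1)))
    return sum(turns) / len(turns)

# A looping, drifting track turns consistently; a straight track does not.
helix = [(2 * math.cos(0.5 * t) + 0.2 * t, 2 * math.sin(0.5 * t))
         for t in range(20)]
straight = [(float(t), 0.0) for t in range(20)]
print(abs(mean_turning_angle(helix)) > abs(mean_turning_angle(straight)))  # → True
```

A real system would also have to handle noisy detections, crossing tracks, and the many other swimming phytoplankton in a sample, but the consistent turning signature is what makes the corkscrew path easy to pick out.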

The data about K. brevis abundance at various locations along the coast is then fed into a respiratory distress forecasting tool managed by NOAA. “Respiratory distress forecasts can now be produced 1 to 2 times per day for specific beaches along the Florida Gulf Coast,” said Stumpf. “Previous to this project, these forecasts were issued at most twice a week, and only as general statements about risk within a county. The combination of earth observations with rapid field monitoring will increase the accuracy and usefulness of the forecasts.”

The research team that developed the HABscope app included oceanographers, ecologists, computer application developers, and public health experts. Photo Credit: Mote Marine Laboratory.