“Pho,” a cartoon character representing a photon of light from ICESat-2, illustrates how the science mission works. Watch the video here. Credit: NASA/Goddard/Savannah College of Art and Design et al.
The successful launch of a satellite is an exciting step for the scientists and engineers who have spent years dedicated to a mission. But there are still many more boxes to be checked, and anticipation builds as a satellite’s instruments are turned on and they produce what scientists call “first light” — the first time a satellite opens its “eyes” and delivers preliminary images or data.
After the successful launch of NASA’s Ice, Cloud, and land Elevation Satellite-2 (ICESat-2) on September 15, 2018, “first light” was not a natural-color image of Earth like those that come from satellites such as Terra, Aqua, or Landsat. Rather, the satellite’s sole instrument—the Advanced Topographic Laser Altimeter System (ATLAS)—acquired measurements of surface elevation. The laser fired for the first time on September 30, and returned its first height measurements from across the Antarctic ice sheet on October 3.
This visualization of ICESat-2 data shows the first set of height measurements from the satellite as it orbited over the Antarctic ice sheet. Credit: NASA’s Goddard Space Flight Center
This elevation measurement, acquired on October 3, shows the height of the Antarctic ice sheet along a path starting in East Antarctica, passing close to the South Pole, and ending in West Antarctica. As explained in a NASA story about the measurement:
“When scientists analyze the preliminary ICESat-2 data, they examine what is called a ‘photon cloud,’ or a plot of each photon that ATLAS detects. Many of the points on a photon cloud are from background photons — natural sunlight reflected off Earth in the exact same wavelength as the laser photons. But with the help of computer programs that analyze the data, scientists can extract the signal from the noise and identify the height of the ground below.”
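The height retrieval described above can be sketched in a few lines: convert each photon's round-trip travel time to a one-way range, subtract that range from the satellite's altitude, and treat the densest cluster of heights as the ground return. The sketch below is a toy illustration, not the operational ATLAS algorithm; the 500-kilometer altitude, the one-meter bin size, and the function names are all illustrative assumptions.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def times_to_heights(round_trip_s, satellite_altitude_m):
    """One-way range is half the round-trip time multiplied by the speed
    of light; surface height is the satellite altitude minus that range
    (a flat-Earth simplification)."""
    one_way_range = C * np.asarray(round_trip_s) / 2.0
    return satellite_altitude_m - one_way_range

def ground_height(heights, bin_m=1.0):
    """Separate signal from noise the crude way: histogram the photon
    heights and take the center of the densest bin as the ground."""
    heights = np.asarray(heights)
    bins = np.arange(heights.min(), heights.max() + bin_m, bin_m)
    counts, edges = np.histogram(heights, bins=bins)
    i = counts.argmax()
    return 0.5 * (edges[i] + edges[i + 1])
```

Because background photons are spread roughly uniformly in time while laser returns from the surface pile up at one height, even this simple histogram picks out the ground.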
These charts show the first photon returns from the instrument’s six beams as the satellite orbited over Antarctica. The green lines represent the number of photons detected. The X axis indicates the amount of time it took the photons to get from the satellite to the ground and back again. Credit: Megan Bock/NASA’s Goddard Space Flight Center.
Donya Douglas-Bradshaw, the project manager for the ATLAS instrument, said in the NASA story:
“We were all waiting with bated breath for the lasers to turn on and to see those first photons return. Seeing everything work together in concert is incredibly exciting. There are a lot of moving parts and this is the demonstration that it’s all working together.”
Earlier this month, we showed a space-based view of a glory—a colorful, circular optical phenomenon caused by water droplets scattering light. The Moderate Resolution Imaging Spectroradiometer on the Terra and Aqua satellites can see only a cross section of the glory, making it appear in satellite imagery as two elongated bands parallel to the path of the satellite.
To the Earth-bound observer, however, glories take on a circular shape. You might have seen one while on an aircraft. From this perspective, passengers on the side of the plane directly opposite the Sun can sometimes see the plane’s shadow on the clouds below. This position is also where glories can be observed, as the cloud’s water droplets scatter sunlight back toward its source.
Before aviation, the phenomenon was often seen by mountain climbers, with the glory encircling the climber’s shadow on the clouds below. Today, pilots and passengers have a good chance of seeing them, earning the phenomenon the name “glory of the pilot” or “pilot’s halo.”
But you still have to be flying close enough to the cloud deck for the phenomenon to be visible, which is one of the reasons why they are frequently spotted by scientists and crew with NASA’s Operation IceBridge mission. The mission makes annual flights over Earth’s poles to map the ice; flights are relatively low, long, and frequent. Glories are not part of the mission’s science goals—in fact, clouds can interfere with the collection of science data. But they are on the list of natural wonders that IceBridge scientists witness in the field.
Jeremy Harbeck, a sea ice scientist at NASA’s Goddard Space Flight Center, snapped the top photograph of a glory on April 18, 2018, during an IceBridge science flight over the Chukchi Sea. (See the full image and other photographs shot by Harbeck during the Arctic 2018 IceBridge campaign here.)
“I remember having taken more images of glories, especially down over Antarctica, as we see them quite often down there,” Harbeck said. “From what I remember, they’re not outside the window all the time, but you can catch them here and there on flights when conditions are right.”
One such Antarctic glory is visible in the image above, snapped by Michael Studinger during an IceBridge flight on October 26, 2010. Read more about that image here.
Remember the year 2000? Bill Clinton was president of the United States, Faith Hill and Santana topped Billboard music charts, and the world’s computers had just “survived” the Y2K bug. It also was the year that NASA’s Terra satellite began collecting images of Earth.
Eighteen years later, the versatile satellite — with five scientific sensors — is still operating. For all of that time, the satellite’s Moderate Resolution Imaging Spectroradiometer (MODIS) has been collecting daily data and imagery of the Arctic — and the rest of the planet, too.
If you knew where to look and were willing to wait patiently for file downloads, the images have always been available on specialized websites used by scientists. But there was no quick-and-easy way for the public to browse the imagery. With the recent addition of the full record of MODIS data into NASA’s Worldview browser, checking on what was happening anywhere in the world on any day since 2000 has gotten much easier.
Say you want to check on the weather in your hometown on the day you or your child was born. Just navigate to the date on Worldview, and make sure that the MODIS data layer is turned on. (In the image below, you can tell the Terra MODIS data layer is on because it is light gray.)
This Worldview screenshot shows the first day that Terra MODIS collected data — February 24, 2000. The very first Terra scene showed northern Argentina and Chile. Credit: EOSDIS.
One of the things I love about having all this MODIS data at my fingertips is that it makes it possible to see the passage of relatively long periods of time in just a few minutes. Look, for instance, at the animation at the top of this page, generated by Delft University of Technology ice scientist Stef Lhermitte using Worldview.
Lhermitte summoned every natural-color MODIS image of the Arctic that Terra and Aqua (which also has a MODIS instrument) have collected since April 2003. The result — a product of 71,000 satellite overpasses — is a remarkable six-minute time capsule of swirling clouds, bursts of wildfire smoke, the comings and goings of snow, and the ebb and flow of sea ice.
Though beautiful, Lhermitte’s animation also has a troubling side to it. If you look carefully, you can see the downward trend in sea ice extent. Look, for instance, at mid-August and September 2012 — the period when Arctic sea ice extent hit a record-low minimum of 3.4 million square kilometers. Amid the heavy cloud cover, you will see lots of dark open water. Compare that to the same period in 2003, when the minimum extent was 6.2 million square kilometers. Scientists attribute the loss of sea ice to global warming.
NASA Earth Observatory chart by Joshua Stevens, using data from the National Snow and Ice Data Center.
Earth Matters had a conversation with Lhermitte to find out why he made the clip and what stands out about it. MODIS images of notable events that Lhermitte mentioned are interspersed throughout the interview. All of the images come from the archives of NASA Earth Observatory, a website that was founded in 1999 in conjunction with the launch of Terra.
What prompted you to create this animation?
The extension of the MODIS record back to the beginning of the mission in the Worldview website triggered me to make the animation. As a remote sensing scientist, I often use Worldview to put things into context (e.g. for studying changes over ice sheets and glaciers). Previously, Worldview only had data until 2010.
What do you think are the most interesting events or patterns visible in the clip?
I think the strength of the video is that it contains so many of them, and it allows you to see them all in one video. The ones that are most striking to me are:
An Aqua MODIS image of a bloom in the Barents Sea on August 14, 2011. Image by Jeff Schmaltz, MODIS Rapid Response Team at NASA GSFC.
+ algal blooms in the Barents Sea
+ declining sea ice extent. You can see this both annually and over the longer term.
+ changing snow extent. You can see this each summer, especially over Canada and Siberia.
+ summer wildfire smoke in Canada (2004, 2005, 2009, 2014, 2017) and Russia (2006, 2011, 2012, 2013, 2014, 2016)
+ albedo reductions (reduction in brightness) over the Greenland Ice Sheet in 2010 and 2012 related to strong melt years.
+ overall eastward atmospheric circulation
+ the Grímsvötn ash plume (21 May 2011)
How did you make it? Was it difficult from a technical standpoint?
It was simple. I just downloaded the MODIS quicklook data from the Worldview archive using an automated script. Afterwards, I slightly modified the images for visualization purposes (e.g. overlaying country borders, clipping to a circular area) and stitched everything together into a video.
When you sit back and watch the whole video, how does it make you feel?
On the one hand, I am fascinated by the beauty and complexity of our planet. On the other hand, as a scientist, it makes me want to understand its processes even better. The video shows so many different processes at different scales, from natural processes (annual changes in snow cover and the Vatnajökull ash plume) to climate change related changes (e.g. the long term decrease in sea ice).
Terra MODIS image of the eruption of Grímsvötn Volcano in Iceland on May 22, 2011. NASA image by Jeff Schmaltz, MODIS Rapid Response Team.
There are some gaps during the winter where the extent of the sea ice abruptly changes. Can you explain why?
I used the standard reflectance products, which show the reflected sunlight. I decided to leave all dates out where part of the Arctic is without sunlight during satellite overpasses (approximately 10:30 a.m. and 1:30 p.m. local time). The missing data due to the polar night are very prominent if you compile the complete record including winter months, and I did not want it to distract the viewer from the more subtle changes in the video.
A Terra MODIS image of smoke and fires in Siberia on June 29, 2012. NASA image by Jeff Schmaltz, LANCE MODIS Rapid Response.
In some parts of the world, saline lakes are common features. Take, for instance, the image below, from our January 2017 article about fires in Argentina. Yet common as they are, saline lakes are an environment unto themselves.
Lakes cover about 4 percent of the Earth’s land surface. Many of the largest ones (by area) are salty: Utah’s Great Salt Lake, the Caspian Sea (arguably the world’s biggest lake), Iran’s Lake Urmia, and the Dead Sea. Unlike marine and brackish waters, saline lakes typically form inland and do not connect to the ocean. Many are ephemeral, filling with water in periods of increased rainfall and drying out under the Sun.
Just how salty are these lakes? That depends on location and season. The Dead Sea has a salinity of 34 percent, while the Great Salt Lake varies between 10 and 30 percent; the same is true of Lake Urmia. (For comparison, open ocean waters average a 3.5 percent salt content.)
Australia’s scorching hot weather and scant rainfall make it a hotbed for saline lakes—thousands of them. In her story on the colorful salt lakes Down Under, my colleague Kathryn Hansen describes how they formed:
Millions of years ago, declines in rainfall caused river flows to ebb and river valleys to fill in with sediment. Wind then sculpted the loose sediment to form the lake basins that remain today. (The wind also sculpted some of the lighter sediments into parallel dunes that fringe each lake downwind to the east-southeast.) Some of the lakes now fill with runoff directly from the Stirling Range; others are controlled primarily by groundwater.
The lakes below in Western Australia range from pea-soup brown to pinkish in hue. Their color changes based on sediments, aquatic and terrestrial plant growth, water chemistry, algae, and hydrology.
Image: NASA Earth Observatory/Joshua Stevens
At Urmia, the rise and fall of the lake also has an effect on water color:
The color changes have become common in the spring and early summer due to seasonal precipitation and climate patterns. Spring is the wettest season in northwestern Iran, with rainfall usually peaking in April. Snow on nearby mountains within the watershed also melts in the spring. The combination of rain and snowmelt sends a surge of fresh water into Lake Urmia in April and May. By July, the influx of fresh water has tapered off and lake levels begin to drop.
The fresh water in the spring drives salinity levels down, but the lake generally becomes saltier as summer heat and dryness take hold. That’s when the microorganisms show their colors, too.
While many salt lakes vary in size according to rainfall, some, like Lake Urmia, have been shrinking in recent years.
Image: NASA Earth Observatory/Joshua Stevens
Hot, sunny days help create saline lakes by evaporating massive amounts of water, but salt lakes can also occur in cold climes. For instance, Don Juan Pond sits in Antarctica’s McMurdo Dry Valleys, where winter temperatures can drop to -50 degrees Celsius (-58 degrees Fahrenheit). Don Juan is so salty that its waters rarely freeze. Its extreme environment resembles that of Mars. While the lake is far too salty and cold for even salt-loving brine shrimp, it may house microorganisms, Brown University geologist Jay Dickson told NASA Earth Observatory.
“There is certainly biology in the vicinity of the pond and some evidence for biologic activity in the pond itself, but this activity could be explained by abiotic processes,” Dickson said. “Mars has a lot of salt and used to have a lot of water.”
Eighty-five years ago today, Wilson Alwyn Bentley died of pneumonia. It was December 23, 1931, and outside his home in Jericho, Vermont, the sky was ripe for snow. His final diary entry read: “Cold north wind afternoon. Snow Flying.” It was the sort of weather he had lived for.
Bentley began to observe snow at 15 years old, when his mother bought him a microscope. It consumed his days, he later recalled:
When the other boys of my age were playing with popguns and sling-shots, I was absorbed in studying things under this microscope: drops of water, tiny fragments of stone, a feather dropped from a bird’s wing, a delicately veined petal from some flower. But always, from the very beginning, it was snowflakes that fascinated me most. The farm folks, up in this north country, dread the winter; but I was supremely happy.
Over the subsequent decades, Bentley photographed more than 5,000 snowflakes, and created the first photomicrograph of an ice crystal by combining a microscope with an accordion-shaped bellows camera. A self-taught student of natural history who became known as “the snowflake man,” he demonstrated what is now a common adage: no two snowflakes are alike.
Photos: Snowflakes circa 1902 by Wilson Bentley. Courtesy of Wikimedia Commons.
The snow Bentley observed at a microscopic scale, satellites now see at the size of whole continents. The subject matter that fascinated him still provides fodder for heaps of scientific papers today.
December 2016 has brought a flurry of Bentley-esque weather to the U.S. East Coast, including snow and biting winds. But long before winter kicked into full gear in cities like Washington and New York, it painted landscapes around the world white. From Japan to Kazakhstan, NASA satellites observed snow from space. So if you’re dreaming of a white Christmas or waiting for the gingerbread cookies to bake, here are several images—some we previously published, and others that are quite fresh.
Image: NASA Earth Observatory/Joshua Stevens, using Landsat data from the U.S. Geological Survey.
Snow fell in the Algerian town of Ain Sefra for the first time in nearly four decades this December. Satellites show a thin, white veil covering the orange sand dunes of the northern Sahara. The Enhanced Thematic Mapper Plus (ETM+) on the Landsat 7 satellite captured this natural-color image of snow on December 19, 2016. On the ground, photographer Karim Bouchetata captured the snow before it melted.
Image: NASA Earth Observatory/Joshua Stevens using VIIRS day-night band data from the Suomi National Polar-orbiting Partnership.
An Arctic air mass brought more snow to communities around the Great Lakes on December 14, 2016. The lake-effect snow came on the heels of an earlier accumulation that piled up several feet of snow in some areas, according to reports. Officials issued weather warnings and advisories from northeast Ohio to upstate New York.
Image: NASA Earth Observatory/Jeff Schmaltz, using MODIS data from LANCE/EOSDIS Rapid Response.
A plume of volcanic ash hangs over the Gulf of Alaska in this natural-color image. The plume is not the product of an active volcano; it contains re-suspended ash from the 1912 eruption of Novarupta, according to the Alaska Volcano Observatory.
Photograph: Astronaut photography from the Expedition 48 crew.
The white hills sprawling in every direction look like mounds built by snow plows, or massive hills of sugar. In fact, they’re the world’s largest gypsum dune field: the White Sands National Monument, located in southern New Mexico.
During the last Ice Age, melting snow and ice from the San Andres Mountains (west of the dunes) and the Sacramento Mountains (to the east) eroded minerals from the hillsides and carried them downhill to the basin below. As the climate warmed and the water evaporated, it left the basin full of selenite (the crystalline form of gypsum), creating the Alkali Flat. Over time, winds broke the crystals into sand grains, which built up into the dunes.
Image: NASA Earth Observatory/Joshua Stevens, using MODIS data from LANCE/EOSDIS Rapid Response.
Image: NASA Earth Observatory/Pola Lem, using MODIS data from LANCE/EOSDIS Rapid Response.
November 20: White hills appear amid green in England
Snow arrived in parts of northern England before a satellite took this image on November 20. According to reports, the same storm brought high winds and lashing rain farther south, in Kent and Sussex.
Image: NASA Earth Observatory/Pola Lem, using MODIS data from LANCE/EOSDIS Rapid Response.
November 17: White Kazakhstan
Broad swaths of Kazakhstan turned white before this Visible Infrared Imaging Radiometer Suite (VIIRS) image was taken on November 17, 2016. Parts of the country saw temperatures dipping well below freezing, with extreme wind chills. That’s not a surprise for locals; the country’s capital, Astana, often experiences frigid days in winter months.
Image: NASA Earth Observatory/Jesse Allen, using EO-1 ALI data provided courtesy of the NASA EO-1 team.
More than a year after the latest eruption at Iceland’s Holuhraun lava field, the newly formed lava may still be toasty underneath. Although the basaltic rock has formed a hard crust, volcanologists say the flow is probably still hot enough to prevent snow from building up atop it.
“You’re not going to freeze the lava flow,” said Erik Klemetti, a volcanologist at Denison University and author of the Eruptions blog at Wired magazine. “You need to wait for the soil to freeze to get snow to accumulate in temperate latitudes.”
Snow dusted parts of Nebraska on October 6, 2016. This natural-color image, acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA’s Terra satellite, shows a white blanket covering about 40 square miles (more than 100 square kilometers). Photos taken by highway cameras showed snow falling along parts of U.S. Route 83 on the afternoon of October 6.
November 21st, 2016 by Abbey Nastan, MISR team at JPL
The science team behind the Multi-angle Imaging SpectroRadiometer (MISR) on NASA’s Terra satellite frequently publishes special images called stereo anaglyphs. For example, you might have seen our recent series of anaglyphs celebrating the centennial of the National Park Service. But what exactly is an anaglyph, and how is one made from MISR data?
All methods of viewing images in three dimensions rely on the fact that our two eyes see things at slightly different angles; this is what gives us depth perception. As a simple demonstration, hold a finger at a short distance from your face, and close one eye at a time. You will notice that your finger appears to be in a different place with each eye. The horizontal distance between the two versions of your finger is called the parallax. Your brain interprets the amount of parallax to tell you how far away your finger is from your face—the greater the parallax, the closer your finger!
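The finger demonstration above is simple triangle geometry, and the distance can be computed directly from the baseline (the separation between the two viewpoints) and the parallax angle. A minimal sketch; the function name and the 6.5-centimeter eye separation are illustrative assumptions, not values from the text.

```python
import math

def distance_from_parallax(baseline, parallax):
    """Distance to an object, given the separation between two viewpoints
    (baseline, meters) and the parallax angle between the object's two
    apparent positions (radians). The viewpoints and the object form an
    isosceles triangle, so d = (baseline / 2) / tan(parallax / 2)."""
    return (baseline / 2.0) / math.tan(parallax / 2.0)

# The greater the parallax, the closer the object:
eye_separation = 0.065  # ~6.5 cm, a typical adult interpupillary distance
near = distance_from_parallax(eye_separation, math.radians(12.0))
far = distance_from_parallax(eye_separation, math.radians(1.0))
```

With these numbers, a 12-degree parallax puts the finger roughly 30 centimeters from the face, while a 1-degree parallax corresponds to an object a few meters away.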
However, your brain can also be tricked into thinking that a perfectly flat picture is actually a three-dimensional object by presenting each eye with a slightly different version of the picture. The first 3D viewing technology was the stereoscope, originally invented by Sir Charles Wheatstone in 1838. The stereoscope takes two images viewed from slightly different angles and mounts them next to each other. The photos are viewed using fixed lenses that fool the brain into thinking that it is looking at one picture. Stereoscopes worked well, but their major drawback was that they could only be used by one person at a time.
In 1858, Joseph D’Almeida, a French physics professor, invented a method of showing stereoscopic images to many people at once using a lantern projector equipped with red and blue filters. The viewers wore red and blue goggles. Later, Louis Ducos du Hauron adapted this technique to allow anaglyphs to be printed and viewed on paper. In 1889, William Friese-Greene created the first anaglyphic motion picture. These early 3D movies were nicknamed “plastigrams” and were very popular by the 1920s.
At the most basic level, anaglyphs work by superimposing images taken from two angles. The two images are printed in different colors, usually red and cyan, and the viewer wears glasses with lenses in the same colors, which filter out the unwanted image for each eye. So, if the image for the right eye is printed in red, it can be seen through the cyan lens placed over the right eye, but not through the red lens over the left eye, and vice versa. The brain, seeing a different picture through each eye, interprets this as a three-dimensional scene.
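The superposition described above is easy to emulate digitally: put one view in the red channel of an RGB image and the other in the green and blue (cyan) channels. A minimal NumPy sketch, following the channel assignment in the paragraph (the right-eye view printed in red); the function name is an illustrative assumption.

```python
import numpy as np

def make_anaglyph(right_gray, left_gray):
    """Combine two grayscale views (uint8 arrays of shape H x W) into one
    RGB red-cyan anaglyph: the right-eye view fills the red channel, the
    left-eye view fills the green and blue (cyan) channels."""
    right = np.asarray(right_gray, dtype=np.uint8)
    left = np.asarray(left_gray, dtype=np.uint8)
    return np.stack([right, left, left], axis=-1)  # channel order: R, G, B
```

Viewed through matching glasses, each eye's lens passes its intended channel and blocks the other, recreating the stereo pair.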
The MISR instrument can be used to make anaglyphs because it has nine cameras, each fixed to point at a different angle. As MISR passes over a particular feature on Earth, it captures nine images spanning a range of 140 degrees (diagram above). Any two of these images can be combined to make an anaglyph. The greater the angular difference between the images, the greater the resulting 3D effect; however, if the angular difference is too great, the brain will be unable to interpret the image.
Anaglyphs made with MISR must be rotated so that the north-south direction is roughly horizontal. Though this is inconvenient — we are used to viewing the Earth with north at the top — it is necessary because Terra flies from north to south, and MISR’s cameras are aligned to take images along that track. Therefore, the angular difference between the images is in the north-south direction. Since our eyes are arranged horizontally, the angular difference between the anaglyph images must be horizontal as well.
You can see this by comparing two versions of an anaglyph of Denali, Alaska (below). In the version with north upwards (left), the 3D effect does not work. But when the image is rotated so that north is to the left, suddenly the mountains pop out.
Anaglyphs are useful for science because they allow us to intuitively understand the three-dimensional structure of things like hurricanes and smoke plumes. For example, examine the three-panel image of Typhoon Nepartak below. (All three images have been rotated so north is to the left).
In the top, single-angle image, the eye of the storm appears to be quite deep due to the shadows, but otherwise it is difficult to determine how high the clouds are. Compare this to the middle image, which shows the results from MISR’s cloud top height product; it uses a computer algorithm to compare the data from multiple cameras and determine the geometry of the clouds. Now we can tell that clouds in the central part of the storm are very high (except for the eye), while the spiral cloud bands are slightly lower and there are very low clouds between the arms. However, understanding this data set requires us to interpret the color key and have at least a rudimentary idea of how 16 kilometers compares to 4 kilometers.
Now put on your red-blue glasses* and look at the anaglyph in the third image. All of the features are immediately understood by our brains. While it takes a few minutes (or paragraphs) of explanation to introduce a first-time viewer to MISR datasets, the red-blue glasses make it possible to enjoy the same experience with a simple image. This is why anaglyphs make great tools for scientists as well as for sharing unique views of Earth’s features with the public.
Editor’s note: If you don’t have a pair of red-blue glasses, this page lists companies that sell them. Or if you can find some red and blue plastic wrap, you can make your own. The instructions are here.
Photographs by Scott Kelly/NASA. Sunrise (upper); sunset (lower).
My colleagues and I spend most of our time looking for stories, images, and data related to the latest and greatest remote sensing science at NASA and beyond. This often leads us to rather technical scientific journals and obscure websites that are hardly known for their artistry.
But every now and then during the course of a workday, we stumble across an image that is simply so gorgeous that we cannot resist sharing it. The first image above, tweeted from the International Space Station by astronaut Scott Kelly on January 13, captures the intense, raw beauty of a sunrise with an unforgettable gradient of yellow to red. About eight hours later, he tweeted the second image. “Day 292. Colors of #sunset. #GoodNight from @space_station! #YearInSpace,” Kelly said of the orange, teal, and blue horizontal lines that fade to black.
This was probably not Kelly’s only chance to capture a spectacular sunset and sunrise on January 13. The International Space Station travels at about 17,100 miles per hour, and orbits Earth about every 90 minutes—enough for astronauts to witness 16 sunrises and 16 sunsets each day.
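Those figures are easy to sanity-check: the length of one orbit divided by the station's speed gives the orbital period, and a day divided by the period gives the number of sunrises. A back-of-the-envelope sketch; the Earth radius and the roughly 250-mile ISS altitude are round-number assumptions.

```python
import math

def orbits_per_day(speed_mph=17_100.0, altitude_miles=250.0):
    """Orbits completed in 24 hours: orbit circumference divided by speed
    gives the period in hours; 24 divided by the period gives the count."""
    earth_radius_miles = 3_959.0  # mean Earth radius
    circumference = 2.0 * math.pi * (earth_radius_miles + altitude_miles)
    period_hours = circumference / speed_mph
    return 24.0 / period_hours
```

At these values the period works out to a bit over 90 minutes and roughly 15 to 16 orbits a day, consistent with the 16 sunrises and 16 sunsets astronauts count.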
“The sun truly ‘comes up like thunder,’ and it sets just as fast,” said Joseph Allen, an astronaut who logged more than 300 hours in space on the Space Shuttle in the 1980s. “Each sunrise and sunset lasts only a few seconds. But in that time you see at least eight different bands of color come and go, from a brilliant red to the brightest and deepest blue.”
Curious to see more sunsets and sunrises from space? In the image below, see how a sunset reveals different layers of the atmosphere. Learn more about the image here. See several more of Kelly’s sunrise and sunset photographs featured by The Atlantic here. And if you still want more space sunrises and sunsets, check out our archives.