Several readers have asked us to post satellite imagery related to the earthquake that struck Nepal on April 25, 2015. While we regularly post imagery of natural hazards, the weather and the satellites haven’t cooperated in this case.
Some people assume NASA’s satellite fleet can collect images of virtually any part of the world in near-real time, but the reality is more complicated. The orbital track of the satellites and the specific capabilities of the sensors on board determine whether we have imagery to share. In the case of Nepal, things haven’t lined up in our favor.
NASA did acquire imagery of Nepal soon after the earthquake. The Aqua and Terra satellites capture images of Nepal every day with their identical Moderate Resolution Imaging Spectroradiometer (MODIS) sensors. Note, however, the words “moderate resolution” in the name. Each MODIS pixel corresponds to 250 meters on the ground — not 1 meter or less like you will find if you zoom all the way in on Google Maps. MODIS does a fantastic job of showing a broad area, but if you compare an April 22 MODIS view of Nepal with an April 27 view, you’ll see the sensor doesn’t have enough spatial resolution to detect changes caused by the earthquake. What’s more, it has been rather cloudy since the earthquake anyway.
Other sensors, like the Advanced Land Imager (ALI) on Earth Observing-1, the Operational Land Imager (OLI) on Landsat 8, and the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) on Terra, have much higher spatial resolution (10, 15, and 15 meters per pixel, respectively — enough to make out individual buildings). But each of those satellites passes over Nepal much less frequently. OLI, for instance, captured imagery of Nepal on April 23, but it isn’t due for another pass until May 9. ALI did get an image of Mount Everest on April 28, but as shown in the images above, there’s no noticeable sign of the earthquake and avalanche because of a fresh coating of snow and some cloud cover. ASTER was clouded out as well.
It’s also possible for the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi NPP satellite to detect the widespread blackouts that have occurred since the earthquake, but, again, the weather has not cooperated. As you can see in the images (below) tweeted by NASA researcher Miguel Román, clouds blocked the satellite’s view on April 25, 26, and 27.
Why doesn’t NASA have sensors with extremely high spatial resolution (less than one meter per pixel) like some commercial satellite companies do? (Some of those satellites have glimpsed damage to individual structures and shown groups of people congregating in streets.) That’s a complicated subject that would need a much longer blog post to explore properly, but the short answer is that NASA’s emphasis is on the broad view — using medium- and low-resolution imagers to understand macro-scale processes on Earth.
NASA sensors are sometimes useful for disaster response and often provide a unique and memorable view of an event like a landslide or wildfire. Yet the strength of satellites like Terra, Aqua, Aura, Landsat, CALIPSO, CloudSat, GPM, OCO-2, Aquarius, and GRACE is that they drive cutting-edge science by providing a global perspective. Want a global map of the world’s fires? Or a global view of sea surface temperatures? A map of groundwater? A record of how Arctic ice has changed over decades? A view through a smoke plume as it drifts from Asia to North America? A three-dimensional perspective on the world’s forests? That’s where the NASA satellite fleet shines. For high-resolution imagery of specific events…well, there are plenty of other organizations that specialize in that.