Staying Afloat in a Sea of Data

February 2nd, 2018 by Jake Bleacher

One aspect of our work is studying how different types of information can be combined to help scientists understand a site during and after an EVA. We brought an array of instruments and cameras, which I’ll describe below. We also brought a collaborator from Canada, Ben Feist, to explore ways to combine data to make the scenery and the work easy to visualize. He has done this before with Apollo 17 data, providing an immersive, you-are-here feeling.

Our collaborator Ben Feist. Courtesy of Ben Feist.

So what did we bring? To give us the big picture, we brought two unmanned aerial vehicles, one like a robotic plane (see image below), and the other a quad-copter. Both gave us excellent 2D horizontal views, and the data can even be used to make 3D models of the terrain. Butch Wilmore and Liz Rampe also wore cameras during their simulated EVAs.

The robotic plane is ready for launch at Kilbourne Hole. Courtesy of Ben Feist.

The unmanned aerial vehicle is great for horizontal views, but it’s not as helpful for viewing vertical surfaces. For a task like mapping cliffs at Kilbourne, or lava pits at Aden Crater, we need to combine the airborne instruments with surface cameras that also provide 3D views. The instrument we brought for this purpose was a lidar, which is similar in concept to radar except that it uses laser pulses to measure distances. Mounted on a tripod, the instrument scans across a panorama and then swivels all the way back to the starting point to take a series of pictures covering the same area. After the operator sets up the sweep, everyone has to make sure they stay out of the field of view while the lidar scans, a maneuver the team calls the “lidar dance.”
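The ranging principle described above comes down to timing. A minimal sketch (illustrative only, not the team's actual processing software) of how a lidar converts a laser pulse's round-trip time into a distance:

```python
# Illustrative sketch: a lidar measures distance by timing a laser
# pulse's round trip from the instrument to a surface and back.
C = 299_792_458.0  # speed of light, m/s

def pulse_distance(round_trip_s: float) -> float:
    """One-way distance to the target: the pulse travels out and back,
    so the distance is half the round-trip path length."""
    return C * round_trip_s / 2.0

# A return detected 1 microsecond after the pulse fires corresponds
# to a surface roughly 150 m away.
print(pulse_distance(1e-6))
```

Sweeping millions of such measurements across a panorama is what builds up the 3D point cloud of a cliff face or pit wall.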

The lidar on its tripod. NASA/GSFC

Team members Patrick Whelley (left) and Jacob Richardson (right) set up the tripod-mounted lidar. NASA/GSFC

The team also set up a hyperspectral camera, which shows us what the landscape looks like in the thermal infrared region of the spectrum, where heat is sensed. This camera can identify rocks and soils based on the amount of energy emitted at various infrared wavelengths. That can be helpful for prioritizing which areas to target first, for documenting a site, and for providing geologic context for the samples taken.
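Why the thermal infrared? A quick back-of-the-envelope check (a sketch, not part of the camera's software) using Wien's displacement law shows that surfaces near ambient temperature emit most strongly at thermal-infrared wavelengths:

```python
# Illustrative sketch: Wien's displacement law gives the wavelength at
# which a surface at temperature T emits most strongly, showing why
# ambient-temperature rocks radiate in the thermal infrared.
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

def peak_wavelength_um(temp_k: float) -> float:
    """Peak blackbody emission wavelength, in micrometers."""
    return WIEN_B / temp_k * 1e6

# A rock surface near 300 K peaks around 9.7 micrometers, squarely in
# the thermal infrared band the camera observes.
print(peak_wavelength_um(300.0))
```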

Team member Deanne Rogers from Stony Brook University explains the use of the hyperspectral camera. NASA/GSFC

To investigate the chemistry of the rocks, we brought a handheld X-ray fluorescence instrument, or XRF, which bombards a sample with high-energy radiation to measure how much the material fluoresces. That gives us information about the composition of the sample. We also brought a handheld laser-induced breakdown spectrometer, or LIBS, which vaporizes a small sample of rock to give us information about its composition.
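The XRF identification step can be pictured as matching measured fluorescence peak energies against the known characteristic X-ray lines of each element. The helper below is hypothetical (it is not the instrument's software), but the K-alpha energies in the lookup table are standard reference values in keV:

```python
# Illustrative sketch: XRF identifies elements because each element
# fluoresces at characteristic X-ray energies. The K-alpha line
# energies below are standard reference values (keV).
K_ALPHA_KEV = {
    "Si": 1.740,
    "Ca": 3.692,
    "Ti": 4.511,
    "Fe": 6.404,
}

def identify_peak(energy_kev: float, tolerance: float = 0.05):
    """Return the element whose K-alpha line lies within `tolerance`
    of the measured peak energy, or None if nothing matches."""
    for element, line in K_ALPHA_KEV.items():
        if abs(energy_kev - line) <= tolerance:
            return element
    return None

# A fluorescence peak near 6.40 keV points to iron in the sample.
print(identify_peak(6.40))
```

A real spectrum has many overlapping peaks and requires careful calibration, but this is the core idea: peak energy tells you which element, and peak intensity scales with how much of it is present.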

Team member Kelsey Young explains the use of the X-ray fluorescence spectrometer. NASA/GSFC

Team member Amy McAdam explains the use of the laser-induced breakdown spectrometer. NASA/GSFC



Notes from the Field