The boreal forest is Earth’s northernmost forest. It circles the Earth at high latitudes, covering large parts of Russia, Canada, Scandinavia, and Alaska. These forests are among the largest intact forests on Earth, and they store a large amount of carbon—as much as (if not more than) is stored in tropical forests. The trees use photosynthesis to grow, and in the process they take in carbon dioxide from the atmosphere. Carbon dioxide is an important greenhouse gas and a driver of global climate change, which is why it is vital to understand the rates at which carbon is transferred and stored between the atmosphere and the boreal forest.
Due to their vast area and remoteness, boreal forests are difficult to monitor from the ground. Data from satellites provide a means of observing these forests’ condition and detecting change across wide areas. In our project, “Clarifying Linkages Between Canopy Solar Induced Fluorescence (SIF) and Physiological Function for High Latitude Vegetation,” our team from the University of Maryland, Baltimore County, NASA’s Goddard Space Flight Center, and the University of Texas at El Paso is working to develop advanced approaches to using satellite data to describe boreal forest productivity and detect stress responses. Our project is part of the NASA Terrestrial Ecology program’s Arctic-Boreal Vulnerability Experiment (ABoVE), a large-scale field study in Alaska and western Canada whose overall goals are to use NASA technology to gain a better understanding of high-latitude ecosystems, their responses to environmental change, and the effects of those changes.
A large proportion of the trees in the boreal forest are conifers, such as spruce trees (images above). These trees keep their green needles all year long. This makes it hard to determine when they start photosynthesizing in the spring, compared to deciduous trees, like oaks, where we can clearly see the growth of new green leaves in the spring.
The timing of the start of the growing season is key to determining the overall productivity of the forest and can be a useful predictor of possible stress events later in the summer. So, one of the goals of our project is to find ways to use light to detect when these evergreens “turn on” photosynthesis in the spring and actively start taking up carbon from the atmosphere.
This brought team members to Fairbanks, Alaska, right in the heart of the boreal forest. We arrived with our instruments in April 2023 to observe the very start of the boreal growing season. Our instruments use different methods to detect the onset and rate of photosynthetic activity in plants. One method we are using to identify photosynthetic activity in the evergreens is based on light that is actually emitted from the trees. Plants absorb light to power photosynthesis, but in the process of photosynthesis some of that light energy is radiated out from the plant; this is called chlorophyll fluorescence. This fluoresced light is very dim, which is why we don’t see plants glowing, but we can use sensitive instruments to measure fluorescence at leaf and canopy scales, which can even be done with instruments on satellites from space.
A second method is to detect very subtle changes in the color of the needles that are related to changes in the pigments in the leaves. Pigments such as chlorophyll, which makes leaves green, and carotenoids, which give leaves their yellow color in the fall, control potential rates of photosynthesis and also protect the leaves under stress. These color changes are too subtle to see with the naked eye, but our instruments can measure and detect them.
Instruments that are already on the flux tower measure the transfer of heat, moisture, and carbon dioxide between the atmosphere and the forest (images below).
With the help of Jeb Timm, a NEON tower lead technician, we mounted our FLoX (Fluorescence Box) on the top of the tower. The FLoX looks down on the forest and measures the reflected light and solar induced chlorophyll fluorescence every few minutes continuously through the growing season (images below). FLoX measurements are similar to the data satellites provide, but with far more detail.
From the flux data we can determine photosynthesis rates and compare them with our fluorescence and reflectance measurements, allowing us to relate the remotely sensed optical measurements to forest productivity. The continuous measurements allow us to examine the effects of varying light levels, moisture, and temperature on the forest.
Near the flux tower we also installed our MONITORING-PAM (MoniPAM) instruments, whose probes actively illuminate individual spruce shoots with controlled pulses of light to measure fluorescence and photosynthetic processes at the leaf level (images below).
Besides setting up our instruments to catch the start of the growing season, we were hoping to be around when the spruce started to photosynthesize. This would allow us to test whether we could detect the onset of photosynthesis through changes in needle reflectance due to changing pigment pools and/or fluorescence measured using a special leaf clip. To get consistent measurements using the same number of needles, we had to pull off the individual tiny needles and line them up to make a solid mat to measure. And because photosynthesis and fluorescence are temperature-sensitive, we had to make our measurements at the temperature the needles experience, so we worked outside on the deck in the cold (images below). The deck looked out on a big white spruce full of busy red squirrels that chattered and scolded us while we made our measurements.
Unfortunately, the temperatures mostly stayed below freezing the entire time we were there, so we didn’t get a chance to measure needles as they became photosynthetically active.
While we were there, a NASA-funded study of snow called SnowEx was also underway. In this part of the SnowEx study, researchers were studying changes in snow characteristics during the thaw period. The SnowEx field team made measurements of the snow on the ground, and NASA flew the Airborne Visible-Infrared Imaging Spectrometer – Next Generation (AVIRIS-NG) imaging system on an airplane, collecting high-resolution canopy spectroscopy measurements. We plan to use the airborne imagery in our study to see if we can identify changes in tree reflectance (which is noise to the snow scientists) indicating the start of photosynthesis.
We will return in late July to collect measurements during the period of peak summer forest productivity.
Plankton comes from the Greek word planktos, meaning wanderer. It does not define a specific organism, but rather a specific lifestyle. Plankton consist of all organisms dispersed in water that are passively driven by water currents or subject to passive sinking. Some of these organisms can produce oxygen and sugars using sunlight and CO2, just as terrestrial plants do. We call them phytoplankton (Greek phyton, plant).
Phytoplankton are the wandering meadows of the ocean and an important base of the food web. Most phytoplankton are smaller than the width of a single human hair. They feed the hungry ocean, and the phytoplankton composition determines the diversity of organisms that develop at higher trophic levels, such as fish, birds, and mammals. Phytoplankton matter not only to organisms living in the ocean; they are crucial for all life on Earth. Consider for a moment the size of the oceans covering our blue planet: the oxygen in every second breath we take is produced by phytoplankton.
When it comes to phytoplankton, it is not just the quantity but also the quality that matters. You may wonder how researchers can sample something we cannot see, in an environment we cannot inhabit. How can we catch something that is passively driven by currents and changing all the time? How can we gain insight into the abundance of particles that are unevenly dispersed through the ocean? Since the invention of the microscope, scientists have been trying to answer these questions.
Here on R/V Falkor we are combining traditional methods of phytoplankton analysis – such as preserving water for later, onshore analysis under a microscope – with newly developed methods of onboard image analysis. Discrete samples are taken with Niskin bottles at specified depths. The sampling depths are chosen according to the physical, chemical, and biological characteristics of the water column, which are measured by the instruments mounted on the rosette and monitored from the science control room. Once the rosette is on deck, samples are drawn from the Niskin bottles and prepared either to be stored until they return to land or to be analyzed on board. The great advantage of onboard image analysis is that it provides an instantaneous snapshot of phytoplankton composition and abundance, which allows you to adapt your sampling strategy and use the time spent on the ship more productively.
The flow-through system onboard R/V Falkor allows us to continuously sample the surface water for physical, chemical, and biological properties, including phytoplankton composition with the Imaging FlowCytobot. This amazing instrument draws from the flow-through system every 20 minutes and takes images of the particles contained in 5 mL (100 drops) of seawater. Therefore, as we steam through the Pacific or stop to conduct other measurements, we continuously gather high-spatial-resolution information about phytoplankton abundance and composition. The advantage of taking images, versus looking at cells through a microscope, is that you can always go back to your sample if needed. That helps us obtain comparable results and minimizes errors in phytoplankton counting and taxonomic misidentification.
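For a rough sense of the coverage this gives us, here is a back-of-the-envelope sketch using only the figures quoted above (5 mL imaged every 20 minutes); the numbers are illustrative, not a specification of the instrument:

```python
# Back-of-the-envelope coverage of the flow-through imaging,
# using the figures quoted in the text: 5 mL imaged every 20 minutes.
SAMPLE_ML = 5        # millilitres of seawater imaged per sample
INTERVAL_MIN = 20    # minutes between samples

samples_per_day = 24 * 60 // INTERVAL_MIN   # 72 samples per day
ml_per_day = samples_per_day * SAMPLE_ML    # 360 mL imaged per day

print(samples_per_day, "samples and", ml_per_day, "mL of seawater imaged per day")
```

Small as 360 mL per day sounds, imaging it continuously along the ship's track is what turns point samples into a spatial transect of the phytoplankton community.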
Will modern methods of high-resolution imaging ever replace traditional microscopy? I would say no. While continuous sampling techniques give us amazing insight into the spatial distribution of phytoplankton and inform the best sampling strategies, classical microscopy gives us insight into detailed morphological characteristics that need to be seen from multiple angles to be truly understood. Phytoplankton taxonomy is under constant revision as new methodologies develop. With more knowledge and deeper insight we find answers to existing questions, but we also encounter more questions that need to be answered. Combining traditional and modern methods is the best strategy for understanding the secrets of these beautiful oceanic wonders.
Earth’s ocean is vast and deep, and much about it remains to be studied. To investigate and quantify biological and chemical processes, for instance, we need to determine the concentration and size of particles (living and non-living) floating in the water, the dissolved materials, and the diversity of organisms such as the microscopic, photosynthetic phytoplankton. Their study requires both direct measurements – deploying instruments at sea or analyzing water samples – and satellite remote sensing.
Particles and dissolved organic materials scatter and absorb the sunlight that enters the ocean, which alters the ocean’s color. For instance, the first site that we sampled contained low abundances of particles including phytoplankton and dissolved organic materials, which translated to clear blue waters. Higher abundances of phytoplankton result in greener seas because of their chlorophyll pigments.
Since 1978, NASA has applied satellite remote sensing to study phytoplankton, beginning with its experimental Coastal Zone Color Scanner. Several other sensors followed from 1997 to the present. By 2022, NASA expects to launch the next-generation ocean color satellite sensor for the Plankton, Aerosol, Cloud, ocean Ecosystem (PACE) mission, which is currently being developed. This PACE sensor will provide unprecedented detail on the spectrum and intensity of the light exiting the ocean’s surface, which will be used to infer a wealth of information about our oceans, including the concentration and size of particles and dissolved organic materials, the diversity of phytoplankton, and rates of phytoplankton growth within the ocean’s sunlit surface layer.
Successfully applying the capabilities of the PACE sensor requires developing relationships between ocean data (such as chlorophyll-a concentration) and how they affect the color and amount of light that will be measured by the satellite. One of our goals in participating in the Sea to Space Particle Investigation in the northeastern Pacific Ocean aboard Falkor is to collect biological, chemical, and optical measurements in order to build these relationships.
To do so, much of our work at sea involves developing and evaluating new methods and measurement capabilities to ensure that the data collected are of sufficient quality for use with satellite remote sensing. For example, to quantify phytoplankton growth rates, we are conducting experiments with phytoplankton and measuring the oxygen produced and carbon dioxide consumed over time.
Only satellite remote sensing can provide the comprehensive data sets across space and time needed to study the state of Earth’s vast ocean. The ocean moderates our weather, provides food, medicine, energy resources, recreation, and many other benefits. Improving our understanding of the ocean will help us better predict how it will change in the future.
I always knew that one day I wanted to study the ocean, even though I grew up just north of Pittsburgh and had never seen the ocean. After graduating high school, I attended the College of Charleston in South Carolina, where my plan from the start was to major in Marine Biology. I began my junior year in college with no idea what I wanted to do with this very broad degree. Then I took the required oceanography course – after that, oceanography and phytoplankton (microscopic aquatic plant life) were in my life permanently.
For an undergraduate project, I measured the response of several species of phytoplankton to different light intensities by measuring the concentration of their photosynthetic pigments, the compounds that collect light for photosynthesis. Pigments can also be used to identify specific groups of phytoplankton. During the second year of my Master’s program in marine biology, I participated in my first research cruise, on a Canadian Coast Guard vessel that sailed from Dutch Harbor to Barrow, Alaska. I got my first taste of filtering – collecting particles onto glass fiber pads that can be used for various analyses. Despite my initial bout of seasickness in a nearly flat sea state, I was hooked.
In 2007 I pursued a research opportunity at the Bermuda Institute of Ocean Sciences, where I went to sea once a month to measure the sulfur-based compounds made by phytoplankton that are thought to enhance cloud formation when outgassed to the atmosphere. For years, what I knew about phytoplankton was based on chemistry and physiology.
In late 2008, I left the beautiful island of Bermuda (crazy, right?) for Maryland to work at NASA Goddard Space Flight Center. Before this time, I knew nothing about satellites or ocean color remote sensing. While working at NASA I have learned that everything in the ocean – dissolved compounds, phytoplankton, and particles – absorbs and scatters sunlight. We can translate the color of the light reflecting out of the ocean into information about what types of phytoplankton are in the water column.
High temporal and spatial resolution observations across the global ocean are simply not feasible from ships alone, as we are limited by time and resources. Therefore, we make use of additional tools to fill the gap in global and regional oceanographic observations. Satellite ocean color observations provide global coverage, reaching times and places beyond the capabilities of research vessels, and therefore can fill the data gap where field measurements are limited.
Ground-truthing these satellite-derived measurements of phytoplankton types is necessary but challenging. We can use phytoplankton pigments to derive a certain amount of information, but the addition of microscopy is ideal, as then we can see which species are in the water. One of the newer technologies in the field is imaging flow cytometry, which combines the best aspects of microscopy, flow cytometry, and digital imaging.
Water is fed through the instrument, where a camera at a specific magnification is triggered to take a digital image of each particle or phytoplankter that passes through the field of view. Imagine how the high spatial resolution of these data will help us ground-truth the phytoplankton-type products we retrieve from satellite imaging. On the R/V Falkor, we have two forms of this technology, sampling not only the surface of the ocean but also at depth. Having never spent much time in front of a microscope myself, I am learning so much from the skilled scientists around me, who can look at an image and almost immediately identify the genus and/or species to which a phytoplankter belongs. I hope to gain this knowledge as I learn and use this instrumentation.
Melissa Omand, an interdisciplinary physical oceanographer at the University of Rhode Island’s Graduate School of Oceanography, was confronted with a conflict: it was time for a phone upgrade, but creating more technological trash did not feel right. Plus, the camera on her older phone was fantastic. Together with her first graduate student, Noah Walcutt, she worked on optimizing battery life and fabricating an underwater housing and lighting system for her “outdated” gadget. This remains the best part of her job: creating and testing new instruments, and repurposing existing ones.
Melissa and Noah are working with two novel instruments on this cruise. The first is a time-lapse camera built from her repurposed phone. The phone dangles at the base of a 150-meter wire, deployed as part of the Wirewalker assembly. For three or four days, the camera snaps pictures of the base of a sediment trap, which collects falling particles called marine snow. Until now, Melissa and her colleagues Colleen Durkin and Meg Estapa have been able to identify what kinds of particles fall into the traps (and when) by analyzing the material preserved in a special gel. They have also learned that particles fall in pulses rather than a steady flow. However, they are still not sure which types of marine snow sink with each pulse, or how these are connected to the phytoplankton community above. They hope the images taken by the camera will provide a new piece of the puzzle that is carbon storage in the ocean.
A second novel instrument Melissa has brought on board is a holographic camera. Unlike traditional photography, a holographic image is obtained when a laser beam hits an object and either bounces off it or passes through it, bending the light. More a computer than just a camera, the instrument combines diffraction data with mathematics to reconstruct the light’s journey after interacting with the object (in this case, a planktonic organism). Tracing the behaviour of the light provides an enormous amount of information about the object’s characteristics in three dimensions. The result is an image that lets experts focus on different planes, not just a single depth of field.
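The digital refocusing that makes this possible can be sketched numerically. Below is a minimal, illustrative implementation of angular-spectrum propagation, a standard method for refocusing a recorded hologram to an arbitrary plane; the function name and all parameter values here are assumptions chosen for illustration, not details of the actual instrument or its software:

```python
import numpy as np

def refocus(hologram, wavelength, pixel_size, z):
    """Numerically refocus a recorded hologram to a plane at distance z
    using the angular-spectrum method (all lengths in metres)."""
    ny, nx = hologram.shape
    # Spatial frequencies sampled by the detector
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    # Free-space transfer function; clip evanescent components
    arg = np.clip(1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2, 0.0, None)
    H = np.exp(2j * np.pi * z / wavelength * np.sqrt(arg))
    # Propagate: FFT, apply transfer function, inverse FFT
    field = np.fft.ifft2(np.fft.fft2(hologram) * H)
    return np.abs(field)  # intensity-like image at the refocused plane

# Example: refocus one synthetic 256x256 hologram to three planes,
# mimicking the "choose your focus after the fact" capability described above.
holo = np.random.rand(256, 256)
stack = [refocus(holo, wavelength=658e-9, pixel_size=3.45e-6, z=d)
         for d in (1e-3, 5e-3, 1e-2)]
```

Because a single recorded frame can be propagated to any plane after the fact, one exposure yields an entire focal stack, which is exactly what makes the sampled volume so much larger than in conventional imaging.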
This is interesting both because it vastly increases the volume sampled – the scientists can choose where to focus in a picture that has already been taken – and because it enables a very exciting application: Virtual Reality.
Cut a single hair a hundred times and the pieces will be a few microns across. That is the resolution the holographic camera can capture in a single photo: 16 photos per second, 100 particles per hologram on average, 40 thousand holograms per dip in the ocean. The numbers add up fast, so Melissa knows it is time once again to create something from scratch: the team’s own pipeline for data management, processing, and analysis. This is why she began working with Ben Knorlein, a computer scientist from the Center for Computation and Visualization at Brown University.
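As a back-of-the-envelope check on those figures (16 frames per second, roughly 100 particles per hologram, 40 thousand holograms per dip, all taken from the text above):

```python
# How fast the holographic data add up, using the figures quoted in the text.
FRAMES_PER_SECOND = 16
PARTICLES_PER_HOLOGRAM = 100     # on average
HOLOGRAMS_PER_DIP = 40_000

particles_per_dip = HOLOGRAMS_PER_DIP * PARTICLES_PER_HOLOGRAM   # 4,000,000
minutes_of_imaging = HOLOGRAMS_PER_DIP / FRAMES_PER_SECOND / 60  # about 42 minutes

print(f"{particles_per_dip:,} particle images in ~{minutes_of_imaging:.0f} minutes per dip")
```

Roughly four million particle images from under an hour of imaging per dip: enough to make automated data management, rather than manual inspection, the only workable option.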
Not only is Ben in charge of designing an efficient way to deal with all of the information yielded by the holographic camera, but he is also the mastermind behind the software that allows scientists to step into the holograms and interact with the particles in a Virtual Reality environment. This has been the first time Ben has ever been at sea. He assists researchers in the deployment and recovery of scientific instrumentation, and more importantly, he is gaining a deeper understanding of what they are looking for, becoming familiar with their thought process and expectations. All of this experience is vital for Ben to improve the software so scientists can have faster access to the information they need to extract from each holographic image.
No other ship could have given the team the opportunity to work efficiently with these images on board. Falkor’s High Performance Computer enables Ben to process tens of thousands of images in a single day, generating data immediately. Each day Ben sits at his workstation in the Dry Lab, fine-tuning parameters and settings to offer Melissa and Noah new options. Once back ashore, this cruise’s intense collaboration will have made the trio tighter than ever, and they will walk away from Falkor carrying invaluable new information, instruments, and software.