Light can come in many frequencies, only a small fraction of which can be seen by humans. Between the invisible low-frequency radio waves used by cell phones and the high frequencies associated with infrared light lies a fairly wide swath of the electromagnetic spectrum occupied by what are called terahertz, or sometimes submillimeter, waves. Exploitation of these waves could lead to many new applications in fields ranging from medical imaging to astronomy, but terahertz waves have proven tricky to produce and study in the laboratory. Now, Caltech chemists have created a device that generates and detects terahertz waves over a wide spectral range with extreme precision, allowing it to be used as an unparalleled tool for measuring terahertz waves.
The new device is an example of what is known as a frequency comb, which uses ultrafast pulsed lasers, or oscillators, to produce thousands of unique frequencies of radiation distributed evenly across a spectrum like the teeth of a comb. Scientists can then use these combs like rulers, lining up the teeth like tick marks to measure light frequencies very precisely. The first frequency combs, developed in the 1990s, earned their creators (John Hall of JILA and Theodor Hänsch of the Max Planck Institute of Quantum Optics and Ludwig Maximilians University Munich) the 2005 Nobel Prize in Physics. These combs, which originated in the visible part of the spectrum, have revolutionized how scientists measure light, leading, for example, to the development of today’s most accurate timekeepers, known as optical atomic clocks.
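The evenly spaced "teeth" of any frequency comb follow a simple rule: each one sits at a fixed offset plus an integer multiple of the laser's repetition rate. A minimal sketch of that structure, with purely illustrative numbers rather than the parameters of the Caltech instrument:

```python
# Frequency-comb teeth: f_n = f_offset + n * f_rep
# The repetition rate and offset below are illustrative values only.
f_rep = 100e6      # repetition rate of the pulsed laser, Hz
f_offset = 20e6    # carrier-envelope offset frequency, Hz

# Generate 10,000 comb teeth
teeth = [f_offset + n * f_rep for n in range(10_000)]

# Adjacent teeth are separated by exactly f_rep, like ruler tick marks
assert all(teeth[i + 1] - teeth[i] == f_rep for i in range(len(teeth) - 1))
```

Because every tooth's frequency is pinned to just two numbers (offset and repetition rate), knowing those two to high precision fixes the position of every tick mark on the "ruler" at once.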
The team at Caltech combined commercially available lasers and optics with custom-built electronics to extend this technology to the terahertz, creating a terahertz frequency comb with an unprecedented combination of spectral coverage and precision. Its thousands of “teeth” are evenly spaced across the majority of the terahertz region of the spectrum (0.15-2.4 THz), giving scientists a way to simultaneously measure absorption in a sample at all of those frequencies.
The work is described in a paper that appears in the online version of the journal Physical Review Letters and will be published in the April 24 issue. The lead author is graduate student and National Science Foundation fellow Ian Finneran, who works in the lab of Geoffrey A. Blake, professor of cosmochemistry and planetary sciences and professor of chemistry at Caltech.
Blake explains the utility of the new device, contrasting it with a common radio tuner. “With radio waves, most tuners let you zero in on and listen to just one station, or frequency, at a time,” he says. “Here, in our terahertz approach, we can separate and process more than 10,000 frequencies all at once. In the near future, we hope to bump that number up to more than 100,000.”
That is important because the terahertz region of the spectrum is chock-full of information. Everything in the universe that is warmer than about 10 kelvins (-263 degrees Celsius) gives off terahertz radiation. Even at these very low temperatures, molecules can rotate in space, yielding unique fingerprints in the terahertz. Astronomers using telescopes such as the Caltech Submillimeter Observatory, the Atacama Large Millimeter Array, and the Herschel Space Observatory are searching stellar nurseries and planet-forming disks at terahertz frequencies, looking for such chemical fingerprints to try to determine the kinds of molecules that are present and thus available to planetary systems. But in just a single chunk of the sky, it would not be unusual to find signatures of 25 or more different molecules.
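The 10-kelvin figure follows from blackbody physics: Wien's displacement law puts the peak of a body's thermal emission at wavelength b/T, where b is Wien's constant. A quick back-of-the-envelope check shows that a 10 K blackbody peaks right in the terahertz band:

```python
# Wien's displacement law: the blackbody emission peak sits at
# lambda_max = b / T, where b is Wien's displacement constant.
b = 2.8978e-3   # Wien's displacement constant, m*K
c = 2.9979e8    # speed of light, m/s

T = 10.0                  # temperature, kelvins
lambda_max = b / T        # peak wavelength, meters (~290 micrometers)
f_peak = c / lambda_max   # corresponding peak frequency, Hz

print(f"Peak emission at {lambda_max * 1e6:.0f} um = {f_peak / 1e12:.2f} THz")
# A 10 K blackbody peaks near 290 um, i.e. about 1 THz, squarely in the
# terahertz/submillimeter band.
```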
To be able to definitively identify specific molecules within such a tangle of terahertz signals, scientists first need to determine exact measurements of the chemical fingerprints associated with various molecules. This requires a precise source of terahertz waves, in addition to a sensitive detector, and the terahertz frequency comb is ideal for making such measurements in the lab.
“When we look up into space with terahertz light, we basically see this forest of lines related to the tumbling motions of various molecules,” says Finneran. “Unraveling and understanding these lines is difficult, as you must trek across that forest one point and one molecule at a time in the lab. It can take weeks, and you would have to use many different instruments. What we’ve developed, this terahertz comb, is a way to analyze the entire forest all at once.”
After the device generates its tens of thousands of evenly spaced frequencies, the waves travel through a sample—in the paper, the researchers provide the example of water vapor. The instrument then measures what light passes through the sample and what gets absorbed by molecules at each tooth along the comb. If a detected tooth gets shorter, the sample absorbed that particular terahertz wave; if it comes through at the baseline height, the sample did not absorb at that frequency.
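The detection logic just described, comparing each tooth's measured height against its baseline, amounts to a simple per-frequency ratio test. A toy sketch with made-up numbers (not the paper's actual data or processing pipeline):

```python
# Toy comb spectrum: baseline tooth heights vs. heights measured through
# a sample. All frequencies and intensities are illustrative.
baseline = {0.55e12: 1.00, 0.56e12: 1.00, 0.57e12: 1.00}   # Hz -> intensity
measured = {0.55e12: 0.99, 0.56e12: 0.40, 0.57e12: 1.00}   # one tooth shortened

def absorption_lines(baseline, measured, threshold=0.1):
    """Return frequencies whose measured tooth dropped by more than
    `threshold` (fractionally) below baseline, i.e. where the sample absorbed."""
    return [f for f in baseline
            if (baseline[f] - measured[f]) / baseline[f] > threshold]

print(absorption_lines(baseline, measured))  # flags the tooth at 0.56 THz
```

Done across tens of thousands of teeth at once, this yields an absorption spectrum of the whole sample in a single measurement.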
“Since we know exactly where each of the tick marks on our ruler is to about nine digits, we can use this as a diagnostic tool to get these frequencies really, really precisely,” says Finneran. “When you look up in space, you want to make sure that you have such very exact measurements from the lab.”
In addition to the astrochemical application of identifying molecules in space, the terahertz comb will also be useful for studying fundamental interactions between molecules. “The terahertz is unique in that it is really the only direct way to look not only at vibrations within individual large molecules that are important to life, but also at vibrations between different molecules that govern the behavior of liquids such as water,” says Blake.
Additional coauthors on the paper, “Decade-Spanning High-Precision Terahertz Frequency Comb,” include current Caltech graduate students Jacob Good, P. Brandon Carroll, and Marco Allodi, as well as recent graduate Daniel Holland (PhD ’14). The work was supported by funding from the National Science Foundation.
At Caltech’s Ronald and Maxine Linde Center for Global Environmental Science, researchers from diverse disciplines work together to investigate Earth’s climate and its atmosphere, oceans, and biosphere; their evolution; and how they may change in the future.
In early February, the center hosted a three-day workshop focused on the Southern Ocean around Antarctica. Scientists from around the world working at the intersection of fluid dynamics and biochemistry gathered to summarize our current knowledge of the physical, chemical, and biological processes that are critical to the Southern Ocean’s circulation and marine ecosystems. The researchers set out to identify areas where collaboration across disciplines is needed to push that understanding forward. Here are a few of the topics they covered.
The Use of Autonomous Underwater Vehicles for Observation
Credit: Sunke Schmidtko
The Southern Ocean is one of the most inhospitable places on Earth. Despite the area’s importance to the global climate, measurements and data are hard to come by because it is difficult to deploy research vessels in the region, especially in winter. Little, if any, data have been collected in some areas, especially in the deep ocean and underneath ice shelves.
But many new tools now exist to improve data collection and measurement in these remote regions. Autonomous gliders (shown above) have gathered information on currents, water density, and temperature at many depths, helping researchers such as workshop participants Nicole Couto (Rutgers University) and Mike Meredith (British Antarctic Survey), as well as Caltech’s Andrew Thompson, assistant professor of environmental science and engineering, understand how warm waters are causing ice sheets to melt. Meanwhile, an extensive system of autonomous floats monitors temperature, salinity, dissolved gases, and currents in Earth’s oceans; moored instruments track what is happening beneath ice shelves; and even Antarctic seals outfitted with sensors provide scientists access to, and information about, some of the ocean’s coldest and most inaccessible waters.
Iron Limitation on Phytoplankton Growth
Credit: NASA/Suomi NPP/Norman Kuring
Phytoplankton, microscopic algae that perform photosynthesis, form the base of the Southern Ocean food web. These organisms require both nutrients and sunlight to survive. The Southern Ocean is a region where nutrients and sunlight (at least in summer) are plentiful, yet many parts of it have extremely low phytoplankton concentrations. This is because not all nutrients are treated equally. Take iron, for example. Although phytoplankton need iron only in small amounts, it is scarce throughout most of the Southern Ocean. Iron enters ocean waters by way of dust falling out of the atmosphere, from melting icebergs or glaciers, and from the ocean floor. How these iron sources will respond to changing atmospheric and oceanic conditions, and how Southern Ocean ecosystems will adapt, are important research questions that meeting participants Phil Boyd (University of Tasmania) and Nicolas Cassar (Duke University) are working to answer.
Phytoplankton distributions are largely observed by measuring ocean color from space. This image shows data from NASA’s MODIS (MODerate resolution Imaging Spectroradiometer) satellite, which measures light coming off the ocean; NASA scientists use this information to determine the concentration of phytoplankton in the water. Here, yellow and orange colors indicate the presence of more phytoplankton.
The Importance of High Spatial Resolution in Ocean Models
Credit: Jeff Schmalz/NASA
The ocean is similar to the atmosphere in that much of the variability is contained in “weather systems,” or high- and low-pressure areas. These weather systems create swirling currents, called eddies, that are the ocean equivalent of atmospheric storms. But while storms in the atmosphere span hundreds of kilometers, eddies in the ocean span only a few tens of kilometers. When numerical models, such as those run by meeting participant Andy Hogg (Australian National University), capture these smaller scales, the simulations explode with previously unseen dynamics and produce a circulation that is more energetic and vigorous than that seen in models that simulate only larger scales.
This image of Chatham Island, off the coast of New Zealand, was taken by MODIS. The blue wispy pattern (upper right) is a phytoplankton bloom that is being stretched and stirred by ocean eddies. Images like this one verify that high-resolution numerical models accurately reproduce oceanic motions and provide insight into how these small-scale currents influence Southern Ocean ecosystems.
Credit: Courtesy of Whit Anderson/The Geophysical Fluid Dynamics Lab in Princeton, NJ
Increasing carbon dioxide concentrations in the atmosphere warm the planet, with roughly 90 percent of the extra energy going into the oceans. The ocean warming that results is not uniform around the globe. Numerical models from the group of meeting participant John Marshall (MIT) suggest that the warming of the Southern Ocean will occur later than that of other oceans. The reason? The Southern Ocean provides a gateway where cold, dense waters, stored in the deep ocean, are brought up to the surface by the ocean circulation and are exposed to the atmosphere. These cold waters have the potential to store a large amount of heat. Understanding when this reservoir will be exhausted is critical to predicting future Southern Ocean temperature changes.
In this sea-surface temperature map created by a NOAA Geophysical Fluid Dynamics Laboratory model, Southern Ocean waters (green and blue) represent regions where cold water rises up to the surface, warms, and moves northward.
The Distribution of Sea Ice
Credit: Hannah Joy-Warren, Stanford graduate student, taken during the Phantastic II cruise to the west Antarctic Peninsula (October/November 2014).
The distribution of sea ice in the Southern Ocean is important for many reasons. For instance, sea ice can act as a cap on the ocean, limiting atmospheric interactions with the ocean surface that may trap carbon in the deep ocean. Recently, Caltech researchers including Thompson and Jess Adkins, professor of geochemistry and global environmental science, discovered a link between the distribution of sea ice in the Southern Ocean and differences in the ocean circulation in our present climate and at the Last Glacial Maximum.
As sea ice retreats, additional melting can be a source of iron to the ocean, influencing phytoplankton growth. The capacity for plankton and other organisms to survive the Antarctic winter is only just beginning to be understood, as explained in a recent review article on sea ice ecosystems by meeting participant Kevin Arrigo (Stanford University). Future under-ice observations are needed to improve our ability to estimate ecosystem changes in polar regions.
Watching plants perform photosynthesis from space sounds like a futuristic proposal, but a new application of data from NASA’s Orbiting Carbon Observatory-2 (OCO-2) satellite may enable scientists to do just that. The new technique, which allows researchers to analyze plant productivity from far above Earth, will provide a clearer picture of the global carbon cycle and may one day help researchers determine the best regional farming practices and even spot early signs of drought.
When plants are alive and healthy, they engage in photosynthesis, absorbing sunlight and carbon dioxide to produce food for the plant and generating oxygen as a by-product. But photosynthesis does more than keep plants alive. On a global scale, the process takes up some of the man-made emissions of atmospheric carbon dioxide—a greenhouse gas that traps the sun’s heat near Earth’s surface—meaning that plants also have an important role in mitigating climate change.
To perform photosynthesis, the chlorophyll in leaves absorbs sunlight—most of which is used to create food for the plants or is lost as heat. However, a small fraction of that absorbed light is reemitted as near-infrared light. We cannot see in the near-infrared portion of the spectrum with the naked eye, but if we could, this reemitted light would make the plants appear to glow—a property called solar induced fluorescence (SIF). Because this reemitted light is only produced when the chlorophyll in plants is also absorbing sunlight for photosynthesis, SIF can be used as a way to determine a plant’s photosynthetic activity and productivity.
“The intensity of the SIF appears to be very correlated with the total productivity of the plant,” says JPL scientist Christian Frankenberg, who is lead for the SIF product and will join the Caltech faculty in September as an associate professor of environmental science and engineering in the Division of Geological and Planetary Sciences.
Usually, when researchers try to estimate photosynthetic activity from satellites, they utilize a measure called the greenness index, which uses reflections in the near-infrared spectrum of light to determine the amount of chlorophyll in the plant. However, this is not a direct measurement of plant productivity; a plant that contains chlorophyll is not necessarily undergoing photosynthesis. “For example,” Frankenberg says, “evergreen trees are green in the winter even when they are dormant.”
He adds, “When a plant starts to undergo stress situations, like in California during a summer day when it’s getting very hot and dry, the plants still have chlorophyll”—chlorophyll that would still appear to be active in the greenness index—”but they usually close the tiny pores in their leaves to reduce water loss, and that time of stress is also when SIF is reduced. So photosynthesis is being very strongly reduced at the same time that the fluorescence signal is also getting weaker, albeit at a smaller rate.”
The Caltech and JPL team, as well as colleagues from NASA Goddard, discovered that they could measure SIF from orbit using spectrometers—standard instruments that can detect light intensity—that are already on board satellites like Japan’s Greenhouse Gases Observing Satellite (GOSAT) and NASA’s OCO-2.
In 2014, using this new technique with data from GOSAT and the European Global Ozone Monitoring Experiment–2 satellite, the researchers scoured the globe for the most productive plants and determined that the U.S. “Corn Belt”—the farming region stretching from Ohio to Nebraska—is the most photosynthetically active place on the planet. Although it stands to reason that a cornfield during growing season would be actively undergoing photosynthesis, the high-resolution measurements from a satellite enabled global comparison to other plant-heavy regions—such as tropical rainforests.
“Before, when people used the greenness index to represent active photosynthesis, they had trouble determining the productivity of very dense plant areas, such as forests or cornfields. With enough green plant material in the field of view, these greenness indexes can saturate; they reach a maximum value they can’t exceed,” Frankenberg says. Because of the sensitivity of the SIF measurements, researchers can now compare the true productivity of fields from different regions without this saturation—information that could potentially be used to compare the efficiency of farming practices around the world.
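The saturation Frankenberg describes is easy to see in the arithmetic of a typical greenness index. NDVI, a standard such index (used here purely as an illustration; the article does not name a specific index), is a bounded ratio of near-infrared to red reflectance, so beyond a certain canopy density it barely responds to additional chlorophyll:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: a common 'greenness'
    index built from near-infrared (nir) and red reflectance."""
    return (nir - red) / (nir + red)

# Illustrative reflectance values for increasingly dense vegetation:
# denser canopies reflect more NIR and absorb more red light.
sparse = ndvi(nir=0.30, red=0.10)       # ~0.50
dense = ndvi(nir=0.50, red=0.04)        # ~0.85
very_dense = ndvi(nir=0.55, red=0.03)   # ~0.90

# The index is bounded above by 1: going from 'dense' to 'very dense'
# barely moves it, which is the saturation problem SIF avoids.
print(sparse, dense, very_dense)
```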
Now that OCO-2 is online and producing data, Frankenberg says that it is capable of achieving higher resolution than the preliminary experiments with GOSAT. Therefore, OCO-2 will be able to provide an even clearer picture of plant productivity worldwide. However, to get more specific information about how plants influence the global carbon cycle, an evenly distributed ground-based network of spectrometers will be needed. Such a network—located down among the plants rather than miles above—will provide more information about regional uptake of carbon dioxide via photosynthesis and the mechanistic link between SIF and actual carbon exchange.
One existing network, called FLUXNET, uses ground-based towers at more than 600 locations worldwide to measure the exchange of carbon dioxide, or carbon flux, between the land and the atmosphere. However, the towers only measure the exchange of carbon dioxide and are unable to directly observe the activities of the biosphere that drive this exchange.
The new ground-based measurements will ideally take place at existing FLUXNET sites, but they will be performed with a small set of high-resolution spectrometers—similar to the kind that OCO-2 uses—to allow the researchers to use the same measurement principles they developed for space. The revamped ground network was initially proposed in a 2012 workshop at the Keck Institute for Space Studies and is expected to go online sometime in the next two years.
In the future, a clear picture of global plant productivity could influence a range of decisions relevant to farmers, commodity traders, and policymakers. “Right now, the SIF data we can gather from space is too coarse of a picture to be really helpful for these conversations, but, in principle, with the satellite and ground-based measurements you could track the fluorescence in fields at different times of day,” he says. This hourly tracking would not only allow researchers to detect the productivity of the plants, but it could also spot the first signs of plant stress—a factor that impacts crop prices and food security around the world.
“The measurements of SIF from OCO-2 greatly extend the science of this mission,” says Paul Wennberg, R. Stanton Avery Professor of Atmospheric Chemistry and Environmental Science and Engineering, director of the Ronald and Maxine Linde Center for Global Environmental Science, and a member of the OCO-2 science team. “OCO-2 was designed to map carbon dioxide, and scientists plan to use these measurements to determine the underlying sources and sinks of this important gas. The new SIF measurements will allow us to diagnose the efficiency of the plants—a key component of the sinks of carbon dioxide.”
By using OCO-2 to diagnose plant activity around the globe, this new research could also contribute to understanding the variability in crop primary productivity and also, eventually, the development of technologies that can improve crop efficiency—a goal that could greatly benefit humankind, Frankenberg says.
This project is funded by the Keck Institute for Space Studies and JPL. Wennberg is also an executive officer for the Environmental Science and Engineering (ESE) program. ESE is a joint program of the Division of Engineering and Applied Science, the Division of Chemistry and Chemical Engineering, and the Division of Geological and Planetary Sciences.
Using a combination of satellite radar imaging data, GPS data measured in and near Nepal, and seismic observations from instruments around the world, Caltech and JPL scientists have constructed a preliminary picture of what happened below Earth’s surface during the recent 7.8-magnitude Gorkha earthquake in Nepal.
The team’s observations and models of the April 25, 2015, earthquake, produced through the Advanced Rapid Imaging and Analysis (ARIA) project—a collaboration between Caltech and JPL—include preliminary estimates of the slippage of the fault beneath Earth’s surface that resulted in the deaths of thousands of people. In addition, the ARIA scientists have provided first responders and key officials in Nepal with information and maps that show block-by-block building devastation as well as measurements of ground movement at individual locations around the country.
“As the number of orbiting imaging radar and optical satellites that form the international constellation increases, the expected amount of time it takes to acquire an image of an impacted area will decrease, allowing for products such as those we have made for Nepal to become more commonly and rapidly available,” says Mark Simons, professor of geophysics at Caltech and a member of the ARIA team. “I fully expect that within five years, this kind of information will be available within hours of a big disaster, ultimately resulting in an ability to save more lives after a disaster and to make assessment and response more efficient in both developed and developing nations.”
Over the last five years, Simons and his colleagues in Caltech’s Seismological Laboratory and at JPL have been developing the approaches, infrastructure, and technology to rapidly and automatically use satellite-based observations to measure the movement of Earth’s surface associated with earthquakes, volcanoes, landslides, and other geophysical processes.
“ARIA is ultimately aimed at providing tools and data—for use by groups ranging from first responders, to government agencies, and individual scientists—that can help improve situational awareness, response, and recovery after many natural disasters,” Simons says. “The same products also provide key observational constraints on our physical understanding of the underlying processes such as the basic physics controlling seismogenic behavior of major faults.”
ARIA is funded through a combination of support from JPL, Caltech, and NASA.