Friction Means Antarctic Glaciers More Sensitive to Climate Change Than We Thought

By photosearth / May 25, 2015

News Writer: Ker Than

Credit: Courtesy Karen Heywood

One of the biggest unknowns in understanding the effects of climate change today is the melting rate of glacial ice in Antarctica. Scientists agree rising atmospheric and ocean temperatures could destabilize these ice sheets, but there is uncertainty about how fast they will lose ice.

The West Antarctic Ice Sheet is of particular concern to scientists because it contains enough ice to raise global sea level by up to 16 feet, and its physical configuration makes it susceptible to melting by warm ocean water. Recent studies have suggested that the collapse of certain parts of the ice sheet is inevitable. But will that process take several decades or centuries?

Research by Caltech scientists now suggests that estimates of future rates of melt for the West Antarctic Ice Sheet—and, by extension, of future sea-level rise—have been too conservative. In a new study, published online on March 9 in the Journal of Glaciology, a team led by Victor Tsai, an assistant professor of geophysics, found that properly accounting for Coulomb friction—a type of friction generated by solid surfaces sliding against one another—in computer models significantly increases estimates of how sensitive the ice sheet is to temperature perturbations driven by climate change.

Unlike other ice sheets that are moored to land above the ocean, most of West Antarctica’s ice sheet is grounded on a sloping rock bed that lies below sea level. In the past decade or so, scientists have focused on the coastal part of the ice sheet where the land ice meets the ocean, called the “grounding line,” as vital for accurately determining the melting rate of ice in the southern continent.

“Our results show that the stability of the whole ice sheet and our ability to predict its future melting is extremely sensitive to what happens in a very small region right at the grounding line. It is crucial to accurately represent the physics here in numerical models,” says study coauthor Andrew Thompson, an assistant professor of environmental science and engineering at Caltech.

Part of the seafloor on which the West Antarctic Ice Sheet rests slopes upward toward the ocean in what scientists call a “reverse slope gradient.” The end of the ice sheet also floats on the ocean surface so that ocean currents can deliver warm water to its base and melt the ice from below. Scientists think this “basal melting” could cause the grounding line to retreat inland, where the ice sheet is thicker. Because ice thickness is a key factor in controlling ice discharge near the coast, scientists worry that the retreat of the grounding line could accelerate the rate of interior ice flow into the oceans. Grounding line recession also contributes to the thinning and melting away of the region’s ice shelves—thick, floating extensions of the ice sheet that help reduce the flow of ice into the sea.

According to Tsai, many earlier models of ice sheet dynamics tried to simplify calculations by assuming that ice loss is controlled solely by viscous stresses, that is, forces that apply to “sticky fluids” such as honey—or in this case, flowing ice. The conventional models thus accounted for the flow of ice around obstacles but ignored friction. “Accounting for frictional stresses at the ice sheet bottom in addition to the viscous stresses changes the physical picture dramatically,” Tsai says.

In their new study, Tsai’s team used computer simulations to show that even though Coulomb friction affects only a relatively small zone on an ice sheet, it can have a big impact on ice stream flow and overall ice sheet stability.

In most previous models, the ice sheet sits firmly on the bed and generates a downward stress that helps keep it attached to the seafloor. Furthermore, the models assumed that this stress remains constant up to the grounding line, where the ice sheet floats, at which point the stress disappears.

Tsai and his team argue that their model provides a more realistic representation—in which the stress on the bottom of the ice sheet gradually weakens as one approaches the coasts and grounding line, because the weight of the ice sheet is increasingly counteracted by water pressure at the glacier base. “Because a strong basal shear stress cannot occur in the Coulomb model, it completely changes how the forces balance at the grounding line,” Thompson says.
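The balance described above can be sketched in a few lines of code. This is an illustrative toy, not the study's model: the Coulomb basal shear stress is the friction coefficient times the effective pressure (ice overburden minus water pressure at the glacier base), and it vanishes as the ice approaches flotation near the grounding line. The friction coefficient and the thickness/depth values are hypothetical.

```python
# Toy sketch of Coulomb basal friction near a grounding line.
# tau_b = f * N, where N = ice overburden pressure - basal water pressure.
# All numerical values below are hypothetical, for illustration only.

RHO_ICE, RHO_WATER, G = 917.0, 1028.0, 9.81  # kg/m^3, kg/m^3, m/s^2
FRICTION_COEFF = 0.5  # hypothetical Coulomb friction coefficient

def coulomb_basal_stress(ice_thickness_m, water_depth_m):
    """Coulomb basal shear stress (Pa); goes to zero as the ice nears flotation."""
    overburden = RHO_ICE * G * ice_thickness_m
    water_pressure = RHO_WATER * G * water_depth_m
    effective_pressure = max(overburden - water_pressure, 0.0)
    return FRICTION_COEFF * effective_pressure

# Marching toward the grounding line: thinner ice over deeper water,
# so the stress holding the sheet to the bed steadily weakens.
for thickness, depth in [(1500, 400), (1000, 600), (800, 713.6)]:
    print(thickness, depth, round(coulomb_basal_stress(thickness, depth)))
```

The key qualitative point matches the quote: because the Coulomb stress cannot stay large right up to the grounding line, the force balance there differs sharply from the constant-stress assumption of older models.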

Tsai says the idea of investigating the effects of Coulomb friction on ice sheet dynamics came to him after rereading a classic study on the topic by American metallurgist and glaciologist Johannes Weertman from Northwestern University. “I wondered how might the behavior of the ice sheet differ if one factored in this water-pressure effect from the ocean, which Weertman didn’t know would be important when he published his paper in 1974,” Tsai says.

Tsai thought about how this could be achieved and realized the answer might lie in another field in which he is actively involved: earthquake research. “In seismology, Coulomb friction is very important because earthquakes are thought to be the result of the edge of one tectonic plate sliding against the edge of another plate frictionally,” Tsai said. “This ice sheet research came about partly because I’m working on both glaciology and earthquakes.”

If the team’s Coulomb model is correct, it could have important implications for predictions of ice loss in Antarctica as a result of climate change. Indeed, for any given increase in temperature, the model predicts a bigger change in the rate of ice loss than previous models forecast. “We predict that the ice sheets are more sensitive to perturbations such as temperature,” Tsai says.

Hilmar Gudmundsson, a glaciologist with the British Antarctic Survey in Cambridge, UK, called the team’s results “highly significant.” “Their work gives further weight to the idea that a marine ice sheet, such as the West Antarctic Ice Sheet, is indeed, or at least has the potential to become, unstable,” says Gudmundsson, who was not involved in the study.

Glaciologist Richard Alley, of Pennsylvania State University, noted that historical studies have shown that ice sheets can remain stable for centuries or millennia and then switch to a different configuration suddenly.

“If another sudden switch happens in West Antarctica, sea level could rise a lot, so understanding what is going on at the grounding lines is essential,” says Alley, who also did not participate in the research.

“Tsai and coauthors have taken another important step in solving this difficult problem,” he says.

Along with Tsai and Thompson, Andrew Stewart, an assistant professor of atmospheric and oceanic sciences at UCLA, was also a coauthor on the paper, “Marine ice sheet profiles and stability under Coulomb basal conditions.” Funding support for the study was provided by Caltech’s President’s and Director’s Fund program and the Stanback Discovery Fund for Global Environmental Science.

Caltech News tagged with “GPS”

Orphan Elected Fellow of American Academy of Microbiology


News Writer: Lori Dajose

Victoria Orphan

Professor of Geobiology Victoria Orphan has been elected to the American Academy of Microbiology. Fellows are elected through a highly selective peer-review process to recognize scientific achievement and “original contributions that have advanced microbiology.”

“It’s a great honor to receive this award, and there’s also a nostalgic component,” Orphan says. “The first microbiology conference I attended was the American Academy of Microbiology meeting in New Orleans 20 years ago. This year, the location has cycled back to New Orleans, and that’s where I’ll be receiving this award. It has been a great journey.”

For the past 20 years, Orphan has studied anaerobic marine microorganisms that live within the seafloor and breathe methane. Through their unusual metabolism, these organisms restrict the amount of methane that seeps into the ocean and atmosphere. Methane is a much stronger greenhouse gas than carbon dioxide, so understanding how it cycles through the oceans and atmosphere is an important component of modeling Earth’s climate.

Recently, Orphan and her team discovered evidence that these microbes inhabit not only sediments on the ocean floor but also huge calcium carbonate mounds that can rise hundreds of feet above the seafloor. The mounds represent a previously unrecognized biological sink for methane that could be preventing large amounts of the potent greenhouse gas from reaching the atmosphere.

Orphan is one of 79 microbiologists elected as fellows to the academy in 2015. She joins current fellows Jared Leadbetter, professor of environmental microbiology, and Dianne Newman, professor of biology and geobiology and an investigator at the Howard Hughes Medical Institute.


Kanamori Receives Sacred Treasure from Japanese Government


News Writer: Lori Dajose

Hiroo Kanamori, Caltech’s John E. and Hazel S. Smits Professor of Geophysics, Emeritus, has been awarded the Order of the Sacred Treasure Gold and Silver Star by the government of Japan for his “contribution to education and research.”

The Order of the Sacred Treasure was introduced by the Japanese government in 1888 to recognize outstanding achievements in myriad areas, including research, education, business, and health care. After receiving his undergraduate degree and PhD from the University of Tokyo, Kanamori spent 10 years as a researcher and professor at the university’s Geophysics Department and Earthquake Research Institute. He returned as a visiting lecturer this year.

Kanamori moved to Caltech in 1972, where he studied the mechanisms of the world’s great earthquakes and, in 1977, developed a way of quantifying an earthquake in terms of the amount of energy it releases. His current research is on the physics of earthquakes, and he is also working on new detection methods for early warning systems.

Kanamori was nominated by the Japanese Ministry of Education, Culture, Sports, Science and Technology. He received the award from Japan’s Prime Minister, Shinzo Abe, in a conferment ceremony in Tokyo on November 5, 2014. The Emperor of Japan, Tsugunomiya Akihito, was also present.

“I enjoyed seeing the emperor,” says Kanamori. “He is a good scientist himself.”

Kanamori has also been named the 2014 recipient of the William Bowie Medal, the highest award given by the American Geophysical Union (AGU), which he received at the 47th annual meeting of the AGU on December 17, 2014, in San Francisco.

Caltech News tagged with “earthquakes”

Remembering Don L. Anderson


News Writer: Katie Neith

Professor of Geophysics and Director of the Caltech Seismological Laboratory Don Anderson studies Earth's deep interior (1988 photo).
Credit: Caltech Archives

Don L. Anderson, the Eleanor and John R. McMillan Professor of Geophysics, Emeritus, passed away on December 2, 2014. He was 81 years old.

Anderson’s work helped advance our understanding of the composition, structure, and dynamics of the earth and of earth-like planets. He was a pioneer in the use of seismic anisotropy—variations in the velocities of seismic waves as they move at different angles through materials—to study the earth’s interior. This allowed him and others to learn more about the boundaries of the planet’s mantle.

“Caltech has lost a towering figure in geophysics with the passing of Don Anderson,” says Michael Gurnis, the John E. and Hazel S. Smits Professor of Geophysics. “Don left an indelible mark not just on the Seismological Laboratory but on the field of global seismology and whole-earth geophysics. He was an unusual scientist who often advocated unpopular theories and concepts. Perhaps more than anyone else, DLA—as he was fondly known in the Seismo Lab—knew how uncertain many of our observations and theories were, especially those of the earth’s deep interior. By advocating the unpopular, Don challenged our ideas and forced us to make the observations needed to resolve our understanding of earthquakes and the deep earth.”

In 1981, Anderson developed, with Adam Dziewonski of Harvard University, the Preliminary Reference Earth Model (PREM), a one-dimensional model representing the average properties of the earth, including seismic velocities, attenuation, and density, as a function of planetary radius. PREM continues to be the most widely used standard model of the earth. Anderson, a former president (1988–1990) of the American Geophysical Union, is the author of the book, Theory of the Earth, a 1989 reference on the origin, composition, and evolution of Earth’s interior. In 2007, Anderson published New Theory of the Earth, a completely updated version.

Born in Frederick, Maryland, on March 5, 1933, the son of a schoolteacher and an electrician, Anderson received his BS in geology and geophysics from Rensselaer Polytechnic Institute in 1955. He worked for Chevron Oil Company from 1955 to 1956, the Air Force Cambridge Research Center from 1956 to 1958, and the Arctic Institute of North America from 1958 to 1960.

His service with the Air Force took him to Greenland, where his job was to determine how thick the ice had to be to support aircraft that were in trouble. “The Air Force wanted their pilots to land disabled planes on the sea ice, but the conventional wisdom at the time was that they would break through the ice and the crew would freeze to death,” Anderson recalled in a 2001 oral history. Anderson and his colleagues found that, in fact, aircraft can land very easily on ice that is not very thick: “Even if the ice won’t support the plane while it’s sitting there, it will allow a plane to taxi long enough for the pilots to get out and then the plane can sink through the ice, or the wheels can poke through the ice. Our job was to study ice strength, and whether you could determine how strong it was before you landed so you would know where to land.” The project continued after Anderson entered graduate school at Caltech, where he earned a master’s degree in geophysics in 1959 and a doctorate in geophysics in 1962 under the supervision of Frank Press.

Upon his graduation from Caltech, Anderson was hired as a research fellow; he became an assistant professor in 1963, an associate professor in 1964, and a professor in 1968. From 1967 to 1989, Anderson was director of Caltech’s Seismological Laboratory.

“Those who were fortunate enough to be at the Seismo Lab with Don since the 1960s have greatly benefited from the interaction with him, and his influence will have long-lasting effects on our work for years to come,” says Hiroo Kanamori, the John E. and Hazel S. Smits Professor of Geophysics, Emeritus. “As many of the Seismo Lab alumni would testify, we all benefited tremendously from the Seismo Lab coffee break discussions where Don was always at the center. Occasionally, he forcefully presented his idea, but more often he was a good listener too. Then, we later received notes and reprints on the subjects discussed, and if we were really interested in the subject, we would pursue it in depth and eventually write an interesting paper. Many of my papers grew out of the coffee break discussions.”

“Don was an inspiring geoscientist who motivated his students and many younger colleagues to think deeply, broadly, and creatively about the Earth and other planets,” says Thomas Jordan (PhD ’72), a former student of Anderson’s who is now the William M. Keck Foundation Chair in Geological Sciences and professor of Earth sciences at the University of Southern California.

Provost Edward Stolper, the William E. Leonhard Professor of Geology and the Carl and Shirley Larson Provostial Chair, agreed, saying “Don had a significant impact on my career—both as a supportive, probing, and intellectually challenging colleague, and as a friend.

“As a graduate student at Harvard, I heard Don lecture about the possibility of there being a CAI-like zone around the earth’s core and about the composition of the moon. CAIs are calcium-aluminum-rich inclusions in chondritic meteorites and represent the very earliest solid materials formed in the solar system. I was energized by what Don had said and knew at that point I wanted to be at Caltech,” says Stolper.

Anderson was the Eleanor and John R. McMillan Professor from 1989 until his retirement in 2002.

“Don had a tremendous influence on the development of geophysics and global seismology in the United States,” says Gurnis, the current director of Caltech’s Seismological Laboratory. “One of DLA’s unwavering passions since the 1960s was to map the earth’s deep interior associated with surface processes. He was instrumental in founding the NSF-funded IRIS—Incorporated Research Institutions for Seismology—and the development of what became known as the GSN—the Global Seismic Network—in the 1980s. Through these major U.S. programs, we were able to map out the nature of the forces associated with plate tectonics and volcanism.”

Anderson continued to work and publish until his death. His most recent work on volcanism was showcased at the fall meeting of the American Geophysical Union in San Francisco in December 2014.

A fellow of the American Academy of Arts and Sciences and a member of the National Academy of Sciences and the American Philosophical Society, Anderson was also the recipient of the Emil Wiechert Medal of the German Geophysical Society, the Arthur L. Day Medal of the Geological Society of America, the Gold Medal of the Royal Astronomical Society, the William Bowie Medal of the American Geophysical Union, and the Crafoord Prize of the Royal Swedish Academy of Sciences.

In 1998, Anderson was awarded the National Medal of Science and was cited for his “immeasurable influence on the advancement of earth sciences over the past three decades nationally and internationally.”

Anderson is survived by his wife, Nancy; daughter, Lynn Rodriguez; son, Lee Anderson; and four granddaughters.


Remembering Fredric Raichlen


1932 – 2014
News Writer: Douglas Smith

Fred Raichlen

Fred Raichlen, professor of civil and mechanical engineering at Caltech, in an undated photo.
Credit: Caltech Archives

Fredric (“Fred”) Raichlen, professor emeritus of civil and mechanical engineering in Caltech’s Division of Engineering and Applied Science, passed away on December 13, 2014. He was 82 years old. Raichlen was an expert in coastal engineering whose pioneering studies of tsunami mechanics have led to standards for designing tsunami-resistant structures that have saved lives around the world.

Ordinary waves are wind-driven and propagate at and just below the ocean’s surface. A tsunami, however, is driven by a displacement in the earth’s crust, such as an underwater earthquake or a volcanic eruption. The entire depth of the water column is set in motion from seafloor to surface. In the open ocean, the waves are hardly noticeable—the peaks are a few feet high at most, and the interval between successive waves can be several hours. But as the tsunami approaches land, the transition to shallow water concentrates the wave’s energy. This rising wall of water, focused by local topography, can flood many miles inland.
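The shoaling process described above follows from two standard shallow-water results, sketched here as a hedged illustration (not a model from Raichlen's work): a tsunami travels at roughly sqrt(g × depth), and by Green's law its height grows roughly as depth^(-1/4) as the water shallows. The depths and the offshore wave height below are illustrative numbers, not measurements.

```python
# Hedged sketch of tsunami shoaling under shallow-water theory.
import math

G = 9.81  # gravitational acceleration, m/s^2

def wave_speed(depth_m):
    """Shallow-water wave speed, sqrt(g * h), in m/s."""
    return math.sqrt(G * depth_m)

def shoaled_height(height0_m, depth0_m, depth1_m):
    """Green's law: wave height scales as depth^(-1/4) between two depths."""
    return height0_m * (depth0_m / depth1_m) ** 0.25

# Illustrative: a barely noticeable half-meter wave over a 4000 m deep ocean
# slows and steepens dramatically as it reaches 10 m deep coastal water.
print(round(wave_speed(4000), 1), "m/s in the open ocean")
print(round(wave_speed(10), 1), "m/s nearshore")
print(round(shoaled_height(0.5, 4000, 10), 2), "m wave height nearshore")
```

The slowdown is what concentrates the wave's energy: the front of the wave decelerates while the back is still moving fast, piling the water up.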

That much was known when Raichlen entered the field, says his graduate student Costas Synolakis (BS ’78, MS ’79, PhD ’86), now a professor of civil and environmental engineering at USC and the director of USC’s Tsunami Research Center. But, as Synolakis says, “There were several theories and hypotheses, but there was no laboratory validation of any of them. Further, there were very few field observations. Scientists did not even know what a tsunami looked like.”

This was at least partly because funding for tsunami research was hard to come by. Tsunamis were seen as a threat to other shorelines, not American ones. “Tsunamis were not trendy,” Synolakis says, “and their study was considered humdrum. For almost a decade, Fred was the only professor in the U.S. working on tsunami hydrodynamics. But the students he trained, trained others. And by the time it was realized how important tsunamis are, there were knowledgeable scientists who could rise to the challenge.”

Upon arriving at Caltech in 1962, Raichlen built a set of wave tanks to analyze how tsunamis originate, how they propagate through the open ocean, and what happens when they run up on shore. The data from these experiments enabled him to develop a comprehensive, three-dimensional computer model of tsunami behavior. The first part of the model described the waves’ motions through the deep sea, while the second part of the model described the waves’ behavior within the harbor. The two models were fused at the harbor’s entrance, with the connecting region modifying the incoming tsunami’s waves as they entered the harbor.

“The work he supervised remains the world standard,” Synolakis says. “Nobody else before or since has done laboratory experiments of such precision and quality. Fred believed that answers could only be mined in the laboratory and that the only numerical models that could be trusted were the ones that had been benchmarked with laboratory experiments.”

Previous models had represented harbors as simple geometric shapes. This model, however, re-created the harbor’s interior in great detail, rendering its basins, jetties, islands, and channels as collections of line segments. The waves’ reflections off of each line segment were easy to calculate when each segment was handled individually, and the tsunami’s actual behavior was derived by superimposing all the reflected waves on the incoming ones to map out where they would reinforce one another and where they would damp each other out. This approach reduced the computation to a straightforward exercise in matrix algebra that could be solved on Caltech’s IBM 360/75 mainframe computer—the fastest, most sophisticated machine of its day.
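The superposition idea at the heart of that approach can be shown with a deliberately tiny toy example, far simpler than the harbor model itself: one incoming sinusoidal wave plus its reflection off a single wall, summed to show where the two reinforce each other and where they cancel. The wavelength, period, and wall geometry below are arbitrary illustrative choices.

```python
# Toy illustration of wave superposition: an incoming wave plus a single
# reflected wave (wall at x = 0) sum to a standing-wave pattern with
# cancellation at nodes and doubled height at antinodes.
import math

def total_height(x, t, wavelength=100.0, period=10.0, amplitude=1.0):
    """Sum of an incoming wave and its reflection traveling back from x = 0."""
    k = 2 * math.pi / wavelength   # wavenumber
    w = 2 * math.pi / period       # angular frequency
    incoming = amplitude * math.sin(k * x - w * t)
    reflected = amplitude * math.sin(-k * x - w * t)  # reversed direction
    return incoming + reflected

# At a quarter wavelength from the wall the waves cancel (a node);
# at a half wavelength they reinforce to twice the single-wave amplitude.
print(round(total_height(25.0, 2.5), 6))  # near-cancellation
print(round(total_height(50.0, 2.5), 6))  # reinforcement
```

The harbor model did this same bookkeeping for reflections off hundreds of line segments at once, which is what reduced the problem to matrix algebra.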

In 1965, Raichlen built a 31-by-15-foot wave tank instrumented to measure wave heights and water velocities anywhere within its walls. Graduate student Jiin-Jen Lee (PhD ’70), also now a professor of civil and environmental engineering at the University of Southern California and the director of USC’s Foundation for Cross-Connection Control and Hydraulic Research, used the tank to verify the model’s predictions of wave behavior in idealized circular and rectangular harbors. He then built a scale model of the east and west basins of the port of Long Beach, California, out of 15 sheets of quarter-inch-thick Lucite. The waves created by Lee’s physical model in the wave tank were well described by the mathematical model in the computer. Says Lee, “Fred wanted a theory and the numerical analysis to go with it, but he also wanted them verified against a physical model. A lot of people would just say, ‘OK, I did this, and now I’ll move on.’ Fred was very careful to make sure that the theory could actually be checked out.”

Raichlen continued to refine and expand the model. A third section was added to reproduce the different types of seabed motions that could give a wave its initial impetus. Other experiments considered a tsunami’s interactions with objects floating in the harbor, such as ships and mooring platforms, or measured how fast different regions within a wave moved as the wave broke, which allowed the force of the wave’s impact to be calculated.

Raichlen’s model also provided the first mathematically sound explanation of how seiches, also known as “harbor waves,” are created. Seiches can persist for days and are extremely damaging due to their height. They occur because every harbor has a set of resonant frequencies. Any waves of those frequencies will reverberate, amplifying themselves. Typical tsunamis have a frequency of one wave every several hours. Raichlen’s model showed that many harbors also have a fundamental resonant frequency of one wave every several hours—an unfortunate frequency match that enables such a harbor to amplify a tsunami into a seiche. The model also resolved a long-standing paradox: Harbors with narrow mouths usually offer the best shelter, but those same harbors suffer the worst seiches. The model showed that as the harbor’s mouth got narrower, the wave energy trapped within the harbor had less and less chance of escaping. The only way to dissipate the energy was by friction as the water sloshed back and forth.
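The resonance and damping argument above can be caricatured with a standard damped-oscillator response curve. This is a hedged analogy, not Raichlen's harbor model: a harbor basin driven near its resonant frequency amplifies the incoming wave, and a narrow mouth (little energy escaping) corresponds to weak damping and therefore a much larger amplification. All numbers are hypothetical.

```python
# Hedged analogy: steady-state gain of a damped harmonic oscillator,
# standing in for a harbor basin driven by incoming tsunami waves.
import math

def amplification(drive_freq, resonant_freq, damping_ratio):
    """Amplitude gain of a damped oscillator vs. its driving frequency."""
    r = drive_freq / resonant_freq
    return 1.0 / math.sqrt((1 - r ** 2) ** 2 + (2 * damping_ratio * r) ** 2)

# Hypothetical harbor fundamental: one wave every 3 hours, matching the tsunami.
resonant = 1 / (3 * 3600)  # Hz
tsunami = 1 / (3 * 3600)   # Hz

print(round(amplification(tsunami, resonant, 0.05), 2))  # narrow mouth: weak damping
print(round(amplification(tsunami, resonant, 0.5), 2))   # wide mouth: strong damping
```

At exact resonance the gain is 1/(2 × damping ratio), which is the paradox in miniature: the better the harbor traps energy, the worse its seiche.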

Raichlen’s commitment to his work was matched by his commitment to his students. Lee’s thesis was published in the Journal of Fluid Mechanics, an unusual periodical for a civil engineer and one read by a much wider community. Says Lee, “Normally the professor and the student are coauthors, but Fred took his name out. He said, ‘This is very important for your career. You should publish it as the sole author.’ At first I thought that meant maybe the paper was not so good, and he didn’t want his name on it. But he wanted that study to be identified with me, so he gave me all the credit. I was really moved, because it was a pretty important study. We could have published a hundred papers, each with a different-shaped harbor.”

Raichlen was a hands-on adviser, spending time with each of his students every day, says Synolakis. “I will forever treasure how he trained me in the laboratory. I was a complete novice, and for several months, he stayed with me, making sure that I didn’t run into trouble. He was always eager to explain what we were seeing. His attention to detail was legendary, and he could see things that nobody else could or can.” 

Raichlen earned his bachelor’s degree in engineering from the Johns Hopkins University in 1953 and his master’s and doctoral degrees at MIT in 1955 and 1962.  He also served in the Air Force as an environmental health officer from 1956 to 1959. He came to Caltech as an assistant professor of civil engineering in 1962; he was promoted to associate professor in 1967 and to professor in 1972. In 1969, he became one of the founding faculty members of Caltech’s doctoral program in environmental engineering science. He was appointed professor of civil and mechanical engineering in 1997 and professor emeritus in 2001.

Raichlen was inducted into the National Academy of Engineering in 1993, and in 1994 he received the John G. Moffatt–Frank E. Nichol Harbor and Coastal Engineering Award from the American Society of Civil Engineers (ASCE). In 2003, he was given the ASCE’s International Coastal Engineering Award, the most prestigious honor in the international coastal engineering community.

In his retirement, Raichlen devoted his time to writing a book, Waves (MIT Press Essential Knowledge series, 2012). He also became an avid and prolific watercolor painter.

Raichlen is survived by his wife, Judy; his sons, Robert and David; their wives, Amy and Sarah (respectively); his sister, Linda Millison; his brother, Sonny; and two grandchildren. 


New Research Suggests Solar System May Have Once Harbored Super-Earths


Caltech and UC Santa Cruz Researchers Say Earth Belongs to a Second Generation of Planets
News Writer: Kimm Fesenmaier

This snapshot from a new simulation depicts a time early in the solar system's history when Jupiter likely made a grand inward migration (here, Jupiter's orbit is the thick white circle). As it moved inward, Jupiter picked up primitive planetary building blocks, or planetesimals, and drove them into eccentric orbits (turquoise) that overlapped the unperturbed part of the planetary disk (yellow), setting off a cascade of collisions that would have ushered any interior planets into the sun.
Credit: K.Batygin/Caltech

Long before Mercury, Venus, Earth, and Mars formed, it seems that the inner solar system may have harbored a number of super-Earths—planets larger than Earth but smaller than Neptune. If so, those planets are long gone—broken up and fallen into the sun billions of years ago largely due to a great inward-and-then-outward journey that Jupiter made early in the solar system’s history.

This possible scenario has been suggested by Konstantin Batygin, a Caltech planetary scientist, and Gregory Laughlin of UC Santa Cruz in a paper that appears the week of March 23 in the online edition of the Proceedings of the National Academy of Sciences (PNAS). The results of their calculations and simulations suggest the possibility of a new picture of the early solar system that would help to answer a number of outstanding questions about the current makeup of the solar system and of Earth itself. For example, the new work addresses why the terrestrial planets in our solar system have such relatively low masses compared to the planets orbiting other sun-like stars.

“Our work suggests that Jupiter’s inward-outward migration could have destroyed a first generation of planets and set the stage for the formation of the mass-depleted terrestrial planets that our solar system has today,” says Batygin, an assistant professor of planetary science. “All of this fits beautifully with other recent developments in understanding how the solar system evolved, while filling in some gaps.”

Thanks to recent surveys of exoplanets—planets in solar systems other than our own—we know that about half of sun-like stars in our galactic neighborhood have orbiting planets. Yet those systems look nothing like our own. In our solar system, very little lies within Mercury’s orbit; there is only a little debris—probably near-Earth asteroids that moved further inward—but certainly no planets. That is in sharp contrast with what astronomers see in most planetary systems. These systems typically have one or more planets that are substantially more massive than Earth orbiting closer to their suns than Mercury does, but very few objects at distances beyond.

“Indeed, it appears that the solar system today is not the common representative of the galactic planetary census. Instead we are something of an outlier,” says Batygin. “But there is no reason to think that the dominant mode of planet formation throughout the galaxy should not have occurred here. It is more likely that subsequent changes have altered its original makeup.”

According to Batygin and Laughlin, Jupiter is critical to understanding how the solar system came to be the way it is today. Their model incorporates something known as the Grand Tack scenario, which was first posed in 2001 by a group at Queen Mary University of London and subsequently revisited in 2011 by a team at the Nice Observatory. That scenario says that during the first few million years of the solar system’s lifetime, when planetary bodies were still embedded in a disk of gas and dust around a relatively young sun, Jupiter became so massive and gravitationally influential that it was able to clear a gap in the disk. And as the sun pulled the disk’s gas in toward itself, Jupiter also began drifting inward, as though carried on a giant conveyor belt.

“Jupiter would have continued on that belt, eventually being dumped onto the sun if not for Saturn,” explains Batygin. Saturn formed after Jupiter but got pulled toward the sun at a faster rate, allowing it to catch up. Once the two massive planets got close enough, they locked into a special kind of relationship called an orbital resonance, in which their orbital periods were commensurable—that is, related by a ratio of small whole numbers. In a 2:1 orbital resonance, for example, Saturn would complete two orbits around the sun in the same amount of time that it took Jupiter to make a single orbit. In such a relationship, the two bodies would begin to exert a gravitational influence on one another.
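The 2:1 relationship follows directly from Kepler's third law, which relates orbital period to orbital distance (P² ∝ a³). The short sketch below, with hypothetical semi-major axes chosen purely for illustration, shows that a body orbiting about 1.59 times farther out than another takes exactly twice as long to go around.

```python
# Hedged sketch of a 2:1 mean-motion resonance via Kepler's third law.
def period_ratio(a_outer, a_inner):
    """Ratio of orbital periods from semi-major axes: (a_outer/a_inner)^(3/2)."""
    return (a_outer / a_inner) ** 1.5

# Hypothetical distances (in arbitrary units), not the planets' actual orbits:
# placing the outer body at 2^(2/3) ~ 1.587x the inner distance yields
# exactly two inner orbits per outer orbit.
print(round(period_ratio(2 ** (2 / 3), 1.0), 6))
```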

“That resonance allowed the two planets to open up a mutual gap in the disk, and they started playing this game where they traded angular momentum and energy with one another, almost to a beat,” says Batygin. Eventually, that back and forth would have caused all of the gas between the two worlds to be pushed out, a situation that would have reversed the planets’ migration direction and sent them back outward in the solar system. (Hence, the “tack” part of the Grand Tack scenario: the planets migrate inward and then change course dramatically, something like a boat tacking around a buoy.)

In an earlier model developed by Bradley Hansen at UCLA, the terrestrial planets conveniently end up in their current orbits with their current masses under a particular set of circumstances—one in which all of the inner solar system’s planetary building blocks, or planetesimals, happen to populate a narrow ring stretching from 0.7 to 1 astronomical unit (1 astronomical unit is the average distance from the sun to Earth), 10 million years after the sun’s formation. According to the Grand Tack scenario, the outer edge of that ring would have been delineated by Jupiter as it moved toward the sun on its conveyor belt and cleared a gap in the disk all the way to Earth’s current orbit.

But what about the inner edge? Why should the planetesimals be limited to the ring on the inside? “That point had not been addressed,” says Batygin.

He says the answer could lie in primordial super-Earths. The empty region of the inner solar system corresponds almost exactly to the orbital neighborhood where super-Earths are typically found around other stars. It is therefore reasonable to speculate that this region was cleared out in the primordial solar system by a group of first-generation planets that did not survive.

Batygin and Laughlin’s calculations and simulations show that as Jupiter moved inward, it pulled all the planetesimals it encountered along the way into orbital resonances and carried them toward the sun. But as those planetesimals got closer to the sun, their orbits also became elliptical. “You cannot reduce the size of your orbit without paying a price, and that turns out to be increased ellipticity,” explains Batygin. Those new, more elongated orbits caused the planetesimals, mostly on the order of 100 kilometers in radius, to sweep through previously unpenetrated regions of the disk, setting off a cascade of collisions among the debris. In fact, Batygin’s calculations show that during this period, every planetesimal would have collided with another object at least once every 200 years, violently breaking them apart and sending the resulting debris spiraling into the sun at an increased rate.

The researchers did one final simulation to see what would happen to a population of super-Earths in the inner solar system if they were around when this cascade of collisions started. They ran the simulation on a well-known extrasolar system known as Kepler-11, which features six super-Earths with a combined mass 40 times that of Earth, orbiting a sun-like star. The result? The model predicts that the super-Earths would be shepherded into the sun by a decaying avalanche of planetesimals over a period of 20,000 years.

“It’s a very effective physical process,” says Batygin. “You only need a few Earth masses worth of material to drive tens of Earth masses worth of planets into the sun.”

Batygin notes that when Jupiter tacked around, some fraction of the planetesimals it was carrying with it would have calmed back down into circular orbits. Only about 10 percent of the material Jupiter swept up would need to be left behind to account for the mass that now makes up Mercury, Venus, Earth, and Mars.

From that point, it would take millions of years for those planetesimals to clump together and eventually form the terrestrial planets—a scenario that fits nicely with measurements that suggest that Earth formed 100–200 million years after the birth of the sun. Since the primordial disk of hydrogen and helium gas would have been long gone by that time, this could also explain why Earth lacks a hydrogen atmosphere. “We formed from this volatile-depleted debris,” says Batygin.

And that sets us apart in another way from the majority of exoplanets. Batygin expects that most exoplanets—which are mostly super-Earths—have substantial hydrogen atmospheres, because they formed at a point in the evolution of their planetary disk when the gas would have still been abundant. “Ultimately, what this means is that planets truly like Earth are intrinsically not very common,” he says.

The paper also suggests that the formation of gas giant planets such as Jupiter and Saturn—a process that planetary scientists believe is relatively rare—plays a major role in determining whether a planetary system winds up looking something like our own or like the more typical systems with close-in super-Earths. As planet hunters identify additional systems that harbor gas giants, Batygin and Laughlin will have more data against which they can check their hypothesis—to see just how often other migrating giant planets set off collisional cascades in their planetary systems, sending primordial super-Earths into their host stars.

The researchers describe their work in a paper titled “Jupiter’s Decisive Role in the Inner Solar System’s Early Evolution.”

Caltech News tagged with “GPS”

An Earthquake Warning System in Our Pockets?

By photosearth / May 25, 2015

Researchers Test Smartphones for Advance-Notice System
News Writer: 
Kimm Fesenmaier

Credit: iStock

While you are checking your email, scrolling through social-media feeds, or just going about your daily life with your trusty smartphone in your pocket, the sensors in that little computer could also be contributing to an earthquake early warning system. So says a new study led by researchers at Caltech and the United States Geological Survey (USGS). The study suggests that all of our phones and other personal electronic devices could function as a distributed network, detecting any ground movements caused by a large earthquake, and, ultimately, giving people crucial seconds to prepare for a temblor.

“Crowd-sourced alerting means that the community will benefit by data generated by the community,” said Sarah Minson (PhD ’10), a USGS geophysicist and lead author of the study, which appears in the April 10 issue of the new journal Science Advances. Minson completed the work while a postdoctoral scholar at Caltech in the laboratory of Thomas Heaton, professor of engineering seismology.

Earthquake early warning (EEW) systems detect the start of an earthquake and rapidly transmit warnings to people and automated systems before they experience shaking at their location. While much of the world’s population is susceptible to damaging earthquakes, EEW systems are currently operating in only a few regions around the globe, including Japan and Mexico. “Most of the world does not receive earthquake warnings mainly due to the cost of building the necessary scientific monitoring networks,” says USGS geophysicist and project lead Benjamin Brooks.

Despite being less accurate than scientific-grade equipment, the GPS receivers in smartphones are sufficient to detect the permanent ground movement, or displacement, caused by fault motion in earthquakes that are approximately magnitude 7 and larger. And, of course, they are already widely distributed. Once displacements are detected by participating users’ phones, the collected information could be analyzed quickly in order to produce customized earthquake alerts that would then be transmitted back to users.
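As a rough illustration of the idea, and not the detection algorithm used in the study, a phone could flag a permanent offset by comparing averaged positions before and after a candidate event against an assumed GPS noise level (the one-meter noise floor below is a placeholder):

```python
from statistics import mean

def detect_static_offset(before_m, after_m, noise_floor_m=1.0, k=3.0):
    """Flag a permanent ground displacement by comparing mean positions
    (in meters) before and after a candidate event. noise_floor_m is an
    assumed per-sample phone-GPS noise level; this threshold test is
    illustrative, not the method from the study."""
    offset = mean(after_m) - mean(before_m)
    # Averaging n samples shrinks the noise on the mean by sqrt(n).
    threshold = k * noise_floor_m / len(before_m) ** 0.5
    return abs(offset) > threshold, offset
```

With 100 samples and meter-level phone-GPS noise, a 2-meter static offset (plausible near the fault in a magnitude 7+ event) clears the threshold easily, while centimeter-level wander does not.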

“Thirty years ago it took months to assemble a crude picture of the deformations from an earthquake. This new technology promises to provide a near-instantaneous picture with much greater resolution,” says Heaton, a coauthor of the new study.

In the study, the researchers tested the feasibility of crowd-sourced EEW with a simulation of a hypothetical magnitude 7 earthquake, and with real data from the 2011 magnitude 9 Tohoku-oki, Japan earthquake. The results show that crowd-sourced EEW could be achieved with only a tiny percentage of people in a given area contributing information from their smartphones. For example, if phones from fewer than 5,000 people in a large metropolitan area responded, the earthquake could be detected and analyzed fast enough to issue a warning to areas farther away before the onset of strong shaking.
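The warning-time arithmetic can be sketched with assumed numbers (roughly 3.5 km/s for the damaging S waves, plus a few seconds of detection-and-broadcast latency; both values are illustrative, not from the study):

```python
def warning_time_s(distance_km, s_wave_kms=3.5, latency_s=5.0):
    """Seconds of warning a site distance_km from the epicenter would
    receive, assuming strong shaking travels at the S-wave speed and the
    crowd-sourced system needs latency_s to detect and alert (both
    assumed values)."""
    return distance_km / s_wave_kms - latency_s

# A site 100 km away would get roughly 24 s of warning under these
# assumptions; sites very close to the epicenter get none.
print(f"{warning_time_s(100):.1f} s")
```

The negative result for small distances reflects the "blind zone" near the epicenter, where shaking arrives before any alert can.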

The researchers note that the GPS receivers in smartphones and similar devices would not be sufficient to detect earthquakes smaller than magnitude 7, which could still be potentially damaging. However, smartphones also have microelectromechanical systems (MEMS) accelerometers that are capable of recording any earthquake motions large enough to be felt; this means that smartphones may be useful in earthquakes as small as magnitude 5. In a separate project, Caltech’s Community Seismic Network Project has been developing the framework to record and utilize data from an inexpensive array of such MEMS accelerometers.

Comprehensive EEW requires a dense network of scientific instruments. Scientific-grade EEW, such as the USGS’s ShakeAlert system that is currently being implemented on the west coast of the United States, will be able to help minimize the impact of earthquakes over a wide range of magnitudes. However, in many parts of the world where there are insufficient resources to build and maintain scientific networks but consumer electronics are increasingly common, crowd-sourced EEW has significant potential.

“The U.S. earthquake early warning system is being built on our high-quality scientific earthquake networks, but crowd-sourced approaches can augment our system and have real potential to make warnings possible in places that don’t have high-quality networks,” says Douglas Given, USGS coordinator of the ShakeAlert Earthquake Early Warning System. The U.S. Agency for International Development has already agreed to fund a pilot project, in collaboration with the Chilean Centro Sismológico Nacional, to test a hybrid earthquake warning system comprising stand-alone smartphone sensors and scientific-grade sensors along the Chilean coast.

“Crowd-sourced data are less precise, but for larger earthquakes that cause large shifts in the ground surface, they contain enough information to detect that an earthquake has occurred, information necessary for early warning,” says study coauthor Susan Owen of JPL.

Additional coauthors on the paper, “Crowdsourced earthquake early warning,” are from the USGS, Carnegie Mellon University–Silicon Valley, and the University of Houston. The work was supported in part by the Gordon and Betty Moore Foundation, the USGS Innovation Center for Earth Sciences, and the U.S. Department of Transportation Office of the Assistant Secretary for Research and Technology.


Chemists Create “Comb” that Detects Terahertz Waves with Extreme Precision

By photosearth / May 25, 2015

News Writer: 
Kimm Fesenmaier

Caltech chemists have developed a precise ruler of terahertz light that will aid in the study of organic molecules in space and of the soft interactions between molecules in water. Due to its resemblance to a hair comb, the ruler is called a terahertz frequency comb.
Credit: Lance Hayashida/Caltech and NASA/ESA and the Hubble Heritage Team (STScI/AURA)–ESA/Hubble Collaboration

Light can come in many frequencies, only a small fraction of which can be seen by humans. Between the invisible low-frequency radio waves used by cell phones and the high frequencies associated with infrared light lies a fairly wide swath of the electromagnetic spectrum occupied by what are called terahertz, or sometimes submillimeter, waves. Exploitation of these waves could lead to many new applications in fields ranging from medical imaging to astronomy, but terahertz waves have proven tricky to produce and study in the laboratory. Now, Caltech chemists have created a device that generates and detects terahertz waves over a wide spectral range with extreme precision, allowing it to be used as an unparalleled tool for measuring terahertz waves.

The new device is an example of what is known as a frequency comb, which uses ultrafast pulsed lasers, or oscillators, to produce thousands of unique frequencies of radiation distributed evenly across a spectrum like the teeth of a comb. Scientists can then use them like rulers, lining up the teeth like tick marks to very precisely measure light frequencies. The first frequency combs, developed in the 1990s, earned their creators (John Hall of JILA and Theodor Hänsch of the Max Planck Institute of Quantum Optics and Ludwig Maximilian University of Munich) the 2005 Nobel Prize in Physics. These combs, which originated in the visible part of the spectrum, have revolutionized how scientists measure light, leading, for example, to the development of today’s most accurate timekeepers, known as optical atomic clocks.

The team at Caltech combined commercially available lasers and optics with custom-built electronics to extend this technology to the terahertz, creating a terahertz frequency comb with an unprecedented combination of spectral coverage and precision. Its thousands of “teeth” are evenly spaced across the majority of the terahertz region of the spectrum (0.15-2.4 THz), giving scientists a way to simultaneously measure absorption in a sample at all of those frequencies.

The work is described in a paper that appears in the online version of the journal Physical Review Letters and will be published in the April 24 issue. The lead author is graduate student and National Science Foundation fellow Ian Finneran, who works in the lab of Geoffrey A. Blake, professor of cosmochemistry and planetary sciences and professor of chemistry at Caltech.

Blake explains the utility of the new device, contrasting it with a common radio tuner. “With radio waves, most tuners let you zero in on and listen to just one station, or frequency, at a time,” he says. “Here, in our terahertz approach, we can separate and process more than 10,000 frequencies all at once. In the near future, we hope to bump that number up to more than 100,000.”

That is important because the terahertz region of the spectrum is chock-full of information. Everything in the universe that is warmer than about 10 kelvins (-263 degrees Celsius) gives off terahertz radiation. Even at these very low temperatures, molecules can rotate in space, yielding unique fingerprints in the terahertz. Astronomers using telescopes such as the Caltech Submillimeter Observatory, the Atacama Large Millimeter Array, and the Herschel Space Observatory are searching stellar nurseries and planet-forming disks at terahertz frequencies, looking for such chemical fingerprints to try to determine the kinds of molecules that are present and thus available to planetary systems. But in just a single chunk of the sky, it would not be unusual to find signatures of 25 or more different molecules.
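The 10-kelvin figure can be checked with the frequency form of Wien's displacement law, ν_peak ≈ 2.821 k_B T / h, which places the emission peak of such a cold blackbody squarely in the terahertz band:

```python
K_B = 1.380649e-23   # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s

def peak_frequency_thz(temperature_k: float) -> float:
    """Blackbody emission peak via the frequency form of Wien's
    displacement law, returned in THz."""
    return 2.8214 * K_B * temperature_k / H / 1e12

# A 10 K blackbody peaks near 0.59 THz, inside the comb's
# 0.15-2.4 THz coverage quoted in the text.
print(f"{peak_frequency_thz(10):.2f} THz")
```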

To be able to definitively identify specific molecules within such a tangle of terahertz signals, scientists first need to determine exact measurements of the chemical fingerprints associated with various molecules. This requires a precise source of terahertz waves, in addition to a sensitive detector, and the terahertz frequency comb is ideal for making such measurements in the lab.

“When we look up into space with terahertz light, we basically see this forest of lines related to the tumbling motions of various molecules,” says Finneran. “Unraveling and understanding these lines is difficult, as you must trek across that forest one point and one molecule at a time in the lab. It can take weeks, and you would have to use many different instruments. What we’ve developed, this terahertz comb, is a way to analyze the entire forest all at once.”

After the device generates its tens of thousands of evenly spaced frequencies, the waves travel through a sample—in the paper, the researchers provide the example of water vapor. The instrument then measures what light passes through the sample and what gets absorbed by molecules at each tooth along the comb. If a detected tooth gets shorter, the sample absorbed that particular terahertz wave; if it comes through at the baseline height, the sample did not absorb at that frequency.
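The measurement described above can be mimicked in a few lines: generate evenly spaced tooth frequencies across the comb's range, then flag any tooth whose detected height falls below the no-sample baseline. The 100 GHz spacing and the heights below are illustrative; the real comb's teeth are far finer.

```python
def comb_teeth_thz(start=0.15, stop=2.4, spacing_ghz=100.0):
    """Evenly spaced comb tooth frequencies in THz (spacing is an
    illustrative placeholder, much coarser than the actual comb)."""
    step = spacing_ghz / 1000.0
    teeth = []
    f = start
    while f <= stop + 1e-12:
        teeth.append(round(f, 6))
        f += step
    return teeth

def absorbed_teeth(teeth, heights, baseline=1.0, tol=0.05):
    """Return the tooth frequencies whose measured height dropped
    below the no-sample baseline, i.e., where the sample absorbed."""
    return [f for f, h in zip(teeth, heights) if h < baseline - tol]
```

A tooth returned at the baseline height means no absorption at that frequency; a shortened tooth marks an absorption line of the sample.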

“Since we know exactly where each of the tick marks on our ruler is to about nine digits, we can use this as a diagnostic tool to get these frequencies really, really precisely,” says Finneran. “When you look up in space, you want to make sure that you have such very exact measurements from the lab.”

In addition to the astrochemical application of identifying molecules in space, the terahertz comb will also be useful for studying fundamental interactions between molecules. “The terahertz is unique in that it is really the only direct way to look not only at vibrations within individual large molecules that are important to life, but also at vibrations between different molecules that govern the behavior of liquids such as water,” says Blake.

Additional coauthors on the paper, “Decade-Spanning High-Precision Terahertz Frequency Comb,” include current Caltech graduate students Jacob Good, P. Brandon Carroll, and Marco Allodi, as well as recent graduate Daniel Holland (PhD ’14). The work was supported by funding from the National Science Foundation.


Caltech’s Linde Center Helps Navigate the Southern Ocean

By photosearth / May 25, 2015

Credit: Hannah Joy-Warren, Stanford graduate student, taken during the Phantastic II cruise to the west Antarctic Peninsula (October/November 2014).

At Caltech’s Ronald and Maxine Linde Center for Global Environmental Science, researchers from diverse disciplines work together to investigate Earth’s climate and its atmosphere, oceans, and biosphere; their evolution; and how they may change in the future.

In early February, the center hosted a three-day workshop focused on the Southern Ocean around Antarctica. Scientists from around the world working at the intersection of fluid dynamics and biochemistry gathered to summarize our current knowledge of the physical, chemical, and biological processes that are critical to the Southern Ocean’s circulation and marine ecosystems. The researchers set out to identify areas where collaboration across disciplines is needed to push that understanding forward. Here are a few of the topics they covered.

The Use of Autonomous Underwater Vehicles for Observation 

Credit: Sunke Schmidtko

The Southern Ocean is one of the most inhospitable places on Earth. Despite the area’s importance to the global climate, measurements and data are hard to come by because it is difficult to deploy research vessels in the region, especially in winter. Little, if any, data have been collected in some areas, especially in the deep ocean and underneath ice shelves.

But many new tools now exist to improve data collection and measurement in these remote regions. Autonomous gliders (shown above) have gathered information on currents, water density, and temperature at many depths, helping researchers like workshop participants Nicole Couto (Rutgers University) and Mike Meredith (British Antarctic Survey), as well as Caltech’s Andrew Thompson, assistant professor of environmental science and engineering, understand how warm waters are causing ice sheets to melt. Meanwhile, an extensive system of autonomous floats monitors temperature, salinity, dissolved gases, and currents in Earth’s oceans; moored instruments track what is happening beneath ice shelves; and even Antarctic seals outfitted with sensors provide scientists access to, and information about, some of the ocean’s coldest and most inaccessible waters.

Iron Limitation on Phytoplankton Growth

Credit: NASA/Suomi NPP/Norman Kuring

Phytoplankton, microscopic algae that perform photosynthesis, form the base of the Southern Ocean food web. These organisms require both nutrients and sunlight to survive. Nutrients and sunlight (at least in summer) are plentiful in the Southern Ocean, yet many parts of it have extremely low phytoplankton concentrations. This is because not all nutrients are equally abundant. Take iron, for example. Although phytoplankton need iron only in small amounts, it is scarce throughout most of the Southern Ocean. Iron enters ocean waters by way of dust falling out of the atmosphere, from melting icebergs or glaciers, and from the ocean floor. Meeting participants Phil Boyd (University of Tasmania) and Nicolas Cassar (Duke University) are working to understand how these sources of iron will respond to changing atmospheric and oceanic conditions, as well as how Southern Ocean ecosystems will adapt.

Phytoplankton distributions are largely observed by measuring ocean color from space. This image shows data from NASA’s MODIS (MODerate resolution Imaging Spectroradiometer) satellite, which measures light coming off the ocean. NASA scientists use this information to determine the concentration of phytoplankton in the water. Here, yellow and orange colors indicate the presence of more phytoplankton.

The Importance of High Spatial Resolution in Ocean Models

Credit: Jeff Schmalz/NASA

The ocean is similar to the atmosphere in that much of the variability is contained in “weather systems,” or high- and low-pressure areas. These weather systems create swirling currents, called eddies, that are the ocean equivalent of atmospheric storms. While storms in the atmosphere span hundreds of kilometers, eddies in the ocean cover only a few tens of kilometers. When numerical models, such as those run by meeting participant Andy Hogg (Australian National University), capture these smaller scales, the simulations explode with previously unseen dynamics and produce an energetic circulation that is more vigorous than that seen in models that only simulate larger scales.
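A back-of-the-envelope calculation (assumed numbers, purely illustrative) shows why eddy-resolving simulations are expensive: capturing a 30-kilometer eddy with roughly ten grid points requires about 3-kilometer spacing, which multiplies the horizontal cell count enormously relative to a coarse climate-model grid:

```python
def horizontal_cells(domain_km=40_000.0, spacing_km=3.0):
    """Very rough count of horizontal grid cells for a global model,
    treating the domain as a 40,000 km square of uniform square cells
    (land/ocean masking and latitude effects ignored)."""
    n = domain_km / spacing_km
    return int(n * n)

coarse = horizontal_cells(spacing_km=100.0)  # typical coarse climate grid
eddy = horizontal_cells(spacing_km=3.0)      # eddy-resolving grid
print(f"{eddy / coarse:.0f}x more cells")    # ~1111x
```

Finer grids also force shorter time steps, so the true cost grows even faster than the cell count alone suggests.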

This image of Chatham Island, off the coast of New Zealand, was taken by MODIS. The blue wispy pattern (upper right) is a phytoplankton bloom that is being stretched and stirred by ocean eddies. Images like this one verify that high-resolution numerical models accurately reproduce oceanic motions and provide insight into how these small-scale currents influence Southern Ocean ecosystems.

Heat Input

Credit: Courtesy of Whit Anderson/The Geophysical Fluid Dynamics Lab in Princeton, NJ

Increasing carbon dioxide concentrations in the atmosphere warm the planet, with roughly 90 percent of the extra energy going into the oceans. The ocean warming that results is not uniform around the globe. Numerical models from the group of meeting participant John Marshall (MIT) suggest that the warming of the Southern Ocean will occur later than that of other oceans. The reason? The Southern Ocean provides a gateway where cold, dense waters, stored in the deep ocean, are brought up to the surface by the ocean circulation and are exposed to the atmosphere. These cold waters have the potential to store a large amount of heat. Understanding when this reservoir will be exhausted is critical to predicting future Southern Ocean temperature changes.

In this sea-surface temperature map created by a NOAA Geophysical Fluid Dynamics Laboratory model, Southern Ocean waters (green and blue) represent regions where cold water rises up to the surface, warms, and moves northward.

The Distribution of Sea Ice

Credit: Hannah Joy-Warren, Stanford graduate student, taken during the Phantastic II cruise to the west Antarctic Peninsula (October/November 2014).

The distribution of sea ice in the Southern Ocean is important for many reasons. For instance, sea ice can act as a cap on the ocean, limiting atmospheric interactions with the ocean surface that may trap carbon in the deep ocean. Recently, Caltech researchers including Thompson and Jess Adkins, professor of geochemistry and global environmental science, discovered a link between the distribution of sea ice in the Southern Ocean and differences in the ocean circulation in our present climate and at the Last Glacial Maximum.

As sea ice retreats, additional melting can be a source of iron to the ocean, influencing phytoplankton growth. The capacity for plankton and other organisms to survive the Antarctic winter is only just beginning to be understood, as explained in a recent review article on sea ice ecosystems by meeting participant Kevin Arrigo (Stanford University). Future under-ice observations are needed to improve our ability to estimate ecosystem changes in polar regions.


Tracking Photosynthesis from Space

By photosearth / May 25, 2015

News Writer: 
Jessica Stoller-Conrad

Artistic representation of an OCO-2 orbit track, covering vegetated areas and measuring Solar Induced Fluorescence (SIF). OCO-2 will enable measurements within a narrow swath with sufficient accuracy for 5 km x 5 km spatial scales—an unprecedented resolution for SIF measurements from space. The measurement technique itself is also robust against occurrences of atmospheric aerosols and thin clouds, which typically affect purely optical-based remote sensing techniques.
Credit: NASA/JPL-Caltech and adapted by Lance Hayashida/Caltech

Watching plants perform photosynthesis from space sounds like a futuristic proposal, but a new application of data from NASA’s Orbiting Carbon Observatory-2 (OCO-2) satellite may enable scientists to do just that. The new technique, which allows researchers to analyze plant productivity from far above Earth, will provide a clearer picture of the global carbon cycle and may one day help researchers determine the best regional farming practices and even spot early signs of drought.

When plants are alive and healthy, they engage in photosynthesis, absorbing sunlight and carbon dioxide to produce food for the plant, and generating oxygen as a by-product. But photosynthesis does more than keep plants alive. On a global scale, the process takes up some of the man-made emissions of atmospheric carbon dioxide—a greenhouse gas that traps the sun’s heat down on Earth—meaning that plants also have an important role in mitigating climate change.

To perform photosynthesis, the chlorophyll in leaves absorbs sunlight—most of which is used to create food for the plants or is lost as heat. However, a small fraction of that absorbed light is reemitted as near-infrared light. We cannot see in the near-infrared portion of the spectrum with the naked eye, but if we could, this reemitted light would make the plants appear to glow—a property called solar induced fluorescence (SIF). Because this reemitted light is only produced when the chlorophyll in plants is also absorbing sunlight for photosynthesis, SIF can be used as a way to determine a plant’s photosynthetic activity and productivity.

“The intensity of the SIF appears to be very correlated with the total productivity of the plant,” says JPL scientist Christian Frankenberg, who is lead for the SIF product and will join the Caltech faculty in September as an associate professor of environmental science and engineering in the Division of Geological and Planetary Sciences.

Usually, when researchers try to estimate photosynthetic activity from satellites, they utilize a measure called the greenness index, which uses reflections in the near-infrared spectrum of light to determine the amount of chlorophyll in the plant. However, this is not a direct measurement of plant productivity; a plant that contains chlorophyll is not necessarily undergoing photosynthesis. “For example,” Frankenberg says, “evergreen trees are green in the winter even when they are dormant.”

He adds, “When a plant starts to undergo stress situations, like in California during a summer day when it’s getting very hot and dry, the plants still have chlorophyll”—chlorophyll that would still appear to be active in the greenness index—”but they usually close the tiny pores in their leaves to reduce water loss, and that time of stress is also when SIF is reduced. So photosynthesis is being very strongly reduced at the same time that the fluorescence signal is also getting weaker, albeit at a smaller rate.”

The Caltech and JPL team, as well as colleagues from NASA Goddard, discovered that they could measure SIF from orbit using spectrometers—standard instruments that can detect light intensity—that are already on board satellites like Japan’s Greenhouse Gases Observing Satellite (GOSAT) and NASA’s OCO-2.

In 2014, using this new technique with data from GOSAT and the European Global Ozone Monitoring Experiment–2 satellite, the researchers scoured the globe for the most productive plants and determined that the U.S. “Corn Belt”—the farming region stretching from Ohio to Nebraska—is the most photosynthetically active place on the planet. Although it stands to reason that a cornfield during growing season would be actively undergoing photosynthesis, the high-resolution measurements from a satellite enabled global comparison to other plant-heavy regions—such as tropical rainforests.

“Before, when people used the greenness index to represent active photosynthesis, they had trouble determining the productivity of very dense plant areas, such as forests or cornfields. With enough green plant material in the field of view, these greenness indexes can saturate; they reach a maximum value they can’t exceed,” Frankenberg says. Because of the sensitivity of the SIF measurements, researchers can now compare the true productivity of fields from different regions without this saturation—information that could potentially be used to compare the efficiency of farming practices around the world.
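Greenness indexes such as NDVI, the most widely used, are simple band ratios of reflected near-infrared and red light. The toy reflectance values below (assumed, not from the article) show the saturation Frankenberg describes: once a canopy is dense, even large gains in near-infrared reflectance barely move the index.

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index, a standard greenness
    index computed from near-infrared and red reflectances (0-1)."""
    return (nir - red) / (nir + red)

# Denser vegetation reflects more NIR and absorbs more red, but the
# normalized ratio flattens out near its maximum of 1.
sparse = ndvi(nir=0.3, red=0.1)    # ~0.50
dense = ndvi(nir=0.6, red=0.04)    # ~0.88
denser = ndvi(nir=0.9, red=0.03)   # ~0.94, nearly unchanged
```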

Now that OCO-2 is online and producing data, Frankenberg says that it is capable of achieving higher resolution than the preliminary experiments with GOSAT. Therefore, OCO-2 will be able to provide an even clearer picture of plant productivity worldwide. However, to get more specific information about how plants influence the global carbon cycle, an evenly distributed ground-based network of spectrometers will be needed. Such a network—located down among the plants rather than miles above—will provide more information about regional uptake of carbon dioxide via photosynthesis and the mechanistic link between SIF and actual carbon exchange.

One existing network, called FLUXNET, uses ground-based towers to measure the exchange of carbon dioxide, or carbon flux, between the land and the atmosphere from towers at more than 600 locations worldwide. However, the towers only measure the exchange of carbon dioxide and are unable to directly observe the activities of the biosphere that drive this exchange.

The new ground-based measurements will ideally take place at existing FLUXNET sites, but they will be performed with a small set of high-resolution spectrometers—similar to the kind that OCO-2 uses—to allow the researchers to use the same measurement principles they developed for space. The revamped ground network was initially proposed in a 2012 workshop at the Keck Institute for Space Studies and is expected to go online sometime in the next two years.

In the future, a clear picture of global plant productivity could influence a range of decisions relevant to farmers, commodity traders, and policymakers. “Right now, the SIF data we can gather from space is too coarse of a picture to be really helpful for these conversations, but, in principle, with the satellite and ground-based measurements you could track the fluorescence in fields at different times of day,” he says. This hourly tracking would not only allow researchers to detect the productivity of the plants, but it could also spot the first signs of plant stress—a factor that impacts crop prices and food security around the world.

“The measurements of SIF from OCO-2 greatly extend the science of this mission,” says Paul Wennberg, R. Stanton Avery Professor of Atmospheric Chemistry and Environmental Science and Engineering, director of the Ronald and Maxine Linde Center for Global Environmental Science, and a member of the OCO-2 science team. “OCO-2 was designed to map carbon dioxide, and scientists plan to use these measurements to determine the underlying sources and sinks of this important gas. The new SIF measurements will allow us to diagnose the efficiency of the plants—a key component of the sinks of carbon dioxide.”

By using OCO-2 to diagnose plant activity around the globe, this new research could also contribute to understanding the variability in crop primary productivity and also, eventually, the development of technologies that can improve crop efficiency—a goal that could greatly benefit humankind, Frankenberg says.

This project is funded by the Keck Institute for Space Studies and JPL. Wennberg is also an executive officer for the Environmental Science and Engineering (ESE) program. ESE is a joint program of the Division of Engineering and Applied Science, the Division of Chemistry and Chemical Engineering, and the Division of Geological and Planetary Sciences.
