Scientists have become so adept at constructing computer models of complex phenomena such as earthquakes, global climate and the human heart -- and the computers they use have become so powerful -- that they can run the models standing on their heads.
Computationally speaking, that is.
It is what is sometimes called "the inverse problem." Rather than using the computer model to predict the future behavior of a system, as is typically done, solving the inverse problem means taking the behavior of a system and then working backward to determine what led to that behavior.
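The forward-versus-inverse distinction can be sketched with a toy model. This is purely illustrative, not the researchers' actual method: the "forward" step predicts observations from a hidden parameter (here a made-up wave speed), and the inverse step searches for the parameter value that best reproduces the observations.

```python
# Toy illustration of a "forward" model vs. the inverse problem.
# Forward: given a hidden parameter (a hypothetical wave speed c),
# predict observations. Inverse: given observations, search for the
# c that reproduces them. All names and numbers are illustrative.

def forward(c, times):
    """Predict amplitudes at the given times for wave speed c."""
    return [1.0 / (1.0 + c * t) for t in times]

times = [0.1 * i for i in range(1, 11)]
true_c = 3.2
observed = forward(true_c, times)          # pretend these were measured

def misfit(c):
    """Sum of squared differences between prediction and observation."""
    predicted = forward(c, times)
    return sum((p - o) ** 2 for p, o in zip(predicted, observed))

# Inverse step: scan candidate speeds and keep the best fit.
candidates = [0.01 * k for k in range(1, 1000)]
best_c = min(candidates, key=misfit)
print(round(best_c, 2))   # recovers a value near the true speed, 3.2
```

Real inverse problems replace the brute-force scan with sophisticated optimization over millions of unknowns, which is where the enormous computational cost described below comes from.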
The approach already is being used in Europe to enhance the accuracy of multi-day weather forecasts. Homeland security researchers are investigating whether computer models can be used to trace back the source of a chemical weapons attack.
And researchers at Carnegie Mellon University and the Pittsburgh Supercomputing Center last month won the Gordon Bell Prize, a prestigious award for high-performance computing, for their work on earthquake simulations. Part of that work involved the inverse problem -- using the surface motion of earthquakes to determine subsurface geology.
"It's a tough problem," said Thomas Jordan, director of the Southern California Earthquake Center. "It's a very important problem." Not only can this inverse method tell scientists more about the geology of a region, but that geological data can subsequently be used in computer models to improve earthquake predictions.
What makes the inverse problem difficult is that it requires many times more calculations than a standard, "forward" model, as well as some additional computational tricks.
"It's not literally running the model backward," emphasized Chris Davis, an atmospheric scientist at the National Center for Atmospheric Research in Boulder, Colo. Though performing the meteorological version of the inverse problem might add "a half-day of forecasting skill" to a three-to-five day forecast, he noted, U.S. forecasters thus far have forgone this process because of the large amounts of computing time it requires.
"If computer resources were not an issue, everybody would be doing it," Davis said.
Ever since CMU and the supercomputing center began their earthquake modeling efforts a decade ago, performing the inverse problem "was kind of a dream," said Jacobo Bielak, professor of civil and environmental engineering. But it has taken longer than expected to make that dream reality, he acknowledged, because "we just weren't aware of how hard the forward problem would be."
Computer models attempt to translate physical phenomena into mathematical equations. Success depends not only on picking the right set of equations, but also on obtaining a large quantity of accurate information about the initial conditions of a system, be it a geologic basin or the atmosphere, and on using a computer big enough and fast enough to solve a mountain of equations within a reasonable amount of time.
Earthquake simulation was considered one of the Grand Challenges of scientific computing a decade ago by the National Science Foundation, which sponsored the early work in Pittsburgh. It was both an important problem and one that would strain the capabilities of the world's fastest computers.
In 1993, the researchers had access to machines capable of billions of calculations per second, recalled Omar Ghattas, a CMU engineer and one of the principals in the Quake Project. That was a good start, but was hardly adequate to the job. Only since the installation two years ago of LeMieux, the supercomputing center's 3,000-processor computer capable of trillions of calculations per second, is the computing capability close to what is necessary for meaningful simulations of Los Angeles earthquakes.
"You couldn't even think of the inverse problem in the past," he added.
The Quake Project has tried to find models that explain the great variation, often within a small area, of earthquake ground motion in the L.A. basin. Following the 1994 Northridge earthquake, for instance, collapsed buildings could be found within blocks of similar buildings that stood undamaged. Much of the variation appears to be caused by differences in soil types and the underlying geological structure.
Using a number of modeling tricks and the power of LeMieux, the group now is able to perform earthquake simulations of the L.A. basin measuring 100 kilometers square and 50 kilometers deep. The model requires assessing soil and rock conditions at 10 meter intervals throughout that space.
Even so, these simulations are relevant only to the performance of buildings five stories or taller. To gauge the effects of earthquakes on homes, apartment buildings and office buildings of fewer than five stories, the researchers would need to simulate seismic waves at shorter wavelengths. Not only would that require a computer 10 or 20 times faster than LeMieux, but it would require calculating ground conditions at intervals of just one meter.
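A back-of-envelope calculation shows why the jump from 10-meter to one-meter spacing is so punishing. This assumes a uniform grid over the basin dimensions given above; actual simulations use adaptive meshes that concentrate points where they are needed and are far smaller.

```python
# Back-of-envelope: problem size of the L.A. basin model at two
# resolutions, assuming a uniform grid (real meshes are adaptive).

def grid_points(extent_m, spacing_m):
    """Number of sample points in a box at a given grid spacing."""
    x, y, z = extent_m
    return (x // spacing_m) * (y // spacing_m) * (z // spacing_m)

basin = (100_000, 100_000, 50_000)   # 100 km x 100 km x 50 km, in meters

coarse = grid_points(basin, 10)      # 10-meter spacing
fine = grid_points(basin, 1)         # 1-meter spacing

print(f"{coarse:.1e}")   # 5.0e+11 points at 10 m
print(f"{fine:.1e}")     # 5.0e+14 points at 1 m -- 1,000 times more
```

The thousand-fold growth in grid points on a uniform mesh is why, even with adaptive tricks, a machine many times faster than LeMieux would be needed.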
Simply obtaining information about the soil and rocks at one meter intervals is a nearly impossible task, Ghattas said. But that's one reason to consider the inverse problem -- using ground motion to determine the geological conditions of the basin might be one way to obtain that data.
Jordan, of the Southern California Earthquake Center, said a number of research groups have produced earthquake models and have high confidence that the models do a good job of mimicking seismic conditions. But no one has ever been able to use a model to precisely duplicate the ground motion of the Northridge quake or other historic earthquakes. Incomplete knowledge of subsurface geology might be one reason why, he noted. A number of researchers besides the CMU team thus are pursuing the inverse problem because they hope solving it will ultimately improve the performance of all of their computer models, Jordan said.
Volkan Akcelik, a post-doctoral researcher at CMU and a member of the Quake Project, spent last summer at Sandia National Laboratories in Albuquerque, N.M., helping to apply these inverse techniques to the analysis of chemical terrorism.
At issue, he explained, is determining where airborne toxic chemicals might go if released in an urban area by terrorists. The hope is that by taking measurements at a number of locations, investigators could use the inverse technique to calculate the original location and concentration of the chemical. With that information in hand, Akcelik explained, it would be easier to calculate where the chemical might be spread and which areas of a city might need to be evacuated. The inverse techniques seem to work, he said -- provided that air current patterns in the area are known. Air currents ultimately might also be calculated using the inverse techniques.
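The source-tracing idea can be sketched in miniature. This is a hypothetical toy, not the Sandia method: it assumes a known, constant wind and an idealized Gaussian plume, then grid-searches for the source location and release amount that best match the sensor readings.

```python
import math

# Toy source inversion for an airborne release. Assumes a known wind
# and a simple Gaussian plume; real atmospheric models are far more
# complex. All names and numbers are illustrative.

def concentration(source_x, amount, sensor_x, wind_shift=5.0, spread=2.0):
    """Hypothetical plume: a Gaussian centered downwind of the source."""
    center = source_x + wind_shift
    return amount * math.exp(-((sensor_x - center) ** 2) / (2 * spread ** 2))

sensors = [0.0, 4.0, 8.0, 12.0, 16.0]
true_source, true_amount = 3.0, 10.0
readings = [concentration(true_source, true_amount, s) for s in sensors]

def misfit(source_x, amount):
    """Squared mismatch between predicted and measured concentrations."""
    return sum((concentration(source_x, amount, s) - r) ** 2
               for s, r in zip(sensors, readings))

# Inverse step: grid search over candidate locations and amounts.
best = min(((x * 0.5, a * 0.5) for x in range(0, 21) for a in range(1, 41)),
           key=lambda p: misfit(*p))
print(best)   # recovers (source location, amount) of (3.0, 10.0)
```

As Akcelik notes, the whole scheme depends on knowing the air currents; if the wind field itself is uncertain, it too becomes an unknown in the inversion, making the problem much harder.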
"We need to do more work on this," he added.
First Published: December 15, 2003, 5:00 a.m.