ALBUQUERQUE, N.M. — Four hours, 17 minutes, 46 seconds – less time than it takes most Americans to figure their taxes.
That’s how long Sandia National Laboratories’ 1,840-node Intel Paragon supercomputer, one of the world’s fastest computers, recently worked on a three-dimensional data set with millions of unknowns to answer an oozy problem: how far and fast, and in which direction, liquid waste migrates in a particular subterranean environment.
Sandia researchers David Alumbaugh and Greg Newman have been working on the complex problem of three-dimensional underground imaging based on subsurface electrical properties for almost two years. They say the answers the Paragon is providing may eventually lead to 3-D computer images that would help environmental remediators clean up underground contamination, hydrologists explore aquifers, mining companies map boundaries of mineral deposits, and petroleum companies site wells for maximum oil extraction.
“In the past, we’ve been confined to stacking a series of two-dimensional analyses on top of one another to get a three-dimensional picture of an underground environment, like a CAT scan,” says Alumbaugh. “With the supercomputer, we’ve shown we can take a data set with millions of unknowns and convert it to a useful three-dimensional model that tells us what’s going on down there . . . It’s more precise than a 2-D model.”
A speedy worker
The Paragon is such a fast worker because, unlike a typical desktop workstation that works through a problem one step at a time, it breaks a huge computational problem into thousands of smaller tasks and works on all of them simultaneously across its more than 1,800 processors. The technique is called massively parallel computing or, more specifically, multiple instruction, multiple data (MIMD) computing.
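The article doesn’t show the Paragon code itself, but the divide-and-conquer idea can be sketched in a few lines. The snippet below is a minimal illustration, assuming an MPI-style message-passing library (mpi4py here); the cell count comes from the mesh described later in the article, while the stand-in computation and the script name in the run command are placeholders.

```python
# A minimal sketch of the MIMD idea using MPI. The article does not say which
# message-passing library the Paragon codes used; mpi4py here is an assumption
# chosen for illustration. Each process ("rank") receives its own slice of a
# large set of cells and works on it at the same time as every other rank.
# Run with, e.g.:  mpirun -n 4 python mimd_sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()          # this process's ID
size = comm.Get_size()          # total number of processes

N_CELLS = 80_000                # total work items (cells in the article's mesh)
# Split the cell indices as evenly as possible across all processes.
my_cells = np.array_split(np.arange(N_CELLS), size)[rank]

# Each rank executes its own instructions on its own data (MIMD);
# here, a stand-in computation on the local block of cells.
local_result = np.sum(np.sqrt(my_cells + 1.0))

# Combine the partial answers into one global answer on rank 0.
total = comm.reduce(local_result, op=MPI.SUM, root=0)
if rank == 0:
    print(f"{size} processes each handled ~{len(my_cells)} cells; total = {total:.3f}")
```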
Alumbaugh helped gather the three-dimensional data for the experiment four years ago as part of his Ph.D. thesis while at the University of California at Berkeley. Five 60-meter-deep wells were drilled at a site at UC’s Richmond Field Station, one at each of four corners of a roughly 45-meter square and one at the center. Then, through the center well, 50,000 gallons of salt water were injected into a gravel aquifer 30 meters below the surface to simulate a liquid waste plume.
As a magnetic dipole source was lowered into the center well (emitting an 18-kilohertz sinusoidal electromagnetic wave), readings of the magnetic field’s strength and direction were taken at various depths in the other four wells. Because the salt-water plume’s electrical conductivity differed from that of the surrounding aquifer, the readings contained information about how far and in what direction the plume had migrated from the center well, as well as about the aquifer’s porosity.
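One reason a conductivity contrast shows up in the crosswell readings is that the 18-kilohertz field decays faster in more conductive ground. The short calculation below estimates the electromagnetic skin depth for two illustrative conductivities; the numerical values are assumptions for the sketch, not figures from the experiment.

```python
# Back-of-the-envelope skin-depth estimate for the 18 kHz crosswell signal.
# The conductivity values below are illustrative assumptions, not numbers
# from the article; they simply show why a salt-water plume (more conductive)
# looks different to the receivers than the surrounding aquifer does.
import math

MU0 = 4.0e-7 * math.pi          # magnetic permeability of free space (H/m)
FREQ = 18_000.0                 # source frequency from the article (Hz)
omega = 2.0 * math.pi * FREQ    # angular frequency (rad/s)

def skin_depth(sigma_s_per_m: float) -> float:
    """Depth at which a plane EM wave decays by 1/e in a uniform conductor."""
    return math.sqrt(2.0 / (MU0 * sigma_s_per_m * omega))

for label, sigma in [("background aquifer (assumed 0.02 S/m)", 0.02),
                     ("salt-water plume (assumed 0.2 S/m)", 0.2)]:
    print(f"{label}: skin depth ~ {skin_depth(sigma):.1f} m")
```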
Similar readings were taken before and after the injection. Each set of readings provided some 924 data points. (The Department of Energy and a consortium of oil and mining companies funded the data-collection project.)
But the recent Sandia computations were more an experiment in computer modeling than an environmental remediation exercise, says Alumbaugh. The Berkeley data were used because “it was the only good 3-D data set available,” he says.
Newman and Alumbaugh spent the first year of the two-year project modifying existing modeling codes for massively parallel computing. The second year was spent preparing imaging codes for the three-dimensional data set. After all that preparation, arriving at an answer on the Paragon took only a few hours.
Cell groups assigned to processors
To make the data set more palatable to the Paragon, the researchers divided the underground volume into thousands of cubic cells, each cell a 2-meter (6 1/2-foot) cube. The result was a cell mesh 40 cells wide, 40 cells long, and 50 cells deep. The mesh was then divided into groups of cells, each of which was assigned its own processor.
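The article doesn’t say exactly how the mesh was carved up among processors; the sketch below assumes one simple scheme, slicing the 40 x 40 x 50 mesh into slabs of depth layers and handing each slab to a processor. The processor count is illustrative, not a detail from the article.

```python
# A sketch of one way to carve the 40 x 40 x 50 cell mesh into groups and
# assign each group to a processor. The exact partitioning scheme is not
# described in the article; slicing along the depth axis is an assumption
# made here to keep the example simple.
import numpy as np

NX, NY, NZ = 40, 40, 50         # mesh dimensions from the article (80,000 cells)
N_PROCS = 25                    # illustrative processor count (not from the article)

# Give each processor a contiguous slab of depth layers.
layer_blocks = np.array_split(np.arange(NZ), N_PROCS)

assignments = {}
for proc, layers in enumerate(layer_blocks):
    # Every (x, y, z) cell whose z falls in this slab belongs to this processor.
    assignments[proc] = [(x, y, z)
                         for z in layers
                         for y in range(NY)
                         for x in range(NX)]

cells_per_proc = {p: len(c) for p, c in assignments.items()}
print(f"total cells: {sum(cells_per_proc.values())}")   # 80,000
print(f"cells on processor 0: {cells_per_proc[0]}")     # one slab's worth
```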
Next, the researchers set out to develop ways of reading and writing data to and from the processors and across processor boundaries. “Data from each cell interacts with data from all the cell cubes adjacent to it – along its six faces and 12 edges,” Newman explains. “For each of the 80,000 cells, we had to define how and when that cell communicated with its neighbors on different processors.”
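One common way to organize that neighbor communication is a ghost-cell (halo) exchange, in which each processor keeps a copy of the boundary layer owned by its neighbors and refreshes it at each step. The sketch below assumes that approach and, for brevity, exchanges data only along the depth axis rather than across all six faces and twelve edges; mpi4py, the layer sizes, and the script name are assumptions, not details from the article.

```python
# A sketch of how neighboring processors might swap boundary ("ghost") layers
# so each one can see the cells adjacent to its own slab. This ghost-cell
# exchange is a standard technique and an assumption about how the
# communication was organized; the article says only that cells had to
# communicate with neighbors held on other processors.
# Run with, e.g.:  mpirun -n 4 python halo_sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

NX, NY = 40, 40                 # horizontal mesh dimensions from the article
LOCAL_NZ = 2                    # depth layers owned by this rank (illustrative)

# Local field values, padded with one ghost layer above and below the slab.
field = np.zeros((LOCAL_NZ + 2, NX, NY))
field[1:-1] = rank              # fill owned layers with a recognizable value

up = rank - 1 if rank > 0 else MPI.PROC_NULL            # neighbor above
down = rank + 1 if rank < size - 1 else MPI.PROC_NULL   # neighbor below

# Send my top owned layer up; receive the upper neighbor's layer into my top ghost.
comm.Sendrecv(field[1], dest=up, recvbuf=field[0], source=up)
# Send my bottom owned layer down; receive the lower neighbor's layer into my bottom ghost.
comm.Sendrecv(field[-2], dest=down, recvbuf=field[-1], source=down)

if rank == 0 and size > 1:
    # The bottom ghost layer now holds rank 1's boundary values (all equal to 1).
    print("rank 0 bottom ghost layer value:", field[-1, 0, 0])
```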
The result of two years of preparation and less than five hours of computing was three pairs of 3-D color images — one pair illustrating the aquifer’s electrical conductivity before salt-water injection, one pair showing it after injection, and the third pair highlighting the before-and-after conductivity differences.
Alumbaugh says the complexity level of 3-D underground waste migration modeling is “right up there” with modeling fluid dynamics, the “climbing Mt. Everest” of computing.
The modeling effort paid off in several ways. “We found that the hydrologists who had analyzed the Berkeley data were 90 degrees off about which direction the plume was migrating,” Alumbaugh says.
Hydrologists had assumed an eastward flow based on limited well measurements, but the Paragon images clearly indicated a northward flow, probably owing to a complicated aquifer permeability structure.
Oil, minerals, and water
The project’s success also may help alter researchers’ perceptions about how they gather underground waste migration data. “The more data points you have, ultimately the better information you get,” Alumbaugh says. “But even a site tens of meters on a side quickly becomes a huge problem. We want to help in the design of future geological surveys so the best data sets can be collected for 3-D modeling applications.”
He says Sandia has helped Lawrence Livermore National Laboratory with an oil field survey and is planning to use the data-interpretation method at a simulated waste site at Idaho National Engineering Laboratory.
Ultimately, they hope the project and others like it will be useful for measuring high permeability zones (to aid in oil recovery operations and hydrology studies), as well as for mapping mineral deposits (which tend to be more conductive than their host rock).
“This project really pushed the envelope on what we can do with the supercomputer,” Alumbaugh says.