There are a number of science, engineering and social and economic problems whose solutions will require more powerful computers, better networks and storage solutions, and more advanced software than is currently available. These are the problems driving metacomputer development.
They include predicting the weather, analyzing fuel combustion, modeling the oceans, rationally designing drugs, and determining the origin of structure in the universe. Tackling many of these problems requires computer simulation, which has joined observation and theory as one of the fundamental paradigms of modern science.
Simulation requires that a mathematical model of physical phenomena be translated into a program that instructs the computer how to carry out calculations based on input data. Finding accurate solutions in a reasonable amount of time (i.e., hours) to very large and complex problems typically demands teraflop (trillion floating point operations per second) computer performance and 100-gigabyte memories--something that no single supercomputer can achieve today.
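To make the model-to-program translation concrete, here is a toy sketch (not from the original text): the one-dimensional heat equation, a classic physical model, turned into a program via a simple explicit finite-difference scheme. The grid size, diffusion constant, and step count are arbitrary illustrative choices.

```python
# Toy sketch: a mathematical model (the 1-D heat equation, du/dt = alpha * d2u/dx2)
# translated into a program that steps through calculations on input data.
# All constants here are arbitrary; real Grand Challenge simulations use
# three-dimensional grids with millions of points.

def simulate_heat(u, alpha=0.1, steps=100):
    """Advance a 1-D temperature profile u using explicit finite differences."""
    u = list(u)
    for _ in range(steps):
        nxt = u[:]
        for i in range(1, len(u) - 1):
            # Discrete form of the heat equation at interior grid point i.
            nxt[i] = u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
        u = nxt
    return u

# Input data: a hot spike in the middle of a cold rod.
profile = [0.0] * 21
profile[10] = 100.0
result = simulate_heat(profile)  # the spike diffuses outward over time
```

Even this miniature example hints at the cost of real simulations: each time step touches every grid point, so refining the grid or extending the run multiplies the arithmetic, which is why full-scale models demand teraflop machines.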
Combining the power of several high performance computers offers one path to achieving teraflop performance. But doing so will require fast networking systems that not only permit the individual computers to "talk" to each other at matching speeds, but also support real-time interaction between the researcher and the data being computed.
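The core idea behind coupling machines can be sketched in miniature (an assumption-laden illustration, not the actual metacomputing software): divide a large computation into pieces, run the pieces concurrently, and combine the partial results. Here hypothetical worker processes on a single machine stand in for separate supercomputers linked by a network.

```python
# Minimal sketch of divide-and-combine parallelism. In a real metacomputer the
# workers would be distinct supercomputers exchanging messages over a fast
# network; here they are local processes created with the standard library.
from multiprocessing import Pool

def partial_sum(bounds):
    """One worker's share of the computation: sum of squares over a subrange."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    """Split [0, n) into chunks, compute partial sums concurrently, combine."""
    step = n // workers
    chunks = [(k * step, n if k == workers - 1 else (k + 1) * step)
              for k in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    total = parallel_sum_of_squares(100_000)
```

The hard part the sketch hides is exactly what the text describes: when the workers are separate machines, the speed at which they can exchange partial results over the network limits the speedup.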
For example, researchers who use numerical processing to remotely "steer" instruments such as telescopes or microscopes will demand gigabit or even terabit per second transmission rates--something still beyond the capabilities of today's networks.
Archiving the enormous data sets typically generated by Grand Challenge applications presents another problem. Storage devices are needed that can accept and output terabyte volumes of data at high transfer rates directly to or from the network; ideally, the data, perhaps distributed among a number of different storage sites, should appear to researchers as a uniform, easy-to-access set of files.
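Some back-of-the-envelope arithmetic (my illustration, with round-number rates not taken from the text) shows why terabyte archives strain networks:

```python
# How long does it take to move a terabyte over a network link?
# Rates below are illustrative round numbers, not figures from the text.

def transfer_hours(num_bytes, bits_per_second):
    """Hours needed to move num_bytes over a link of the given bit rate."""
    return (num_bytes * 8) / bits_per_second / 3600

TERABYTE = 10**12  # bytes

slow = transfer_hours(TERABYTE, 10 * 10**6)  # 10 megabit/s link: ~222 hours
fast = transfer_hours(TERABYTE, 10**9)       # 1 gigabit/s link:  ~2.2 hours
```

At tens of megabits per second a single terabyte data set ties up a link for more than a week; even at a full gigabit per second it takes over two hours, which is why both faster networks and storage devices that can feed them are needed.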
A group of radio astronomers is developing innovative computational tools and techniques to support collaborative observations on faraway telescopes, as well as rapid image formation and data analysis.
Numerical cosmologists are using supercomputers to simulate the origin of structure in the universe. Using virtual environments, they navigate, analyze, and interact with the resulting data.
Researchers at the University of Illinois use high performance computers to simulate the genesis and evolution of thunderstorms that spawn tornadoes. They are also employing virtual reality systems to analyze and interact with huge data sets stored at several sites around the country.
The Federal Government's High Performance Computing and Communications Program supports research into numerous other Grand Challenges: for example, aircraft design, environmental modeling and prediction, molecular biology and biomedical imaging, and space science.
The "national information superhighway," now somewhat of a tired cliché, is more properly known as a National Information Infrastructure (NII). The HPCC program is helping to develop much of the technology underlying the NII, in order to address National Challenges--fundamental applications that are information-intensive, and have broad and direct impact on the nation's competitiveness and the well-being of its citizens.
National Challenges aim to improve health care delivery, civil infrastructure, national security, design and manufacturing, efficiency among small and medium-sized businesses, environmental monitoring, education, training and lifelong learning, and access to information.
Tackling National Challenges, like Grand Challenges, will require integrated computational, network, storage, software, and virtual environment resources--in other words, the technologies of metacomputing.
Several National Challenge projects are already underway. Physicians at the National Jewish Center for Immunology and Respiratory Medicine in Denver, Colorado, in collaboration with researchers at the Los Alamos National Laboratory, have developed a telemedicine system based on a radiographic repository at Los Alamos. The application allows physicians from across the country to view radiographic data via a sophisticated multimedia interface, match a patient's radiographic information with the data in the repository, review treatment history and success, and determine the best treatment for the patient.
Other scientists are using virtual reality to literally--in this case--immerse themselves in their data. Researchers at Old Dominion University in Norfolk, Virginia, in collaboration with NCSA, have created a Chesapeake Bay Virtual Ecosystem, in an attempt to understand the complex interactions among Chesapeake Bay currents, seasonal changes in salinity, sealife, and a host of other factors. Their work benefits a complex, highly productive ecosystem that also happens to support important but steadily dwindling commercial species like the blue crab.