In 1992, the four NSF-supported high-performance computing centers -- the San Diego Supercomputer Center (SDSC), the Pittsburgh Supercomputing Center (PSC), the Cornell Theory Center (CTC), and the National Center for Supercomputing Applications (NCSA) -- together with the National Center for Atmospheric Research (NCAR), joined in a collaboration called the National MetaCenter. Other regional and state-funded supercomputing centers have since affiliated with the MetaCenter, further expanding its diversity and resources.
The raison d'être of the MetaCenter is to contribute to the development of a "National Computational Environment" that will support and strengthen the science and engineering research base across the entire country. The infrastructure provided by the MetaCenter gives researchers access to advanced technologies and intellectual resources. In particular, the MetaCenter will develop capabilities that enable scientists and engineers to move portions of their problems directly to appropriate computer architectures, without regard for where the computers are located. In other words, metacomputing.
But there is no single approach to metacomputing, no single architecture that's best for every application. Vector machines, massively parallel systems, and scalable parallel workstation clusters, made by almost every major U.S. computer vendor, are all represented in the MetaCenter. NCSA's local metacomputer is based on a scalable microprocessor strategy. Other centers, such as the PSC, are developing heterogeneous metacomputers that rely heavily on software to link machines with different operating systems, processors, and memory systems.
[Image: The National Machine Room]
In the narrowest sense, the MetaCenter's National Computational Environment offers a diverse collection of machines operating at each center and connected by high-speed networks, in particular the vBNS (very high-speed Backbone Network Service). Fast networks, together with AFS (the Andrew File System, which provides a common shared file space across sites), will eventually enable researchers to compute, visualize, and archive their data using geographically dispersed resources as a matter of routine.
Supercomputing '95 will provide a "reality check" for this scenario, as well as offer glimpses of the future of metacomputing -- on a national scale.