

Metacomputing: A Reality Check


Metacomputing across the continent sounds great--on paper. But could it really work? That's one of the questions the organizing committee of Supercomputing '95 was asking. Supercomputing '95, the eighth annual conference for the presentation and discussion of research in high-performance computing and communications, took place December 3-8, 1995, in San Diego.

At this major annual event, a Global Information Infrastructure (GII) testbed offered scientists a chance to transmit simulation data, remotely computed in their own numerical laboratories, over the I-WAY, a very high speed network formed by the cooperative interconnection of multiple national ATM networks. The I-WAY linked dozens of the country's most powerful computers and advanced virtual environments to onsite computational resources in San Diego. These onsite resources included virtual environments that, in some cases, allowed viewers to interact with precomputed data, control simulations as they ran on remote supercomputers, or even steer remote instruments.

Here's a sampling of NCSA projects that were demonstrated at Supercomputing '95:


Whispers from the Cosmos: Radio Astronomy Synthesis Imaging

Steps towards Realtime Astronomy

NCSA and UIUC astronomer Richard Crutcher and his colleagues attempted to show how powerful distributed computational resources will drive realtime astronomy in the coming decades. Their goal was to send radio signal data (gathered from a radio telescope array at Hat Creek, California) via the I-WAY across two-thirds of the country to NCSA for processing on the SGI Power Challenge Array. The processed data would then be transmitted back over the network to San Diego, where the scientists, using large-screen projections on the "Wall," were to visualize and control the transformation of the data into images.


Spacetime Wrinkles: Distributing Spacetime

Computing and Visualizing Einstein's Gravitational Waves

NCSA's Relativity Group is using metacomputing tools to solve the Einstein equations for strong gravitational fields. Solving the equations in three dimensions, which requires enormous amounts of computation, will help scientists understand what happens when two black holes collide, as well as predict the pattern of gravitational waves that would emerge. University of Illinois physicist Ed Seidel and his colleagues aimed to run the computation on a group of networked machines at a number of NSF supercomputer sites. The goal was to enable viewers in the CAVE at San Diego to see and interact with Einstein's gravitational waves in simulation.
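For reference, the Einstein equations mentioned here look deceptively compact; in geometrized units (G = c = 1) they can be written in the standard textbook form

    $G_{\mu\nu} \equiv R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} = 8\pi T_{\mu\nu}$

which amounts to ten coupled, nonlinear partial differential equations for the spacetime metric $g_{\mu\nu}$, hence the enormous computational demands. This is the general covariant form only; the particular space-plus-time formulation the Relativity Group actually evolves on its supercomputers is not spelled out in this exhibit.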


Cosmos in a Computer: The Virtual Cosmos

Simulating and Visualizing Virtual Universes Using Distributed Multiscale Algorithms

Simulating the formation of large-scale structure in the universe is a problem requiring enormous computing power. Members of the Grand Challenge Cosmology Consortium (GC3) used the combined computational resources of a number of supercomputing centers, then transmitted the data to San Diego and displayed their "virtual" universes in the onsite CAVE and on the Wall. The goal was to permit viewers to interactively fly through galaxies 10,000 light-years across, or even galaxy superclusters (100 million light-years in diameter).


Experimentalist's Virtual Acquisition Console

Exploring Inner Space -- at a Distance

Researchers from the Beckman Institute at UIUC and NCSA's Biological Imaging Group used the I-WAY to control an electron microscope, an MRI imaging spectrometer, and a scanning probe microscope, all located at the Beckman Institute at the University of Illinois. From two thousand miles away, working within the virtual environment of the CAVE at Supercomputing '95, they managed to interact with and control their experiments.


Prototyping in Parallel

Aside from the emerging I-WAY, there are a number of other testbed projects designed to examine and further develop the technologies of metacomputing. For example:

Joint NSF-NASA Initiative on Evaluation (JNNIE)

The National Science Foundation (NSF) and the National Aeronautics and Space Administration (NASA) are teaming up to evaluate the effectiveness of a variety of scalable parallel computing systems under realistic scientific workloads. NASA-supported R&D activities tend to be more narrowly defined and mission-oriented, while NSF sponsors a broader spectrum of scientific and computational projects. According to the rationale of the joint initiative, if scientists from both agencies tend to use scalable parallel computing systems in much the same way, the case will be strengthened for the development of truly "general purpose" high-performance computing systems, i.e., technologies that can be used across most applications in science and engineering.

University of Virginia's Legion Project

In this testbed initiative, computer scientists aim to assemble a campus-wide "virtual computer," a prototype of a metacomputer that will draw upon local, regional, and national networks to connect workstations and both vector and parallel supercomputers. The Legion project was also demonstrated at Supercomputing '95.

Forward to the MetaCenter
Return to What Is a Metacomputer

Exhibit Map
Glossary
Information Center

Copyright © 1995 Board of Trustees, University of Illinois


NCSA. Last modified 10/19/95.