For an application to run simultaneously on different machines, each machine needs access to a common shared file system for reading and writing data. Smaller files, such as application source code, can be stored in AFS (the Andrew File System), a distributed network file system developed at Carnegie Mellon University under IBM sponsorship.
Because files on remote AFS servers appear local at any site running AFS, the system simplifies collaborative research between geographically distant sites. At NCSA, all of the major high performance machines share common file space via AFS. AFS is now in place at four of the NSF-funded high performance computing centers, allowing a user working at one AFS site to access his or her data at another AFS site without running time-consuming manual file transfers. For example, several Grand Challenge teams have used AFS to do their work at both NCSA and PSC.