December 2000

Berkeley Lab Buys 160-Processor Cluster Computer to Advance Scientific Computing and Research

In what could be a glimpse into the future of high-performance computing, Berkeley Lab Computing Sciences will buy and operate a 160-processor cluster computer to assess whether such a system can meet the day-to-day production demands of a scientific computing center.

Clusters are assemblies of commodity computers designed and networked to operate as a single system. By using off-the-shelf components, clusters can strike a cost-effective balance between price and performance. To date, however, most large scientific commodity clusters have been used for specific research applications rather than as general-purpose production resources for computational science.
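For readers curious what "operating as a single system" looks like from the programmer's side, the sketch below uses MPI, the message-passing library commonly used to coordinate the nodes of such clusters. It is an illustration only, not code from the new Berkeley Lab system; the file name and launch commands are hypothetical.

/* A minimal MPI sketch: each commodity node runs its own copy of this
 * program, and message passing lets the separate machines cooperate as
 * one system. Compile with an MPI wrapper (e.g. "mpicc sum.c") and
 * launch one process per processor (e.g. "mpirun -np 160 ./a.out"). */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size;
    double local, total;

    MPI_Init(&argc, &argv);                /* join the parallel job       */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* this process's id           */
    MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of processes   */

    local = (double)(rank + 1);            /* each node's partial result  */

    /* Combine every node's partial result into one answer on rank 0. */
    MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("Sum over %d processors: %g\n", size, total);

    MPI_Finalize();
    return 0;
}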

"There's been a lot of hype about clusters over the past few years in the world of high-performance computing, but clusters are just now coming into their own," said NERSC Division Director Horst Simon. "Our goal with this new system is to see how well a commercially built cluster can perform in a demanding scientific environment. The cluster architecture could well be the supercomputer of the future and it behooves us to carefully and thoroughly test such a system."

