Supercomputers: The Cray that created the world
Recreating the history of the universe is no mean feat, even with a 672-processor supercomputer. Tony Harrington reports
An international group of cosmologists, known as the Virgo Consortium, made computing history in April this year by using a Cray T3E 672-processor machine to simulate the evolution of the universe.
The simulation lasted three days and brought the Munich-based machine to its knees, generating a terabyte of output in the process.
The Virgo Consortium aims to study the emergence of large-scale structures in the universe. Fittingly, its simulation was the biggest carried out anywhere in the world, by orders of magnitude.
Carlos Frenck, Professor of Astrophysics at Durham University and the principal investigator at the UK end of the Virgo Consortium, says the simulation has gone a long way towards answering two of the most pressing questions in astronomy and physics today.
The first of these concerns the nature of dark matter in the universe. The consortium's results can best be explained, Frenck says, if dark matter is composed of an exotic sub-atomic particle which has yet to be discovered. Physicists are desperately trying to capture this exotic particle, because its discovery would bring fame and fortune, but the particle interacts only weakly and is difficult to detect. The group's work lends substantial weight to the idea that the exotic particle, far from being just one theory among many, could very well be a real entity just waiting to be discovered.
The second question involves the proposition that there is another force in nature, a so-called repulsive force, to rival the attractive force of gravity. This has been put forward by two groups of astronomers studying supernovae. The consortium's results are consistent with the existence of such a force.
Frenck says: "The supernovae astronomers want to know if the data they have, which shows the existence of such a force, is a result of errors or if it describes something real. We can say to them: 'We think your data is probably right.' The repulsive force makes sense because when we write it into the script, we get a good fit between our simulation and the actual universe."
Does all this matter? Not in our lifetimes, but Frenck says that if the existence of such a force is proved, it would mean the universe continuing to expand forever. The alternative would be an eventual contraction under the influence of gravity: the reverse of the Big Bang. Instead, thanks to the repulsive force, the universe will simply go on spreading out, getting progressively older, colder and more settled in its ways.
The Virgo Consortium's study provides a rather astonishing demonstration, amid the endless measurements of real starlight taking place in astronomical circles around the world, of just how far computer simulations can take us in unfolding the history of the universe.
To make its contribution, the consortium needed to work with large simulations, modelling cubes of space 50Mpc (megaparsecs) and more on a side. One parsec is equivalent to 3.26 light-years, a light-year being the distance light travels in a year, or nearly 5.9 trillion miles. Interested parties may work out the number of zeros involved in each side of the cube for themselves.
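For anyone who does want to count the zeros, a rough back-of-envelope conversion, assuming a light-year of about 5.88 trillion miles, might look like this in Python:

```python
# Back-of-envelope conversion: one side of a 50Mpc simulation cube in miles.
# All figures are approximate.
MILES_PER_LIGHT_YEAR = 5.88e12     # roughly 5.9 trillion miles
LIGHT_YEARS_PER_PARSEC = 3.26
PARSECS_PER_MEGAPARSEC = 1.0e6

side_mpc = 50                      # one side of the simulated cube
side_miles = (side_mpc * PARSECS_PER_MEGAPARSEC
              * LIGHT_YEARS_PER_PARSEC * MILES_PER_LIGHT_YEAR)

print(f"50Mpc is roughly {side_miles:.1e} miles")   # about 9.6e20 miles
```

That works out at something close to a billion trillion miles along each edge of the box.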
The idea was to cram this box of space with as many particles as possible, particles representing dark and ordinary matter, with different equations describing the behaviour of each of the two types of matter.
Frenck says that the starting conditions of the particles in the box are uniform, representing the early stages of matter after the Big Bang, except for a kind of genetic imprint. This imprint is the ripple effect of quantum forces on matter in the early universe, and determines the shape of what we see today.
The simulation was then rolled forward, allowing the galaxies to form and interact in complex, non-linear ways according to the laws of motion and hydrodynamics. Gravitational tidal forces were taken into account. The universe that emerges from the simulation is then compared with the actual universe as seen through the eyes of the Hubble Space Telescope, radio astronomy and so forth.
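The broad shape of such a calculation can be sketched in a few lines of Python. The toy loop below is purely illustrative and is not the consortium's code: it starts particles on a near-uniform lattice, adds a small random perturbation standing in for the "genetic imprint", and steps them forward under softened mutual gravity with a leapfrog integrator. The particle count, units and constants are arbitrary, and real cosmological codes use far more sophisticated tree or mesh methods.

```python
import numpy as np

# Toy direct-summation N-body sketch (illustrative only).
rng = np.random.default_rng(0)

box = 1.0                         # box side in arbitrary units
per_side = 6                      # 6 x 6 x 6 = 216 particles, a tiny toy count
soft = 0.05                       # gravitational softening length
dt = 0.01                         # time step
G = 1.0                           # gravitational constant in code units

# Near-uniform lattice positions plus small initial ripples (the "imprint").
g = (np.arange(per_side) + 0.5) / per_side * box
pos = np.array(np.meshgrid(g, g, g)).reshape(3, -1).T
pos += 0.01 * rng.standard_normal(pos.shape)
vel = np.zeros_like(pos)
n = len(pos)
mass = np.full(n, 1.0 / n)        # equal-mass particles

def accelerations(pos):
    """Softened direct-sum gravitational acceleration on every particle."""
    diff = pos[None, :, :] - pos[:, None, :]          # pairwise separations
    dist2 = (diff ** 2).sum(axis=-1) + soft ** 2
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                     # no self-force
    return G * (diff * inv_r3[:, :, None] * mass[None, :, None]).sum(axis=1)

# Kick-drift-kick leapfrog time stepping.
acc = accelerations(pos)
for step in range(100):
    vel += 0.5 * dt * acc
    pos += dt * vel
    acc = accelerations(pos)
    vel += 0.5 * dt * acc

print("r.m.s. displacement from the centre of mass:",
      float(np.sqrt(((pos - pos.mean(axis=0)) ** 2).sum(axis=1).mean())))
```

The direct-sum force calculation here scales as the square of the particle count, which is exactly why production codes on machines like the Cray resort to cleverer algorithms.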
Different initial assumptions about the starting conditions give different outcomes.
Those outcomes that are obviously different to how our universe looks today mean that the initial assumptions should probably be filed in the bin.
A good fit, however, lends persuasive weight to the initial assumptions, making the simulation akin to a laboratory for testing the relationship between the laws of particle physics and the evolution of the universe.
This is heady stuff, and Frenck and his colleagues in the consortium are very excited. "It's a very exciting time to be doing this kind of work. I don't play the lottery, but if I did and I won it, I would be back at work the next day doing exactly the same thing," he says.
It feels as if a great discovery is just around the corner. Frenck says astrophysicists trying to understand the genetic imprint which gives the universe its structure are better off than their colleagues in the biological sciences. "Whereas biologists do not have the foggiest clue how structure develops, physicists have the laws of physics. Our areas of uncertainty lie in the nature and extent of the dark matter that makes up the vast majority of the matter in the universe, and this gives freedom to our calculations," he says.
The steps being made by the consortium are relatively recent. Until the Edinburgh Parallel Computing Centre acquired a Cray T3D in 1995, UK and other European astrophysicists were being left behind by their counterparts in the US, who had access to far more powerful computers.
The arrival of the Edinburgh Cray, followed last year by a companion Cray T3E, gave the Virgo Consortium the kind of computing capacity required to run models on a scale large enough to deliver some impressive results.
The consortium's access to increased IT power received a further boost when one of its members took up a directorship at the Munich-based Max Planck Institute for Astrophysics, home of the 672-processor Cray T3E used in the consortium's record-breaking simulation. It may be worth noting that the Munich Cray was the world's ninth-largest computer, at least at the time the simulation was carried out. The consortium also has a stake in the Origin 2000 machine at Cambridge University.
The Munich Cray has a lot of raw power. Yet Frank Pearce, one of the members of the Virgo team, points out that when you have a billion particles, each represented by three numbers for position, three for velocity and one for mass, stored at 32-bit or 64-bit precision depending on the number format, you finish up with a program that makes very substantial demands on both processor power and RAM.
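The arithmetic behind Pearce's point is easy to sketch. Assuming four bytes per 32-bit value and eight per 64-bit value, a rough Python estimate of the particle data alone runs as follows:

```python
# Rough memory footprint of a billion-particle run, following the article's
# per-particle breakdown: three position values, three velocity values, one mass.
particles = 1_000_000_000
values_per_particle = 3 + 3 + 1                  # position + velocity + mass

for label, bytes_per_value in (("32-bit", 4), ("64-bit", 8)):
    total_bytes = particles * values_per_particle * bytes_per_value
    print(f"{label}: about {total_bytes / 2**30:.0f}GB just for particle data")

# Prints roughly 26GB at 32-bit and 52GB at 64-bit precision, and that is
# before any of the working arrays the simulation code itself needs.
```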
Frenck says that the Munich Cray T3E has around 65GB of RAM, while the program he and his colleagues devised requires 64GB. "We ended up using every last nook and corner of the Cray, but we got our result," Frenck says.
The group had to share machine time with two other major consortia and a host of other users on the project. As it turned out, running the simulation proved to be the least of the consortium's problems. Dealing with the terabyte of output generated by the simulation was the real headache.
Arthur Trew, director of the Edinburgh Parallel Computing Centre, says there is a huge imbalance in computing technology between I/O speeds and processor speeds, and this is affecting big simulations. "You can have a calculation that is running at 100 gigaflops, generating 100 billion new numbers a second, yet the fastest networks in the world will only take 250Kbps worth of single-precision numbers," Trew says.
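Taking Trew's figures at face value, and assuming 32 bits per single-precision number, the mismatch works out at roughly thirteen million to one:

```python
# How compute outpaces output, using the figures quoted by Trew.
# Assumes each single-precision number occupies 32 bits.
numbers_computed_per_second = 100e9        # 100 gigaflops: 100 billion new numbers a second
network_bits_per_second = 250e3            # the quoted 250Kbps
bits_per_number = 32

numbers_shipped_per_second = network_bits_per_second / bits_per_number
print(f"Numbers shipped per second: {numbers_shipped_per_second:,.0f}")
print(f"Compute outpaces output by a factor of "
      f"{numbers_computed_per_second / numbers_shipped_per_second:,.0f}")

# Roughly 7,800 numbers a second leave the machine while 100 billion are
# produced, a mismatch of about thirteen million to one on these figures.
```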
Getting the output into usable form for distribution to interested astronomers and cosmologists around the world presented Frenck and the consortium with an interesting problem. Using statistical techniques, they post-processed the data down to a mere 100GB. Getting this off the Cray and on to front-end processors took a week. The 100GB data set was then carved into 20GB to 30GB chunks and written to tape for distribution by post.
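Those figures imply a strikingly modest sustained transfer rate. A quick check, assuming decimal gigabytes, puts it in the region of 165KB per second:

```python
# Sanity check: the sustained rate implied by moving 100GB off the Cray in a week.
gigabytes = 100
seconds_in_a_week = 7 * 24 * 3600

bytes_total = gigabytes * 10**9            # assuming decimal gigabytes
rate_kb_per_second = bytes_total / seconds_in_a_week / 1000
print(f"Implied sustained rate: about {rate_kb_per_second:.0f}KB per second")
```

That is broadly in line with the bottleneck Trew describes.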
"We had to think hard about how we could make this information public. Clearly, we couldn't put it up on our Web site," Frenck says. The 20GB to 30GB chunks are the minimum requirement, he says, for serious work. The actual analysis is then carried out piece by piece on Alpha workstations with 2GB of RAM. "You have to be clever and focused in the questions you ask, and you have to get the answers delivered efficiently or you get really bogged down," he adds.
This whole project has stretched the resources of the consortium to the limit. "We had five people working on this at one stage," Frenck says. To the corporate IT industry this may not seem to be a lot of manpower to solve the puzzle of the formation of the universe. In academia, however, Frenck says, diverting the efforts of five people to a single project is an exorbitant strain on resources.
One thing is certain: until the amount of RAM available to supercomputers goes up by at least an order of magnitude, together with the bandwidth for shipping out the data, the Virgo Consortium's billion-particle, record-breaking experiment is unlikely to be topped.
"With hindsight, we can say that we were probably a bit ahead of ourselves with this simulation," Frenck says. He adds: "The bottlenecks are so severe when you try to move the output from this kind of experiment that we probably need to wait for memory capacities and bandwidth technologies to catch up with us before we do anything on this scale again."