
Simulating Galaxies With Supercomputers

CmdrTaco posted about 4 years ago | from the delete-the-pool-ladder-quick dept.


An anonymous reader writes "Over in the UK, Durham University is tasking its supercomputing cluster with nothing less than recreating how galaxies are born and evolve over the course of billions of years. Even with 800 AMD processor cores at its disposal, the university is still hitting the limits of what is possible."


Should have... (-1, Flamebait)

americamatrix (658742) | about 4 years ago | (#33577478)

used 800 Intel cores ;)

Re:Should have... (5, Insightful)

toastar (573882) | about 4 years ago | (#33577614)

800 cores... that's like 134 CPUs. With 4 CPUs per node, it's only 34 nodes. A rack holds 48U.

So they have a problem that takes more than one rack of modern computers to handle?

Old news (0)

Anonymous Coward | about 4 years ago | (#33577650)

People have been doing this for ages...what is the news here?

Re:Should have... (1)

guruevi (827432) | about 4 years ago | (#33577708)

That's what I was thinking. The university I work for has not just one but three clusters, two of them with 4,096 CPU cores and 848 CPU cores respectively.

Re:Should have... (1)

AnonGCB (1398517) | about 4 years ago | (#33577834)

Do you mean 4096 CPU cores and 848 actual CPUs?

Re:Should have... (1)

jgagnon (1663075) | about 4 years ago | (#33578100)

More likely he meant one had 4096 cores and another had 848 cores.

Re:Should have... (0)

Anonymous Coward | about 4 years ago | (#33581170)

And do they let one postdoc use the whole thing for years at a time, or do they force everyone to share?

Re:Should have... (1, Interesting)

Anonymous Coward | about 4 years ago | (#33578362)

They should have talked to SuperMicro.
That's just over 8 enclosures (4 nodes/enclosure) and fits in 18U.

It looks like this [supermicro.com].

Dual GTX 480 (4, Funny)

mangu (126918) | about 4 years ago | (#33579110)

800 cores... that's like 134 CPUs

Or two GPUs [nvidia.com].

If it can run Crysis it can simulate galaxies.

Re:Should have... (4, Funny)

An ominous Cow art (320322) | about 4 years ago | (#33577928)

This is nothing. For my kindergarten thesis, I used galaxies to simulate supercomputers.

Re:Should have... (1)

BitZtream (692029) | about 4 years ago | (#33578052)

It's likely to be far more realistic, since it's based on the real physics of the universe, rather than on simulations of made-up rules for the universe.

Simulating Galaxies sure... (-1, Redundant)

Meskarune (988797) | about 4 years ago | (#33577486)

but does it run Linux?

Re:Simulating Galaxies sure... (1)

mehrotra.akash (1539473) | about 4 years ago | (#33577584)

but does it run Linux?

Guess it doesn't run Windows, though I would like to see the processor graphs of an 800-core machine in Task Manager.

Re:Simulating Galaxies sure... (0)

Anonymous Coward | about 4 years ago | (#33577628)

An 800-core machine is almost enough to run Vista.

Re:Simulating Galaxies sure... (1)

mehrotra.akash (1539473) | about 4 years ago | (#33577732)

The Task Manager alone would take up a few dozen cores.

Re:Simulating Galaxies sure... (1)

MrHanky (141717) | about 4 years ago | (#33577678)

It runs on storks.

Re:Simulating Galaxies (-1, Offtopic)

$RANDOMLUSER (804576) | about 4 years ago | (#33577842)

Why a supercomputer? Why don't they use a vibrator like everyone else?

can it run crysis 2? (-1, Redundant)

Joe The Dragon (967727) | about 4 years ago | (#33577496)

can it run crysis 2?

Re:can it run crysis 2? (1)

Antisyzygy (1495469) | about 4 years ago | (#33577560)

It could probably run Crysis 2 on a hundred virtual machines at the same time.

Re:can it run crysis 2? (1)

mehrotra.akash (1539473) | about 4 years ago | (#33577758)

Not necessarily. 100 VMs means 8 cores and 16 GB RAM per VM.
Assuming these cores are something faster than Atom cores, that still doesn't say anything about the graphics hardware.

Re:can it run crysis 2? (2, Interesting)

Antisyzygy (1495469) | about 4 years ago | (#33577878)

8 cores = 2-3 cores + 4 GB for the game's CPU and memory requirements, and 5-6 cores + 12 GB to model a GPU with dedicated memory access. Not sure if GPU modelling has ever been done before, but I bet it's possible with that much CPU access and memory. Games used to run with software graphics acceleration back when I was in grade school. I remember I bought my first dedicated graphics card back in the Voodoo 3 days and could start selecting "Hardware Acceleration" in PC games.

Re:can it run crysis 2? (1)

pipatron (966506) | about 4 years ago | (#33578654)

You seem to forget that emulating a modern GPU would require something like a hundred cores from a generic CPU. This is why they exist as a separate component.

Re:can it run crysis 2? (1)

Toonol (1057698) | about 4 years ago | (#33579294)

You seem to forget that emulating a modern GPU would require something like a hundred cores from a generic CPU.

I don't believe that at all; it sounds like marketing-speak. Intel's using, what, eight CPUs to do real-time RAY TRACING, and that's MORE demanding than the rasterizing paradigm that modern GPUs are based on. Certainly a GPU is more specialized and efficient than a similar-scale general-purpose CPU, but I think the performance ratio is closer to 4:1 than 100:1.

Re:can it run crysis 2? (1)

Ironhandx (1762146) | about 4 years ago | (#33579672)

IIRC those CPUs have some GPU components built into the die, which is why that is possible. In a straight competition on most normal graphics-rendering-type equations, the actual ratio is closer to 100:1 than 4:1.

Please correct me if I'm wrong. It very much depends on the equations, though. There are things that GPUs can do but are bad at, and things they can't do at all.

Either way having a good setup to make use of the strengths of both types of processor is going to be the optimal solution.

Re:can it run crysis 2? (1)

IndustrialComplex (975015) | about 4 years ago | (#33579874)

I don't believe that at all; it sounds like marketing-speak. Intel's using, what, eight CPUs to do real-time RAY TRACING, and that's MORE demanding than the rasterizing paradigm that modern GPUs are based on. Certainly a GPU is more specialized and efficient than a similar-scale general-purpose CPU, but I think the performance ratio is closer to 4:1 than 100:1.

It's all in the design. (YAY! Car analogy time)

You have to transport 1,000 people from NY to Miami faster than a competitor. Complete the challenge and win $10M. There are two vehicles, and you pick first:

2 seat Ferrari
40 seat Bus

GPUs DO get a boost from things like just having to worry about processing graphics as opposed to managing the computer, but they get a MUCH bigger boost due to the fact that they are optimized to do certain tasks very quickly.

The Bus is designed to move a lot of people; the Ferrari is designed to move one or two people very quickly, and you can easily argue that the Ferrari is the much more powerful, advanced, carefully engineered design. The ray-tracing activity is a bit like taking the Ferrari engine out and using it to power a pump, and then taking the bus engine out and doing the same. Neither was really designed to do that, so it doesn't really work as a good comparison.

There is a reason why it was hard to emulate something like the Playstation for a long time even though computer processors were orders of magnitude more powerful than the one in the original Playstation.

Re:can it run crysis 2? (1)

pipatron (966506) | about 4 years ago | (#33581334)

You can think about whatever ratio makes you happy; that doesn't change the actual ratios shown when implementing algorithms on a GPU vs. a general-purpose CPU. A GPU does hundreds of very simple calculations in parallel, the CPU doesn't. It's quite simple.

Re:can it run crysis 2? (1)

sleeping143 (1523137) | about 4 years ago | (#33579354)

Actually, comparing FLOPS, about 10 Core i7 980 XE (107 GFLOPS) processors could handle the work of a GTX 285 (1062 GFLOPS). Add another one to orchestrate the whole mess, and you should be good to go. Of course, your latency will most likely be significantly increased...

Re:can it run crysis 2? (0)

Anonymous Coward | about 4 years ago | (#33577968)

It could probably run Crysis 2 on a hundred virtual machines at the same time.

Or Aero.

Well... (0)

Anonymous Coward | about 4 years ago | (#33577504)

How about using a galaxy model for an @home project and asking for extra CPUs/GPUs?

Re:Well... (0)

Anonymous Coward | about 4 years ago | (#33578626)

I was thinking the same thing, but maybe this project won't break down into nice little chunks for us to work on at home?

Re:Well... (2, Informative)

Bootsy Collins (549938) | about 4 years ago | (#33579026)

That's correct. In a simulation like this, the most important physical effect to model is gravity. Gravity doesn't have a range. For each timestep in the simulation, all the mass in the simulation has to interact with all the other mass in the simulation. There are a variety of numerical tricks that people who write these codes use to make the problem feasible, so that the computation time required doesn't scale as N^2, with N = number of particles in the simulation. But even with these tricks, to calculate the force on an individual particle, you still have to care about the stuff outside your local volume. These are problems you have to solve when you parallelize your code. Distributing the problem in an @home fashion would require so much inter-participant communication that at this point, it wouldn't really be practical.
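
To make that O(N^2) cost concrete, here is a minimal direct-summation sketch in Python. This is purely illustrative -- invented units and names, not the ICC's actual code -- but it shows why every particle needs every other particle at every timestep:

    import numpy as np

    G = 1.0     # gravitational constant in simulation units (assumed)
    EPS = 1e-2  # softening length, avoids infinite forces at tiny separations

    def direct_accelerations(pos, mass):
        """O(N^2) pairwise gravity. pos has shape (N, 3), mass has shape (N,)."""
        diff = pos[None, :, :] - pos[:, None, :]             # diff[i, j] = x_j - x_i
        dist = np.sqrt((diff ** 2).sum(axis=-1) + EPS ** 2)  # softened separations
        np.fill_diagonal(dist, np.inf)                       # no self-interaction
        # a_i = G * sum_j m_j * (x_j - x_i) / |x_j - x_i|^3
        return G * (mass[None, :, None] * diff / dist[:, :, None] ** 3).sum(axis=1)

    def leapfrog_step(pos, vel, mass, dt):
        """One kick-drift-kick timestep; note the two full all-pairs force sums."""
        vel = vel + 0.5 * dt * direct_accelerations(pos, mass)
        pos = pos + dt * vel
        vel = vel + 0.5 * dt * direct_accelerations(pos, mass)
        return pos, vel

Doubling N quadruples the work here, which is why production codes replace the inner sum with tree or mesh approximations (roughly N log N) -- and why chopping the particles into independent @home-sized chunks doesn't work: the chunks would have to exchange forces constantly.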

Re:Well... (0)

Anonymous Coward | about 4 years ago | (#33579372)

If the nodes are evenly distributed like a galaxy, you can use that to your advantage:
the gravitational effect on one node can be worked out from the distribution of the other nodes.

Re:Well... (1)

IndustrialComplex (975015) | about 4 years ago | (#33580072)

Gravity doesn't have a range.

That's a rather 'tricky' statement, don't you think? First, I'll agree with you that gravity doesn't technically reach zero. But it does appear to have to propagate. In a system many thousands of lightyears across, propagation delay would be significant.

Not only that, but wouldn't the galaxy have expanded several million, if not billions, of miles in the 27,000 years it would take for light to travel from one end to the other? (I'm not trusting my back-of-the-envelope calculations, which put it at 500 billion miles of expansion over 27,000 years.)

Re:Well... (1)

Bootsy Collins (549938) | about 4 years ago | (#33581812)

Gravity doesn't have a range.

That's a rather 'tricky' statement, don't you think? First, I'll agree with you that gravity doesn't technically reach zero. But it does appear to have to propagate. In a system many thousands of lightyears across, propagation delay would be significant.

I'm not sure how this makes my statement about gravity not having a range "tricky"; but it's definitely something worth thinking about. Several thousand lightyears is actually a pretty small-scale simulation; the simulation volume wouldn't be large enough to contain a typical bright galaxy. Cosmological simulations incorporating galaxy formation typically use volumes tens or even hundreds of megaparsecs (Mpc) on a side. Imagine you're working with a 100 Mpc per-side cube (for the cognoscenti: taking h=1 because I don't feel like carrying notation around). You typically use periodic boundary conditions, so the furthest mass from you is 50 Mpc away. Light travels at about 300 Mpc/Gyr; so it'll take about 1/6 Gyr for changes in the distribution of sources of the gravitational field to impact the field halfway across the box. That's a long time. So you might imagine that this is a major issue.

However, in one of these simulations, when evaluating the gravitational potential at the location of a point mass of interest, it's the stuff out at the largest scales that's treated in the most simplistic fashion -- that's how they avoid having to calculate pair forces between all the point masses in the simulation, which would make the computational time scale as N^2. Generally, you do some simplistic calculation of the gravitational potential at the point mass of interest, using a simplification of the mass distribution at large scales, and then calculate pair forces between the point mass of interest and others very nearby as a correction to the simplistic calculation. This simplification introduces error; but the folks who work on this are generally careful to set it up so that the error introduced is small enough to be acceptable (how they do that and can feel pretty confident about it would have to be another post), given that simplifications like this permit you to do the simulation in the first place. So, to my point mass of interest, the distant stuff 50 Mpc away is represented at low resolution. Typical peculiar velocities of masses in intergalactic space are 500 km/s and under, often well under. 500km/s is about 0.5 Mpc/Gyr. So in that delay time of 1/6 Gyr we were talking about earlier, distant stuff will have typically moved under 100 kpc. Compared to how we simplify the mass distribution at large distances to make the problem computationally tractable, that's nothing. There are some simulations where you have to care about this; but in general, ignoring it doesn't introduce as much error as you might think.

Incidentally, this kind of simplification helps parallelization quite a bit. If volumes of the simulation are allocated to nodes of a cluster, two nodes considering portions of the simulation volume which are very distant from each other may care about little more than the total mass in the other node's volume and where the center of mass of that stuff is located.
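
A hypothetical sketch of that last point in Python (the function name and units are invented, not taken from any real simulation code): a distant region is collapsed to its total mass at its centre of mass, so two far-apart nodes need to exchange only four numbers rather than full particle lists.

    import numpy as np

    G = 1.0  # gravitational constant in simulation units (assumed)

    def far_field_acceleration(x, far_pos, far_mass):
        """Monopole approximation: treat a distant particle group as a single
        point of mass M located at the group's centre of mass."""
        M = far_mass.sum()
        com = (far_mass[:, None] * far_pos).sum(axis=0) / M
        r = com - x
        d = np.sqrt((r ** 2).sum())
        return G * M * r / d ** 3  # error shrinks as the group gets more distant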

Not only that, but wouldn't the galaxy have expanded several million, if not billions, of miles in the 27,000 years it would take for light to travel from one end to the other? (I'm not trusting my back-of-the-envelope calculations, which put it at 500 billion miles of expansion over 27,000 years.)

This is only a significant effect for separations that are not small compared to the size of the theoretical horizon. And at any rate, it's straightforward to consider. The simulations are done in comoving coordinates (that is, coordinates that expand with the expansion). Working in comoving coordinates introduces some powers of the scale factor (well, inverse factors typically) that you have to include in the equations you solve; but that's OK.
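
For reference, the standard comoving formulation alluded to here (textbook cosmology, not taken from the article): writing r = a(t)x with x the comoving coordinate and delta the density contrast, the equation of motion and the Poisson equation become

\[
\frac{d}{dt}\!\left(a^{2}\dot{\mathbf{x}}\right) = -\nabla_{x}\phi,
\qquad
\nabla_{x}^{2}\phi = 4\pi G\, a^{2}\, \bar{\rho}(t)\, \delta(\mathbf{x},t),
\]

and the factors of a (and its inverses, once discretized) are exactly the "powers of the scale factor" mentioned above.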

800 AMD processor cores (3, Interesting)

mehrotra.akash (1539473) | about 4 years ago | (#33577550)

"800 AMD processor cores" by itself is useless information; we need more detail. Are they ultra-low-power cores like Atom/Bobcat, or extremely high-clocked ones such as the i7 980X / Phenom X6 1090, etc.?

Also, the article says they have 1600 GB of RAM; isn't RAM normally in powers of 2?

Re:800 AMD processor cores (0)

Anonymous Coward | about 4 years ago | (#33577824)

2GB per core?

Re:800 AMD processor cores (1)

blueg3 (192743) | about 4 years ago | (#33577876)

Key to puzzling this out: it's a computing cluster, and 800 isn't a power of 2 either.

Re:800 AMD processor cores (2, Interesting)

Antisyzygy (1495469) | about 4 years ago | (#33577964)

I'd guess they are running Opteron CPUs, maybe up to 8 cores each. So that means 50-100 machines in a cluster: 1600/50 = 32 GB per machine, or 1600/100 = 16 GB per machine.

Re:800 AMD processor cores (1)

gerddie (173963) | about 4 years ago | (#33577982)

How about 100 nodes of 2x Phenom X4 with 16 GB each?

Re:800 AMD processor cores (1)

Antisyzygy (1495469) | about 4 years ago | (#33578000)

I'd guess they are running Opteron CPUs, maybe up to 8 cores each. So that means 50-100 machines in a cluster: 1600/50 = 32 GB per machine, or 1600/100 = 16 GB per machine.

And that does not preclude them using dual-socket motherboards with two 4-core Opterons.

Re:800 AMD processor cores (3, Informative)

rubycodez (864176) | about 4 years ago | (#33578090)

Mostly Opteron 175s (528 cores at 2.2 GHz with 1056 GB RAM total) and 285s (256 cores at 2.6 GHz with 512 GB RAM total), so about 2 GB RAM per core.

They run Solaris 10 u3.

http://icc.dur.ac.uk/icc.php?content=Computing/Cosma [dur.ac.uk]

Re:800 AMD processor cores (0)

Anonymous Coward | about 4 years ago | (#33578612)

they run Solaris

No wonder the computers are too slow. *ducks*

Re:800 AMD processor cores (1)

Jeremy Erwin (2054) | about 4 years ago | (#33580270)

The mystery is solved. Their cluster is old and slow.

Re:800 AMD processor cores (2, Informative)

Anonymous Coward | about 4 years ago | (#33579970)

Details are here:
http://icc.dur.ac.uk/icc.php?content=Computing/Cosma

The Cosmology Machine (COSMA) was first switched on in July 2001. From the original system, only 2 TByte of dataspace is still online. 64 SunBlade 1000s with a total of 64 GByte of RAM were donated by the ICC in February 2006 to the Kigali Institute for Science and Technology in Kigali, the capital of Rwanda. Sun Microsystems paid for their transport by air.

In February 2004, QUINTOR was installed. QUINTOR consists of 256 SunFire V210s with a total of 512 UltraSPARC IIIi 1 GHz processors and 576 GByte of RAM. The systems have recently been upgraded to Solaris 10 u3, the Studio 12 compilers are the default, and Sun HPC - CT7 is used for parallel MPI applications. CT7 is Sun's packaging of Open MPI 1.2.x.
In April 2006, OCTOR was installed. OCTOR consists of

      1. Cordelia, a 264 X2100 node cluster with a total of
                    * 528 2.2 GHz AMD Opteron cores (AMD Opteron 175)
                    * 1056 GByte of RAM
                    * connected via dual gigabit over Nortel 5510 Switches
      2. Miranda, a 64 X4100 node cluster with a total of
                    * 256 2.6 GHz AMD Opteron cores (AMD Opteron 285)
                    * 512 GByte of RAM
                    * connected via gigabit for system services
                    * connected via Myrinet 2000 F cards for HPC communications
      3. Oberon, a V40z with a total of
                    * 8 2.2 GHz AMD Opteron cores (AMD Opteron 875)
                    * 32 GByte of RAM
            Oberon is the head node of OCTOR.

The systems are running Solaris 10 x86/64; the Studio12 compilers are the default, but Studio11 and Studio10 are still available. Sun HPC - CT7 is used for parallel MPI applications.

In September 2007, we received Rosalind, a V890 with 8 1.5 GHz UltraSPARC IV+ cores and 64 GByte of RAM. This was a donation by Sun Microsystems, which is gratefully acknowledged. Rosalind is running Solaris 10_u4, the Studio12 compilers and Sun HPC - CT7. This system's role is to provide access to a single large memory and to packages such as IDL.

Re:800 AMD processor cores (1)

PhunkySchtuff (208108) | about 4 years ago | (#33582610)

1600GB RAM = 2GB RAM per core. That's a power of 2.

Easier way (3, Funny)

Yvan256 (722131) | about 4 years ago | (#33577564)

They should have asked The Doctor to simply record the event when he rebooted the Universe.

Re:Easier way (1)

click2005 (921437) | about 4 years ago | (#33577642)

The BBC already did. Let's just hope they don't overwrite the tapes.

Re:Easier way (1)

gamecrusader (1684024) | about 4 years ago | (#33581588)

No, we just need the Doctor to loan us the TARDIS. Scientists could learn how the universe started and that we aren't alone, and maybe even about the threats to our planet, including the Daleks, the Cybermen, and the real policemen and enforcers of the universe, the Judoon.

Gaaahhhh (1, Insightful)

Anonymous Coward | about 4 years ago | (#33577742)

Let's simulate a single cell, then an organism, then aging. Then we can start extending our lifespan. THEN we can start living, not just this handful of years between being a powerless child and a weak, aging adult. Then you can worry about galaxies.

Re:Gaaahhhh (1)

tom17 (659054) | about 4 years ago | (#33578578)

This. And quickly; I'm running out of time.

Re:Gaaahhhh (1)

Coren22 (1625475) | about 4 years ago | (#33579240)

I'm still waiting for my cyborg body...

Re:Gaaahhhh (1)

pipatron (966506) | about 4 years ago | (#33578674)

First of all, people are trying to do this already, probably on much more powerful hardware. Second, simulating an organism to the level that we need is probably a lot more demanding than simulating a galaxy to the level that these people need.

Re:Gaaahhhh (2, Insightful)

IndustrialComplex (975015) | about 4 years ago | (#33578824)

Let's simulate a single cell, then an organism, then aging. Then we can start extending our lifespan. THEN we can start living, not just this handful of years between being a powerless child and a weak, aging adult. Then you can worry about galaxies.

What does an astrophysicist know about cellular biology? Probably about as much as a biologist knows about astrophysics.

Compounding that, we wouldn't have made a fraction of the scientific progress to date if we had focused on a single discipline until it was mastered before moving on to the next one. What good would supersonic airliners be if our civil and materials engineers had never learned how to make a tarmac that could support their weight?

Ever wonder how much of our knowledge of high-energy particles and fields came about because of cross-pollination from physicists?

Re:Gaaahhhh (1)

dominious (1077089) | about 4 years ago | (#33580302)

I wonder if (theoretically) in a simulated galaxy, in which all particles and physical rules have been considered, some kind of simulated life-form could evolve from strange interactions of those particles and rules. Then, would this simulated life differ in any way from real life (reference to "The 13th Floor")? I mean, for the living forms inside the simulation it would be just like real life. I'm wondering whether, in theory, there is anything to contradict this. Is it possible?

Re:Gaaahhhh (1)

timeOday (582209) | about 4 years ago | (#33580958)

THEN we can start living, not just this handful of years between being a powerless child and a weak, aging adult. Then you can worry about galaxies.

Nature already has a process for eternal renewal - death and birth. Our species as a whole has no pre-determined expiration date, and the ability to pass information through the generations. What difference does it really make whether it's us individually that live on, or our descendants?

Re:Gaaahhhh (1)

allusionist (983106) | about 4 years ago | (#33582110)

I am not my descendants. I'd love to see the new discoveries that will be made about our place in the cosmos, the new technologies and art and culture that will be created in the distant future for myself, as would a great many others.

Use The Force , Luke , In The (0)

Anonymous Coward | about 4 years ago | (#33577750)

botnet cloud.

Yours In Osh,
K. Trout

too many atoms, not enough processors (1)

smallshot (1202439) | about 4 years ago | (#33577756)

Did anyone consider ahead of time how many calculations would be necessary before they invested all that money?

Re:too many atoms, not enough processors (1)

mehrotra.akash (1539473) | about 4 years ago | (#33577782)

We wouldn't have had this story if they had.

Does it run Windows? (1)

gmuslera (3436) | about 4 years ago | (#33577844)

Don't want to wait 15 billion years to see the next Blue Screen of Big Bang.

Brain vs. Galaxy Simulation (3, Interesting)

Sonny Yatsen (603655) | about 4 years ago | (#33577890)

It's interesting to think that the university is attempting to use 800 processor cores to simulate galaxies, when IBM uses 147,456 processors to do a neuron-by-neuron simulation of the human brain.

Re:Brain vs. Galaxy Simulation (1)

Antisyzygy (1495469) | about 4 years ago | (#33578254)

Well, I'd imagine it takes quite a bit longer than the IBM supercomputer to do an equal amount of work.

Re:Brain vs. Galaxy Simulation (1)

IndustrialComplex (975015) | about 4 years ago | (#33578896)

Well, I'd imagine it takes quite a bit longer than the IBM supercomputer to do an equal amount of work.

An equal amount of computation, sure. But how much computation is necessary to get useful results? Both may not be working on problems of equal magnitude.

A backhoe can move more dirt than I can with a shovel, but if all I have to move is 1 cubic meter, and the backhoe has to move 1000... my workload is still a lot less.

Re:Brain vs. Galaxy Simulation (1)

OnePumpChump (1560417) | about 4 years ago | (#33578608)

He's too busy fighting shambling men in rubber masks.

Re:Brain vs. Galaxy Simulation (1)

tibit (1762298) | about 4 years ago | (#33579798)

They don't do a simulation of the entire brain, just a part of the cortex. And their simulation runs at maybe 1% of real time.

Re:Brain vs. Galaxy Simulation (1)

quanticle (843097) | about 4 years ago | (#33582406)

There's a big difference in the problem. Namely, it's possible to work at a coarser level of granularity when dealing with galaxies. You might not be able to simulate individual stars, but you can simulate star clusters and the clumps of dark matter to get approximations. With the brain simulation, it's not possible to abstract away as much detail, hence the higher hardware requirements.

How long before ... (2, Interesting)

cowtamer (311087) | about 4 years ago | (#33577988)

The galaxies in the simulation develop planets, scientists, and their own Galaxy Simulators???

Has anyone else been bothered by the fact that energy is quantized? It always made me feel like we were looking at pixels we weren't supposed to see :)

Re:How long before ... (1)

tom17 (659054) | about 4 years ago | (#33578616)

Well, it'd be a bit of a 'round-the-houses' method, but we'd have invented AI... finally!

Re:How long before ... (1)

IndustrialComplex (975015) | about 4 years ago | (#33579690)

The galaxies in the simulation develop planets, scientists, and their own Galaxy Simulators???

Has anyone else been bothered by the fact that energy is quantized? It always made me feel like we were looking at pixels we weren't supposed to see :)

Why should I be bothered? If you look at it just the right way, it looks like...

Turtles.

Re:How long before ... (2, Insightful)

dominious (1077089) | about 4 years ago | (#33580344)

Ref. to "The 13th Floor". I just posted something similar a few posts above. :)

Speed of light in a simulation (4, Interesting)

mangu (126918) | about 4 years ago | (#33580522)

Has anyone else been bothered by the fact that energy is quantized?

Even more significant is that there's an intrinsic speed limitation [wikipedia.org] in a simulation.

When you simulate a continuous medium by dividing it into small space and time steps, there's a speed "c", equal to the space step divided by the time step, which cannot be exceeded by anything in the simulation.
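
A toy demonstration of that limit in Python (a hypothetical grid and update rule, not any real simulation code): with a nearest-neighbour update, a disturbance spreads at most one cell per step, so nothing can move faster than one space step per time step.

    import numpy as np

    nx, steps = 101, 30
    field = np.zeros(nx)
    field[nx // 2] = 1.0  # point disturbance at cell 50

    for _ in range(steps):
        # each cell is updated from itself and its two immediate neighbours only
        field = np.maximum(field, np.maximum(np.roll(field, 1), np.roll(field, -1)))

    lit = np.nonzero(field)[0]
    print(lit.min(), lit.max())  # 20 80: exactly `steps` cells per side, never more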

Re:Speed of light in a simulation (0)

Anonymous Coward | about 4 years ago | (#33580648)

Would the Planck Time, tp, be a sufficient level of granularity for your simulations?

Re:Speed of light in a simulation (1)

Beardydog (716221) | about 4 years ago | (#33581608)

Unless you skip pixels OMG QUANTUM TUNNELING!!1!

In conclusion, God sucks at collision checking.

What is is new here? (0, Insightful)

Anonymous Coward | about 4 years ago | (#33578032)

Haven't researchers been doing such simulations for a decade?

Waste of Time (1, Informative)

MrTripps (1306469) | about 4 years ago | (#33578040)

Let me save those guys some time: 42

Re:Waste of Time (1, Interesting)

Anonymous Coward | about 4 years ago | (#33578232)

Let me save those guys some time: 42

What were the input params again?

Re:Waste of Time (2, Informative)

tom17 (659054) | about 4 years ago | (#33578636)

"what do you get if you multiply six by nine"

Re:Waste of Time (3, Funny)

damien_kane (519267) | about 4 years ago | (#33579178)

I always thought something was fundamentally wrong with the universe

Re:Waste of Time (1)

tom17 (659054) | about 4 years ago | (#33582354)

Informative? Seriously?

Obligatory XKCD (1, Funny)

Ryanrule (1657199) | about 4 years ago | (#33578198)

Obligatory XKCD http://xkcd.com/505/ [xkcd.com]

Meaningless uninformed journalist bs (3, Insightful)

vlm (69642) | about 4 years ago | (#33578234)

Even with 800 AMD processor cores at its disposal, the university is still hitting the limits of what is possible.

Meaningless uninformed journalist BS filler puff. What is possible is simulating every subatomic particle in the universe at Planck-time intervals for the total age of the universe, repeatedly, for an infinite number of combinations of different cosmological constants to see what you get. That will never be done, of course.

Re:Meaningless uninformed journalist bs (0)

Anonymous Coward | about 4 years ago | (#33578592)

Even with 800 AMD processor cores at its disposal, the university is still hitting the limits of what is possible.

Meaningless uninformed journalist BS filler puff. What is possible is simulating every subatomic particle in the universe at Planck-time intervals for the total age of the universe, repeatedly, for an infinite number of combinations of different cosmological constants to see what you get. That will never be done, of course.

Actually, there are many who argue that our universe is exactly that! Which is to say, one of those iterations in someone's experiment, with a few constants fixed and the rest set to run randomly.

Re:Meaningless uninformed journalist bs (1)

froggymana (1896008) | about 4 years ago | (#33581144)

But how would you set up the computer to compute its own particles? If you really want a truly accurate representation of the entire universe you would need to include the computer doing this... I sense a paradox coming on.

Re:Meaningless uninformed journalist bs (1)

Tablizer (95088) | about 4 years ago | (#33581686)

Perhaps the simulator's universe is bigger than ours, or their computer is in a 4D universe or they found a way to tap into a 4th dimension.

Re:Meaningless uninformed journalist bs (0)

Anonymous Coward | about 4 years ago | (#33578980)

With quantum computers, it may be possible. So don't be too sure.

Re:Meaningless uninformed journalist bs (0)

Anonymous Coward | about 4 years ago | (#33580290)

Even with 800 AMD processor cores at its disposal, the university is still hitting the limits of what is possible.

Meaningless uninformed journalist BS filler puff. What is possible is simulating every subatomic particle in the universe at Planck-time intervals for the total age of the universe, repeatedly, for an infinite number of combinations of different cosmological constants to see what you get. That will never be done, of course.

It has happened before and it will happen again. What do you think this universe is? I, personally, can't wait to see the next simulation.

computing astronomers doing this for decades (3, Insightful)

peter303 (12292) | about 4 years ago | (#33578406)

Every year they can do more detailed models, and they keep getting cleverer in their modeling -- for example, aggregate gravity fields.

Of course it is overwhelmed (1)

ATestR (1060586) | about 4 years ago | (#33578650)

Let's assume that they are trying to simulate the formation of a small galaxy... that would be no more than 100 million stellar masses. That's still a lot of points, a whole lot of calculations.

grape processors are faster (2, Informative)

Anonymous Coward | about 4 years ago | (#33578826)

The GRAPE-5 does N-body simulations using specialized hardware that is faster than a standard CPU: http://en.wikipedia.org/wiki/Gravity_Pipe [wikipedia.org]

Recursive Loop? (1)

bigrockpeltr (1752472) | about 4 years ago | (#33579010)

What happens when the simulation gets to the point where humanity is 'advanced' enough technologically to try to model the universe with supercomputers? It's an obvious infinite loop that will cause the universe to crash... and they are professors? Sheesh.

Better uses (1)

slapout (93640) | about 4 years ago | (#33579132)

Simulating galaxies?? Why not use it for something useful -- like ray tracing Wolf3d?!

Compared to what? (1)

Gemini8403 (1899998) | about 4 years ago | (#33579144)

Eight hundred cores of any type is TINY as far as supercomputers go. Most large US universities have at least one (if not several) supercomputers that are multiple times (if not an order of magnitude) larger than this. Never mind that most research projects on supercomputers NEVER use the whole system at once - it's more of a timeshare thing where you book however many threads for a certain length of time. Yes, the prospect of the research is interesting, but beyond that there isn't any impressive or really useful information in the article. What might be more intriguing is how long they plan on running the simulation and how many threads it's using.

curious (1)

sea4ever (1628181) | about 4 years ago | (#33579532)

Let them simulate the Milky Way. I'm curious as to whether or not they will be able to simulate the genesis of life on Earth. That will be interesting...
Hey, maybe if they let the simulation run long enough, the simulated earthlings will make their own simulation.

New Galaxy Porn (0)

Anonymous Coward | about 4 years ago | (#33579612)

At first I thought this said "Stimulating Galaxies with Supercomputers"...oooooooh baby

Fidelity of Mathematical Models (1)

florescent_beige (608235) | about 4 years ago | (#33579904)

There must be a principle out there somewhere that says the universe cannot be accurately simulated by anything smaller than the universe. And if there isn't can I invent it and call it The Principle of Computational Hopelessness?

Please Slashdot editors, (3, Interesting)

Prune (557140) | about 4 years ago | (#33580476)

be more careful with article summaries. They're worse than newspaper headlines these days. The summary "Over in the UK, Durham University is tasking its supercomputing cluster with nothing less than recreating how galaxies are born and evolve over the course of billions of years" could describe any of the countless galaxy-evolution simulations that have been done for a couple of decades already at various places, and gives no indication as to what's new about this instance. In other words, the headline is at best absolutely uninformative, and at worst misleading.