
Supercomputers Allow First Detailed Milky Way Simulation - Comments

nancynancy's Avatar Comment 1 by nancynancy

Beautiful and awe inspiring.

Wed, 31 Aug 2011 00:02:58 UTC | #865714

Neodarwinian's Avatar Comment 2 by Neodarwinian

Yes, much more awe-inspiring than that desert dogma. Cecil B. DeMille could have made this epic, but he chose to promote fictional accounts with a magical plot line, missing the natural magic shown here.

Wed, 31 Aug 2011 00:31:05 UTC | #865720

Schrodinger's Cat's Avatar Comment 3 by Schrodinger's Cat

According to Moore's law, a home PC should have enough power to create the same simulation (in 8 months) in just 20 years or so... which will be cool, as at the moment it's hard to find any home PC galaxy simulation with more than about 2,000 stars.
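As a rough sanity check of that extrapolation, here is the arithmetic in Python; the 18-month doubling time and the supercomputer-to-PC performance gap are assumed round figures, not numbers from the article:

```python
# Back-of-the-envelope check of the "home PC in ~20 years" claim.
# Assumed figures (not from the article): performance doubles every
# 18 months, and a current supercomputer is ~10,000x a home PC.
DOUBLING_TIME_YEARS = 1.5
SUPERCOMPUTER_GAP = 10_000

years = 20
pc_speedup = 2 ** (years / DOUBLING_TIME_YEARS)
print(f"Home PC speedup after {years} years: {pc_speedup:,.0f}x")
# ~10,321x, in the same ballpark as the assumed gap, so the 8-month
# supercomputer run would indeed come within reach of a home PC.
```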

Wed, 31 Aug 2011 00:58:35 UTC | #865721

jbyrd's Avatar Comment 4 by jbyrd

Comment 3 by Schrodinger's Cat :

According to Moore's law, a home PC should have enough power to create the same simulation (in 8 months) in just 20 years or so... which will be cool, as at the moment it's hard to find any home PC galaxy simulation with more than about 2,000 stars.

Unfortunately, the trend of Moore's law is expected to end around 2015 to 2020, as transistors will eventually reach the limits of miniaturization at the atomic level.

Wed, 31 Aug 2011 02:53:10 UTC | #865753

υβ' Vlaanderen's Avatar Comment 5 by υβ' Vlaanderen

Very impressive! The computational power behind the simulation is honestly staggering.

But no doubt the creationists will have an answer to this soon enough. Any word yet on whether a papier-mâché simulation of Noah's flood will be forthcoming?

Wed, 31 Aug 2011 03:03:07 UTC | #865758

rjohn19's Avatar Comment 6 by rjohn19

Not so sure, Schrodinger: my computer has more than 2,000 stars, though most are female and topless.

Also agree with jbyrd: all good things must come to an end, and I see no exception made for Moore's law. But even if Moore's law continued exponentially, we'd still not be able to use the computer's potential, because the latest version of Windows would suck up any gains given to us by the chip makers.

It's a vicious, voracious circle.

Wed, 31 Aug 2011 03:13:56 UTC | #865760

Robert Howard's Avatar Comment 7 by Robert Howard

Comment 3 by Schrodinger's Cat :

According to Moore's law, a home PC should have enough power to create the same simulation (in 8 months) in just 20 years or so... which will be cool, as at the moment it's hard to find any home PC galaxy simulation with more than about 2,000 stars.

I don't think you're right about that. My computer came with a built-in screensaver where it's like all the stars are flying towards you. I just put it on and I definitely counted more than 2000 of them.

It's not as good as the one with all the fishes in the fishtank, though.

Wed, 31 Aug 2011 04:49:35 UTC | #865775

Rothron the Wise's Avatar Comment 8 by Rothron the Wise

I don't think you're right about that. My computer came with a built-in screensaver where it's like all the stars are flying towards you. I just put it on and I definitely counted more than 2000 of them.

Are you joking? 2000 might be a little low, but you can't compare the simple starfield screensaver to something that involves simulating gravity.

In a gravity simulation, complexity scales as the square of the number of bodies, because every body interacts gravitationally with every other body.

This means that the difference in complexity between a 10-body simulation and a 10,000-body simulation is six orders of magnitude.
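To make the scaling concrete, here is a minimal direct-summation sketch in Python. The softening length is a standard numerical trick added for illustration, not something from the article; the nested loop is the reason for the N-squared cost:

```python
import numpy as np

def direct_forces(pos, mass, G=1.0, eps=1e-3):
    """Naive O(N^2) gravitational accelerations by direct summation.

    pos  : (N, 3) array of positions
    mass : (N,) array of masses
    eps  : softening length, avoiding the singularity as r -> 0
    """
    n = len(mass)
    acc = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r = pos[j] - pos[i]                   # vector from body i to body j
            d2 = r @ r + eps**2                   # softened squared distance
            acc[i] += G * mass[j] * r / d2**1.5   # softened inverse-square law
    return acc
```

Doubling N quadruples the work done in the nested loop, which is exactly where the 10-body versus 10,000-body factor of 10^6 comes from.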

Wed, 31 Aug 2011 06:44:55 UTC | #865783

Net's Avatar Comment 9 by Net

Amazing. Truly stunning stuff. I don't mind the music in the least; in fact, I love it. I still cringe a little when I read the mangled verb tenses, but then remind myself that these guys are performing wonders (I was going to say "miracles"), and doing it in a language which is not even their own. How many native speakers of English could pull that one off?

Wed, 31 Aug 2011 06:49:22 UTC | #865785

Robert Howard's Avatar Comment 10 by Robert Howard

Comment 8 by Rothron the Wise

Are you joking?

Yes.

Wed, 31 Aug 2011 07:09:52 UTC | #865789

hemidemisemigod's Avatar Comment 11 by hemidemisemigod

Supercomputer? Bah!

I simulated this sort of thing years ago with nothing more than a hot cup of tea, a teaspoon and some dodgy milk.

I was surprised to learn recently that the Andromeda Galaxy (which incidentally is on a collision course with ours) is so very large in the night sky. If we could see the faint stars that form its outer perimeter, it would appear more than 6 times the width of the moon.
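The "6 times the width of the moon" figure checks out with small-angle arithmetic. The diameter and distance below are commonly quoted values, assumed here rather than taken from the comment:

```python
import math

DISK_DIAMETER_LY = 140_000   # Andromeda's stellar disk (commonly quoted value)
DISTANCE_LY = 2_500_000      # distance to Andromeda (commonly quoted value)
MOON_DEG = 0.5               # apparent diameter of the full moon

angle = math.degrees(2 * math.atan(DISK_DIAMETER_LY / (2 * DISTANCE_LY)))
print(f"Andromeda spans {angle:.1f} degrees, "
      f"about {angle / MOON_DEG:.1f} moon widths")
# ~3.2 degrees, roughly 6 moon widths
```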

Wed, 31 Aug 2011 08:41:28 UTC | #865804

Schrodinger's Cat's Avatar Comment 12 by Schrodinger's Cat

Comment 8 by Rothron the Wise

2000 might be a little low

It is. I suspect my PC could handle 20,000 with less than a day's simulation.

One of the best simulator tools for PCs is Universe Sandbox. It's excellent for stars and planets. However, at the level of galaxies, though one may have 100,000 stars, for the purposes of galaxy collisions each galaxy is treated as just a single gravitational object at its core, which is not that realistic.

Wed, 31 Aug 2011 10:41:06 UTC | #865824

justinesaracen's Avatar Comment 13 by justinesaracen

Looks like some cheapo special effects to me.

Oh oh. Have I just started another conspiracy theory?

But srsly, you mean those galaxy simulations actually CONTAIN individual stars? How do they do that?

Wed, 31 Aug 2011 11:20:19 UTC | #865831

ANTIcarrot's Avatar Comment 14 by ANTIcarrot

Moore's law will almost certainly continue past 2020, depending on whether you take the 'power/cost' or 'transistors/area' definition:

Once you've shrunk transistors as far as they will go, you can start to layer them.

Once you've hit the absolute limit, you can still make your fabrication technology much cheaper.

Advanced materials (graphene etc.) are likely to have lower cooling requirements.

We haven't scratched the surface on clever ways to cool CPUs.

Wed, 31 Aug 2011 12:24:46 UTC | #865851

Schrodinger's Cat's Avatar Comment 15 by Schrodinger's Cat

Comment 11 by hemidemisemigod

I was surprised to learn recently that the Andromeda Galaxy (which incidentally is on a collision course with ours) is so very large in the night sky. If we could see the faint stars that form its outer perimeter, it would appear more than 6 times the width of the moon.

Galaxies are really quite faint. The spectacular Hubble images of the Sombrero galaxy, or images such as M51, are thousands of times brighter than what you would see if you were really at those distances from those galaxies. Andromeda is a mere 2 million light years away, just 20 times the diameter of our galaxy, and pretty much all you can faintly see with the naked eye is the central bulge. Alas, we need eyes several meters across to see a Hubble-type universe.

A cool Hubble simulation....

http://www.youtube.com/watch?v=e02aB49TFUA

Wed, 31 Aug 2011 12:45:03 UTC | #865857

BroughtyBoy's Avatar Comment 16 by BroughtyBoy

The making of Candy Floss? Sorry, but I'm unimpressed.

Wed, 31 Aug 2011 13:45:12 UTC | #865882

boogerjames's Avatar Comment 17 by boogerjames

Am I the only one who had a complete nerdgasm?

Wed, 31 Aug 2011 14:22:16 UTC | #865891

Red Dog's Avatar Comment 18 by Red Dog

Comment 3 by Schrodinger's Cat :

According to Moore's law, a home PC should have enough power to create the same simulation (in 8 months) in just 20 years or so... which will be cool, as at the moment it's hard to find any home PC galaxy simulation with more than about 2,000 stars.

Not necessarily. Moore's law can't go on forever. It's essentially based on the fact that we can squeeze more processing power into a smaller space, but eventually (and possibly fairly soon) we start hitting physical barriers based on the size of molecules and electrons, so that circuits just can't get any tighter. According to Wikipedia, "in 2003 Intel predicted the end would come between 2013 and 2018". There are theoretical ways to get around this, but IMO they are all very immature, and I think we are coming up fairly quickly on a time when the regular increases in computing power we take for granted may come to an end.

Wed, 31 Aug 2011 14:39:39 UTC | #865895

mmsood99's Avatar Comment 19 by mmsood99

It's a beautiful simulation, but I wonder if the results have been corrupted because the researchers knew their endpoint. The galaxy is known to have spiral arms, so the simulation has spiral arms. How much did knowing the endpoint affect the simulation?

If I were peer-reviewing this, I would be concentrating on the underlying theory and assumptions. One thing that must be true is that they did not treat all stars individually. If you accept that the galaxy has about 10^11 stars, then the pairwise complexity of calculating gravitational interactions scales as 10^22. Smarter algorithms, such as kd-trees, can reduce this, but they must have made some assumptions to simplify the complexity.
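The payoff from a tree method is easy to quantify. A rough comparison, assuming an O(N log N) tree algorithm such as Barnes-Hut (the comment mentions kd-trees; the scaling argument is the same):

```python
import math

N = 1e11  # stars in the galaxy (10^11)

direct = N ** 2           # pairwise interactions, O(N^2)
tree = N * math.log2(N)   # tree-based method, O(N log N)

print(f"direct summation: ~{direct:.0e} interactions")  # ~1e+22
print(f"tree method:      ~{tree:.0e} interactions")    # ~4e+12
print(f"reduction factor: ~{direct / tree:.0e}")        # ~3e+09
```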

Wed, 31 Aug 2011 15:23:29 UTC | #865913

huzonfurst's Avatar Comment 20 by huzonfurst

As for Moore's law, so far no one has considered a truly revolutionary way of computing (avoiding the phrase "paradigm shift" for now, although it may turn out to be appropriate), and there is one on the horizon called quantum computing. Not in the Deepak Chopra sense, of course, but in the sense of multiple operations being carried out simultaneously to provide exponential increases in computing power. Holographic storage has already been accomplished to some extent, so don't write off Moore just yet.

I have also heard of giving bits more depth than just 0 and 1, perhaps levels from 0 to 3 as a start, and who knows how far that could go? Think of where physics was only a century ago and just imagine what another century could bring at our accelerated pace of discovery!

Wed, 31 Aug 2011 15:46:56 UTC | #865920

Stevezar's Avatar Comment 21 by Stevezar

Comment 20 by huzonfurst :

As for Moore's law, so far no one has considered a truly revolutionary way of computing (avoiding the phrase "paradigm shift" for now, although it may turn out to be appropriate), and there is one on the horizon called quantum computing. Not in the Deepak Chopra sense, of course, but in the sense of multiple operations being carried out simultaneously to provide exponential increases in computing power. Holographic storage has already been accomplished to some extent, so don't write off Moore just yet. I have also heard of giving bits more depth than just 0 and 1, perhaps levels from 0 to 3 as a start, and who knows how far that could go? Think of where physics was only a century ago and just imagine what another century could bring at our accelerated pace of discovery!

Yes, Moore's law has been under a perpetual death sentence since it was first formulated, with Moore himself giving the first expiration date back around 1975.

Of course it can end; there is no guarantee one way or the other. But given the sorry record of all the end-of-the-law predictions up to now, I am just going to believe it when I see it. My guess is that 20 years from now, people will be saying, "Well, Moore's law is going to end in another 5-10 years".

There is a huge amount of money at stake. Underestimate the mix of monkey brains and the profit motive at your own peril!

Wed, 31 Aug 2011 16:10:06 UTC | #865927

mmsood99's Avatar Comment 22 by mmsood99

Sheesh, the system took out my up arrows. Of course I meant 100,000,000,000 stars and a complexity of 100,000,000,000 squared.

Wed, 31 Aug 2011 16:11:30 UTC | #865928

Geraint's Avatar Comment 23 by Geraint

Comment 19 by mmsood99 :

It's a beautiful simulation, but I wonder if the results have been corrupted because the researchers knew their endpoint. The galaxy is known to have spiral arms, so the simulation has spiral arms. How much did knowing the endpoint affect the simulation?

Some more details are in the article. Apparently they use about 60 million particles, which isn't all that many compared to large cosmological simulations, which can now contain tens of billions; but what's important here is the physics that is included in the modelling.

The formation of individual stars can't be tracked in a simulation of a whole galaxy, so you need some effective prescription which realistically accounts for where and when stars form, and for the effects of star formation on the galaxy: the energy the stars emit during their lifetimes and when they go supernova, the chemical enrichment of the interstellar medium, etc. One of the aims of this sort of simulation is to see just which physical processes are important in order to end up with a galaxy that looks like the sort of galaxies we see around us.
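To make "effective prescription" concrete, here is a schematic of the density-threshold recipe many galaxy-formation codes use, in the spirit of a Schmidt law; the parameter values are illustrative placeholders, not the recipe from this paper:

```python
def star_formation_rate_density(rho, rho_threshold=1.0, efficiency=0.02):
    """Schematic subgrid star formation recipe (illustrative only).

    Gas denser than rho_threshold turns into stars at a rate
    proportional to rho / t_dyn; since the local dynamical time
    scales as rho**-0.5, the rate scales as rho**1.5.
    """
    if rho < rho_threshold:
        return 0.0  # too diffuse: no star formation in this gas element
    return efficiency * rho ** 1.5
```

A prescription like this, plus similar recipes for supernova energy injection and chemical enrichment, stands in for all the physics below the resolution limit of the simulation.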

Knowing the endpoint doesn't really help at all. I'd have to look in the paper, but most simulations of this type start off with simple initial conditions motivated by observations of the cosmic microwave background: small fluctuations in density with the correct amplitude and scale. Typically, you run a simulation of a very large volume of the Universe, find an interesting region (e.g. one forming a galaxy), then go back and rerun the simulation at higher resolution, zoomed in on that interesting region. That ensures that the initial conditions aren't cooked and that the galaxy forms in the correct cosmological context. One can try to guess which regions are more likely to end up with nice disk galaxies (typically fairly quiet regions where the disk doesn't get disturbed by mergers with other galaxies etc.), run many simulations, and see what you end up with.

As for the underlying theory and assumptions, no doubt this simulation is an advance over previous ones, but (as far as I can tell from this little article, to know more I'd have to read the full paper properly) it's taking the next step on an established path, not doing anything dramatically different. Many groups are running simulations of this type.

Wed, 31 Aug 2011 16:45:30 UTC | #865933

Enders's Avatar Comment 24 by Enders

Comment 8 by Rothron the Wise :

I don't think you're right about that. My computer came with a built-in screensaver where it's like all the stars are flying towards you. I just put it on and I definitely counted more than 2000 of them.

Are you joking? 2000 might be a little low, but you can't compare the simple starfield screensaver to something that involves simulating gravity.

In a gravity simulation, complexity scales as the square of the number of bodies, because every body interacts gravitationally with every other body.

This means that the difference in complexity between a 10-body simulation and a 10,000-body simulation is six orders of magnitude.

I doubt they used the direct summation method without some enhancements.

Depending on the program (parallel CPU usage) and the PC (RAM, CPU cores), two 100,000-star galaxies colliding can be simulated quite quickly. I think "Gadget-2" even has a demo simulation of that, so if you have Linux you can compile it and run it on your PC to check how long it takes.

Wed, 31 Aug 2011 16:45:49 UTC | #865934

Red Dog's Avatar Comment 25 by Red Dog

Comment 21 by Stevezar :

Comment 20 by huzonfurst :

Yes, Moores law has been under a perpetual death sentence since it was first formed, Moore himself giving the first expiration date back around 1975.

I've been working in IT since the 1980s, and I don't recall Moore's law being "under a perpetual death sentence". You are correct that when he first described the law it wasn't understood how long the increases could continue. But after that time, at least as far as I know, everyone in the IT world believed Moore's law would keep going until we ran up against fundamental physical constraints such as the size of the atom. We are now getting fairly close to those limits.

Wed, 31 Aug 2011 17:02:10 UTC | #865938

jbyrd's Avatar Comment 26 by jbyrd

Comment 8 by Rothron the Wise :

I don't think you're right about that. My computer came with a built-in screensaver where it's like all the stars are flying towards you. I just put it on and I definitely counted more than 2000 of them.

Are you joking? 2000 might be a little low, but you can't compare the simple starfield screensaver to something that involves simulating gravity.

In a gravity simulation, complexity scales as the square of the number of bodies, because every body interacts gravitationally with every other body.

This means that the difference in complexity between a 10-body simulation and a 10,000-body simulation is six orders of magnitude.

Humor is often wasted here...

PS: I thought it was funny.

Wed, 31 Aug 2011 17:05:36 UTC | #865940

mmsood99's Avatar Comment 27 by mmsood99

@ Comment 23 by Geraint

That is exactly my point. Any model this complex must be parameterized, and I fear a positive feedback mechanism whereby parameters are tweaked to fit an expected end point, the upshot of which is over-fitting.

The tweaking need not be willful. For example, given two different models of star agglomeration, it is tempting to take the one that most closely delivers the expected end point.

We have the same problem in our domain: modeling molecular interactions demands a parameterized model, and the parameters never transfer across elements, so parameters derived for C atoms don't carry over to N (even though the model uses underlying physical properties such as radius and charge density). This of course proves the model isn't "physical", but rather a parameterized simulation.

Wed, 31 Aug 2011 17:08:40 UTC | #865942

Geraint's Avatar Comment 28 by Geraint

The tweaking need not be willful. For example, given two different models of star agglomeration, it is tempting to take the one that most closely delivers the expected end point.

Well, how problematic this is depends on what you're trying to find out. The fact that we see spiral galaxies is an observational datum, and our models should be able to produce them, or else they must be either wrong or incomplete.

If, for example, it was impossible to make spiral galaxies in a LambdaCDM cosmology whatever you assumed about star formation, that would be interesting to know. It's not interesting to know that 'LambdaCDM + the first model you thought of with some parameter values you chose arbitrarily because they can't be derived from first principles' can't produce spiral galaxies. That doesn't help you to isolate or test individual parts of the theory.

If you have a variety of star formation recipes which can make spiral galaxies, you can try to come up with tests to choose between them, and the simulations can help generate the predictions which you then test. Some may then be ruled out whatever parameters you assume, or might require inconsistent parameter choices to match different observations. But I don't really think the aim of these simulations is to fit the parameters of effective physical recipes, since that isn't really very informative about the smaller-scale processes which underlie the recipes. If it was possible to derive the parameters from smaller scale physics, or to simulate the smaller scales directly, you'd do that. In the face of genuine uncertainty, you don't want to abandon hope of learning about one part of a theory because another part is incomplete.

Wed, 31 Aug 2011 17:38:36 UTC | #865951

rocket888's Avatar Comment 29 by rocket888

This reminds me of one of the first programs I wrote, long ago. It produced a graph. I showed it to my boss, who was expecting me to do the analysis by hand. He said, "Cool... hmmm, but is it right?"

How do you verify the accuracy of the program? If it doesn't produce the expected output, you look for a bug. Forget about proving correctness; that fad simply showed that it is probably more than 10x harder to prove a program correct than to write the program.

I concur with mmsood99. I bet that if the run hadn't produced what they knew the result should be, they'd have gone looking for what was wrong. But if it does produce the expected results, what is the motivation to look for flaws?

With all the different shapes and types of galaxies, can we really expect that some "butterfly" effect billions of years ago does not change the result considerably?

I'm just a bit skeptical of computer models, having spent the last 20 years helping people get the results right. But in our case we did in fact know what the results should be, given at least a particular input data set; and still, when run for real, plenty of surprises cropped up.
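For this particular kind of code there is at least one check that doesn't require knowing the answer in advance: quantities that must be conserved regardless of the astrophysics, such as total energy. A toy sketch of that test (my own example, not the paper's actual validation suite):

```python
import numpy as np

def accelerations(pos, mass, G=1.0):
    """Direct-summation gravitational accelerations (unsoftened;
    fine here because the two bodies never get very close)."""
    n = len(mass)
    acc = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i != j:
                r = pos[j] - pos[i]
                acc[i] += G * mass[j] * r / np.linalg.norm(r) ** 3
    return acc

def total_energy(pos, vel, mass, G=1.0):
    """Kinetic plus pairwise potential energy of the system."""
    kinetic = 0.5 * np.sum(mass * np.sum(vel**2, axis=1))
    potential = sum(
        -G * mass[i] * mass[j] / np.linalg.norm(pos[i] - pos[j])
        for i in range(len(mass))
        for j in range(i + 1, len(mass))
    )
    return kinetic + potential

# Two equal masses on a bound orbit about their common centre of mass.
mass = np.array([1.0, 1.0])
pos = np.array([[-0.5, 0.0, 0.0], [0.5, 0.0, 0.0]])
vel = np.array([[0.0, -0.5, 0.0], [0.0, 0.5, 0.0]])

e0 = total_energy(pos, vel, mass)
dt = 0.001
for _ in range(10_000):  # leapfrog (kick-drift-kick) integration
    vel += 0.5 * dt * accelerations(pos, mass)
    pos += dt * vel
    vel += 0.5 * dt * accelerations(pos, mass)

drift = abs(total_energy(pos, vel, mass) - e0) / abs(e0)
print(f"relative energy drift after 10,000 steps: {drift:.1e}")
# Leapfrog is symplectic, so the drift should stay tiny; a sign error
# or a broken force law in the code would show up here immediately.
```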

Wed, 31 Aug 2011 20:04:49 UTC | #865995

Mr DArcy's Avatar Comment 30 by Mr DArcy

I know nothing of computer programming, but if the programmers are trying to work out a possible scenario for the formation of a galaxy such as the Milky Way, then it seems perfectly "fair" to me that they should have that observed reality as their end point. Doesn't the whole of cosmology have to work by extrapolating what we do know back into the past?

Wed, 31 Aug 2011 20:26:06 UTC | #866011