The Known Universe Scientifically Rendered For All to See (by AMNH)

  1. Jan 8, 2010 #1

    DevilsAvocado

    Gold Member

    I just have to share the most beautiful and amazing video I have ever seen.

    The Known Universe, a new film produced by the American Museum of Natural History, is for everyone with even the slightest interest in our place in the Universe. If you haven’t already seen it (+2 million views on YouTube in less than a month!), enjoy the trip of your life!

    The video is based on real data (Sloan Digital Sky Survey), not an artist’s conception.

    https://www.youtube.com/watch?v=17jymDn0W6U
    (Don’t miss the 'HD option' for better video quality)

    I recommend a visit to AMNH or YouTube for the 'Full Screen option', it’s wonderful:
    http://www.amnh.org/news/2009/12/the-known-universe/

    http://www.sdss.org/

    Questions anyone? :smile:
     
    Last edited by a moderator: May 4, 2017
  3. Jan 8, 2010 #2

    DevilsAvocado

    Gold Member


    Wanna explore the Universe yourself?

    Well, it’s no problem! The American Museum of Natural History and the Hayden Planetarium have engaged in the 3-dimensional mapping of the Universe:

    "The Digital Universe Atlas is distributed to you via packages that contain our data products, like the Milky Way Atlas and the Extragalactic Atlas, and requires free software allowing you to explore the atlas by flying through it on your computer."

    [Image: Milky Way Atlas]

    The package consists of The Digital Universe Atlas and Guide plus the free Partiview software (http://virdir.ncsa.illinois.edu/partiview/), an industrial-strength, interactive, mono- or stereoscopic viewer for 4-dimensional datasets, from the National Center for Supercomputing Applications (NCSA).

    http://www.haydenplanetarium.org/universe/download/

    [Image: Partiview screen grab]

    Enjoy!
     
    Last edited by a moderator: May 4, 2017
  4. Jan 10, 2010 #3

    PhanthomJay

    Science Advisor
    Homework Helper
    Gold Member

    Quite fascinating! I do have a question:

    Our Hubble and other space telescopes see the light from the early galaxies and CBR reaching their lenses as they existed 13.7 billion years ago. I suspect that if someone 'now' in our time, living 13.7 billion light years away from us, were to peer into their telescopes, they would see the same stuff as we do: the early galaxies and CBR, correct? Which means that actually, at this instant, what we see as the CBR and early galaxies is now occupied by galaxies and planets something like ours, which we can never have any way of knowing, because the light from those present galaxies won't reach us for another x billion years, if ever. Is this correct?? Or does the definition of time and instantaneity and spacetime expansion etc. mess up this logic?
     
  5. Jan 10, 2010 #4

    DevilsAvocado

    Gold Member


    Yes, that’s correct (though I must emphasize – I’m only a layman). When looking back at distant galaxies we (now) see the very old light that was emitted billions of years ago. To make things a little more 'interesting', it’s difficult to talk about a universal 'now', due to Einstein’s theory of special relativity, where space and time are not 'fixed'.

    [Animation: Lorentz transformation of the world line of a rapidly accelerating observer moving in a 1-dimensional (straight line) "universe"]

    This question has puzzled me for some years, and I thought I had all the 'basic' information needed to answer it (https://www.physicsforums.com/showthread.php?p=2508918#post2508918) – almost. :biggrin:

    Right now my brain is overheated, trying to digest all the new info. My feeling though, is that it is comprehensible, if you manage to make a 'working picture' in your (layman) head... :rolleyes:
     
    Last edited by a moderator: Apr 24, 2017
  6. Jan 10, 2010 #5

    Chronos

    Science Advisor
    Gold Member

    We get a distorted view of the universe. Spacetime is compressed [or expanded, if you prefer] by the Hubble flow. It is difficult to say how it 'really looks' at any universal instant in time, as all such projections are model dependent. Even tiny inaccuracies become exponentially exaggerated over billions of light years.
     
  7. Jan 11, 2010 #6

    Wallace

    Science Advisor

    Essentially yes, you are correct. There are some complications about how you define distances: just because the Universe is 13.7 billion years old doesn't mean that the observable universe has a radius of 13.7 billion light years. That's because defining distances over cosmological regions is ambiguous and depends on some arbitrary definitions. The other arbitrary thing is defining when 'now' is over cosmological distances. People often forget this important fact about relativity: events defined as simultaneous for one observer need not be simultaneous for others. Generalised to cosmology, this means that there is no unambiguous way to say what something at cosmological distances is doing 'now', since to find that out, you'd need light from there to reach us, which will take longer than the current age of the Universe to arrive...
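
    As a rough illustration of that distance subtlety: in flat ΛCDM the comoving distance to the last-scattering surface comes out near 46 billion light years even though the light travelled for only 13.7 billion years. A minimal numerical sketch (assumed parameters H0 = 70 km/s/Mpc, Ωm = 0.3, ΩΛ = 0.7; radiation is ignored, so the high-redshift end is slightly off):

        import numpy as np

        # Comoving distance D_C = (c/H0) * integral_0^z dz'/E(z') in flat LCDM.
        c = 299792.458                            # speed of light, km/s
        H0, Om, OL = 70.0, 0.3, 0.7               # assumed cosmological parameters
        z = np.linspace(0.0, 1089.0, 1_000_000)   # out to the last-scattering surface
        E = np.sqrt(Om * (1.0 + z)**3 + OL)       # dimensionless Hubble rate H(z)/H0
        f = 1.0 / E
        D_Mpc = (c / H0) * np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(z))  # trapezoid rule
        print(D_Mpc * 3.2616e-3, "Gly")           # ~46 billion light years, not 13.7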

    In practice, we often adopt the practical definition of using the temperature of the CMB to define 'now'. This is a good natural definition, since all observers will see the same kinds of things at the same CMB temperature. These days there are fewer quasars around than when the CMB was a bit hotter, for instance. All observers in the universe would see the same thing (on average) as their observed CMB temperature evolves.

    So, a more careful, complication-free way of saying things might be:

    "The regions of the Universe which emmitted the CMB photons we see today will consist of planets, stars, galaxies etc in much the same way as the region around us does at the time in which an observer there would see an CMB temperature of 2.7K"

    This statement is fundamentally unprovable directly, but it is a consequence of the model for the Universe we have devised based on observations.
     
  8. Jan 11, 2010 #7

    PhanthomJay

    Science Advisor
    Homework Helper
    Gold Member

    Thank you all!
     
  9. Jan 11, 2010 #8

    DevilsAvocado

    Gold Member

    Yes, you are absolutely right. Not to mention the 'distortion of time' – a 13.7-billion-light-year trip in only 6 minutes!? :confused:

    Besides the obvious 'trouble' with Mr. Einstein’s "Now" – there are similar 'difficulties' in projecting the 3D surface of a 'sphere' onto a 2D computer screen...

    I don’t think the aim of this video is to give a complete and correct picture of the curvature and topology of the observable universe – it’s more about providing a feel for sizes and dimensions, compared to Earth.

    Just imagine what Pope Urban VIII would have looked like if Galileo Galilei in 1633 could have shown him this video as a supplement to his book The Dialogue Concerning the Two Chief World Systems... Perhaps the Pope would have looked something like this... :surprised
    http://upload.wikimedia.org/wikipedia/en/thumb/c/ca/Galileos_Dialogue_Title_Page.png/450px-Galileos_Dialogue_Title_Page.png
    :rofl:
     
    Last edited by a moderator: May 4, 2017
  10. Jan 11, 2010 #9

    Wallace

    Science Advisor

    Since Urban VIII and his ilk refused to look through Galileo's telescopes (claiming that whatever was seen could have just been imperfections in the telescope...), I suspect he would similarly dismiss this video. Either that, or burn you as the witch you must be, with your magic moving painting :biggrin:
     
  11. Jan 11, 2010 #10

    DevilsAvocado

    Gold Member

    Hehe! :biggrin:

    Absolutely correct conclusion. I suspect Urban VIII would claim it’s a magic bug in the system… and put both Galileo Galilei and Bill Gates + bug in the 'ovens' at max temp... :rofl:
     
  12. Jan 11, 2010 #11

    DevilsAvocado

    Gold Member

    Brilliant! Thanks!
     
  13. Jan 11, 2010 #12

    DevilsAvocado

    Gold Member


    I’ve been working on 'digesting' the Hubble volume, the observable universe, c, the CMB, the cosmological principle, etc. in this thread: https://www.physicsforums.com/showthread.php?p=2514396#post2514396 :smile:

    According to Ned Wright (http://www.astro.ucla.edu/~wright/photons_outrun.html), this is how we should visualize the expansion of space, with two (originally) 'nearby' galaxies emitting photons:

    [Animation: photons emitted by two 'nearby' galaxies in expanding space (Ned Wright)]

    Ned Wright:
    "However, all parts of the Universe started with CMBR photons, not just the two green galaxies. The picture below shows the result of releasing a ring of 72 red photons from every dot on the picture. It makes a pretty quilt pattern, but except for this pattern imposed by the artificial regularity of my galaxy grid this pattern of photons is homogeneous and isotropic, as specified by the cosmological principle."

    [Animation: rings of 72 photons released from every galaxy on the grid (Ned Wright)]

    If we look at the video at around 3:30 we see the CMB as a sphere surrounding the very distant supernovas and distant galaxies, and finally the Earth:
    [Image: video frame at ~3:30, the CMB as a sphere]

    Another perspective of the evolution of the (observable) universe:
    [Image: timeline of the evolution of the observable universe (NASA/WMAP)]

    Now, my question is:
    We cannot see the light from distant galaxies and supernovas and at the same time see the CMB from these objects/regions, right? These CMB photons must have passed us a long time ago, right?

    If this is correct – shouldn’t there be a minor 'gap' in the CMB somewhere (in Ned Wright’s picture e.g.)...?

    (Maybe a stupid question...? :uhh:)
     
    Last edited by a moderator: May 4, 2017
  14. Jan 11, 2010 #13

    Wallace

    Science Advisor

    Correct.

    Hmm, I don't really follow this, I'm not sure why you think there should be a gap?

    Say we look in the direction of some galaxy. The CMB photons that were sent from the region around that galaxy at the time of recombination (when the CMB was sent on its way) have indeed passed by our location at some time in the past. However, we can still see CMB photons coming from that direction; the regions where they originated are just even further away.

    Maybe this will help: if we look in any direction, then for every second that passes, we see CMB photons that originated from a location further and further away. Think of it like a long line of soldiers lined up in front of you, each a bit further from you than the last. If they all fire their guns at you at once, you'll be hit by a succession of bullets, each subsequent one originating from a location further away than the previous one.
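
    If it helps to see the soldier picture with numbers, here is a toy sketch (deliberately ignoring expansion, so the distances are only illustrative): every photon was emitted at recombination, and the shell we receive from grows with observation time.

        # Toy static-space model: all CMB photons are emitted at t_rec; the photons
        # arriving at time t_obs come from a shell at distance c * (t_obs - t_rec).
        c = 1.0                 # 1 light-year per year
        t_rec = 3.8e5           # recombination, years after the Big Bang
        for t_obs in (1.0e9, 5.0e9, 13.7e9):       # observation times, in years
            d_shell = c * (t_obs - t_rec)          # emission distance of photons seen now
            print(f"t_obs = {t_obs:.1e} yr -> shell at {d_shell:.3e} ly")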

    Note that this means observing the CMB is in principle a little different from observing a galaxy: with a galaxy you continuously see photons from the same object, even if that object is getting further and further away with each passing moment, whereas with the CMB we are continually seeing radiation from different 'objects' each passing moment. In practice human lives are too short for us to measure the difference, so it makes no practical difference.
     
  15. Jan 11, 2010 #14

    DevilsAvocado

    Gold Member


    I don’t know how to thank you! Many many thanks!!!

    I feel this is slowly being accepted by my confused brain, (earlier) lost in translation (from math). This explains the CMB 'in a nutshell': "when we look at the CMB, we are continually seeing radiation from different 'objects' each passing moment"

    Cool!!

    Now, some new 'thoughts' pop up: In the young universe, let’s say around the formation of the first stars (400 million yrs after the BB), the 'night sky' must have been extremely bright, right? A pretty hot CMB (about 3000 K at recombination, still some 30 K by the time of the first stars) with high-energy photons, and a more 'compressed' universe, right?
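
    For reference, the standard scaling of the CMB temperature with redshift (T_0 ≈ 2.725 K today; the z ≈ 11 figure for 400 Myr is my rough number):

        T(z) = T_0 (1 + z)  =>  T(z ≈ 1100) ≈ 3000 K,  T(z ≈ 11) ≈ 2.725 K × 12 ≈ 33 K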

    But then again, the chances for intelligent amoebas with eyes :bugeye: were not outstanding at this early stage, so maybe no one saw it...? :biggrin:

    Another question: the solution to Olbers' paradox is the fact that the universe expands and has a finite age. Is this also the reason the fermions (matter) don’t get in the way of (block) the CMB from more distant parts of the universe?
    (Or did I just prove that I don’t understand this at all?? :blushing:)
     
  16. Jan 11, 2010 #15
    I am sure that Wallace will correct me if I am wrong, but the CMB photons are the photons left over from after the temperature dropped enough for stable atoms to form and the universe became transparent. So they are not exactly radiation from different 'objects'.

    In the early opaque universe, these photons were bouncing around through Thomson scattering. When the universe diluted and cooled enough for the charged particles to combine into atoms, it became transparent, allowing the photons to move freely. The result is that there were photons zooming in from every possible direction, which we register today as the CMB.
     
  17. Jan 12, 2010 #16

    Wallace

    Science Advisor

    My bad on the use of 'objects'. You describe the origin of the CMB correctly; I used 'objects' in scare quotes to help explain the idea of receiving the radiation from successively more distant regions as time goes on. By 'object' I meant some small region of hot gas, but I should have spelled that out to avoid any confusion!
     
  18. Jan 12, 2010 #17

    DevilsAvocado

    Gold Member


    Thanks S.Vasojevic & Wallace for the clarification.

    The CMB doesn’t belong to a specific 'object' (electron/molecule/star/galaxy), but is the remaining 'glow' (everywhere) from the very hot BB.

    Any thoughts on this: https://www.physicsforums.com/showthread.php?p=2527155#post2527155

    It’s a little puzzling to me that the CMB can penetrate the 'wall' of matter surrounding us. In the Millennium Run (http://en.wikipedia.org/wiki/Millennium_Run), a computer N-body simulation, a cube about 2 billion light years on a side is populated by about 20 million 'galaxies' (and over 10 billion 'particles' of dark matter). It seems pretty dense... and hard for the more distant CMB to penetrate (at least I would expect some 'imprint' on the CMB as a result)...?

    https://www.youtube.com/watch?v=yyfpFfWq7Bc
     
    Last edited by a moderator: May 4, 2017
  19. Jan 12, 2010 #18

    Wallace

    Science Advisor

    Indeed, there are various 'imprints' left on the CMB by the structures in the Universe. One of the more important of these is the integrated Sachs-Wolfe (ISW) effect, in which CMB photons gain or lose energy crossing large-scale gravitational potentials that evolve while the photons are in transit. Its existence is actually an important independent piece of evidence for dark energy; you only get a late-time ISW effect in universes with dark energy.

    Another important one is the Sunyaev-Zel'dovich (SZ) effect, by which the hot gas in clusters leaves an imprint on the CMB on small scales through inverse Compton scattering (i.e. the CMB photons bouncing off the hot electrons in the cluster). We can use this to find clusters in the sky and hence learn how many clusters there are and how they are distributed. Measuring this is pretty cutting edge, with the first SZ-discovered clusters being found only in the last year or two by the South Pole Telescope.
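
    For those who want the standard parametrisation of the thermal SZ effect, it is usually written via the Compton y-parameter (n_e and T_e are the cluster's electron density and temperature, σ_T the Thomson cross-section; the temperature decrement below holds in the low-frequency Rayleigh-Jeans limit):

        y = (σ_T / (m_e c²)) ∫ n_e k_B T_e dl,      ΔT/T ≈ −2y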

    You can find some info in detail (and some very nice animations) on Wayne Hu's site. Just google 'Wayne Hu CMB' or something like that.

    In addition to things that imprint a useful signal onto the CMB, there are also things that imprint a lot of noise that doesn't tell us anything very interesting but makes it harder to extract just the CMB. The Milky Way is the biggest source of such noise. Removing all these unwanted 'foregrounds' is a major part of the processing of CMB data, and takes a lot of effort and clever techniques.

    The other way to think about why the CMB isn't completely blocked by a 'wall of matter' is to realise that once structures like galaxies and clusters have formed, the density contrast in the Universe is huge. Essentially you have very dense blobs surrounded by vast regions of near-vacuum. Think about our own solar system: in terms of a volume average, it is almost completely dominated by nearly empty space. So even though there are a lot of galaxies etc. in the Universe, they are small in size compared to the space they occupy, which means there are plenty of free lines of sight from us to the CMB that don't have anything in the way.
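
    A back-of-the-envelope version of that last point (all numbers here are rough assumptions of mine, not measured values):

        import math

        # Rough covering-fraction estimate: how many luminous galaxies does a random
        # sightline to the last-scattering surface actually intercept?
        n = 0.01          # galaxy number density, Mpc^-3 (assumed order of magnitude)
        r = 0.015         # typical luminous galaxy radius, Mpc (~15 kpc, assumed)
        L = 14000.0       # comoving path length to the CMB, Mpc
        tau = n * math.pi * r**2 * L   # expected interceptions per sightline
        print(f"expected interceptions ~ {tau:.2f}")   # ~0.1, so most sightlines are clear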
     
  20. Jan 13, 2010 #19

    DevilsAvocado

    Gold Member


    Very interesting and amazing info, thanks!

    Here I am, a mumbling :uhh: rambling layman speculating on the 'properties' of the CMB, and it looks like I was somehow right!? The universe is a fantastic place!! :cool:

    Confirming DE and finding unknown clusters from the CMB is amazing.

    When you mentioned the Milky Way, one of my sleepiest neurons woke up and said – Hey! I’ve seen this!? And of course I should have remembered (before asking questions :wink:)... this is obvious (even to me):

    [Image: WMAP five-year K-band sky map, dominated by Milky Way foreground emission]

    Fantastic achievement to manage to remove this 'blob'!

    I found Wayne Hu's site (http://background.uchicago.edu/~whu/), and it contains a lot of useful info and beautiful animations. I’m especially fond of this one (by Andrey Kravtsov):

    [Animation: N-body structure formation simulation by Andrey Kravtsov]

    A bigger version can be found at http://cosmicweb.uchicago.edu/filaments.html. (Must be something wrong with my brain – this excites me more than any Hollywood SFX!? :biggrin:)

    I also found P3D (by Andrey Kravtsov?) at http://astro.uchicago.edu/~andrey/soft/p3d/p3d.html, where one can find more tools (for Linux).

    Then I found that PF (of course!) already has a thread (https://www.physicsforums.com/showthread.php?t=274265) on George Smoot (COBE) talking at Serious Play 2008!! This wraps it all up quite nicely. A fine supplement to the video from AMNH:

    "At Serious Play 2008, astrophysicist George Smoot shows stunning new images from deep-space surveys, and prods us to ponder how the cosmos -- with its giant webs of dark matter and mysterious gaping voids -- got built this way."

    George Smoot: The design of the universe
    https://www.youtube.com/watch?v=c64Aia4XE1Y

    http://video.ted.com/talks/podcast/GeorgeSmoot_2008P_480.mp4


    Edit:
    For those interested in high-res videos of the Millennium Simulation (and a spectacular fly-through), they can be found here:
    http://www.mpa-garching.mpg.de/galform/virgo/millennium/
     
    Last edited by a moderator: May 4, 2017
  21. Jan 14, 2010 #20

    Wallace

    Science Advisor

    I'm not sure how much you'd be interested, DA, but installing Linux as a dual-boot option alongside Windoze is insanely easy these days with the latest distributions of Linux. You could then use some of those tools you found that run only under Linux/Unix, if that floats your boat. I use Ubuntu, but Fedora, SUSE etc. are all pretty simple to use; not like in the old days, when you needed to be an expert just to get it installed.

    Possibly other distributions offer this as well, but with Ubuntu you can download and burn onto a CD a 'live CD' image which boots straight from the CD, allowing you to try out Ubuntu without having to install it on your hard drive. It's slower in this mode, but it might even be enough to let you play with some of those apps you've found.
     
  22. Jan 15, 2010 #21

    DevilsAvocado

    Gold Member


    Yes, dual boot could be one option. I have VMware Workstation (http://en.wikipedia.org/wiki/VMware_Workstation), so that will probably be even easier, but then it could be a question of speed (since VMware eats some CPU)...?

    [Image: VMware Workstation screenshot (Wikipedia)]

    But, I’ve done some more 'digging' on P3D and downloaded the file http://cfcp.uchicago.edu/~andrey/soft/p3d/p3d.tar.gz. P3D plots via PGPLOT, a "device-independent graphics package for making simple scientific graphs".

    [Image: PGPLOT example plot]
    "PGPLOT has been tested with UNIX (most varieties, including Linux, SunOS, Solaris, HPUX, AIX, Irix, and MacOS X/Darwin) and OpenVMS operating systems. I am unable to provide support for DOS, Microsoft Windows, but I do distribute code provided by users for use with these operating systems."

    And there is an option to run it on Windows: http://www.astro.caltech.edu/~tjp/pgplot/install-windvf.html.

    I looked some more at the code to see what’s going on (I’m no Fortran guru). It doesn’t look like the worst 'rocket science' I’ve seen, but one thing is a little 'peculiar':
    "Demo programs use a particle distribution (particles.dat) drawn randomly (1% of all the particles in the simulation is shown) from a simulation of 5 Mpc volume of the CDM model using Adaptive Refinement Tree N-body code (simulation is by A.Kravtsov and G.Yepes).

    The input format is arbitrary and can be modified, as needed. The most important is the range of coordinate values. If particles are distributed in a cube, the code assumes that *3D particle coordinates are in the range [-0.5,0.5]* with center of the cube having coordinates {0,0,0}. This is a matter of convention, rescaling coordinates to this range is user's responsibility. Particles do not have to be distributed in a cube. The coordinates may span a larger (and different in each of 3 axis) range, if needed. They should be however centered around the point {0,0,0}."
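
    Since the docs leave that rescaling to the user, here is a minimal sketch of the step (the [-0.5, 0.5] cube centred on {0,0,0} comes from the quote above; the function and file names are my own illustration):

        import numpy as np

        def rescale_to_unit_cube(xyz):
            """Map raw particle coordinates into [-0.5, 0.5], centred on {0, 0, 0}."""
            lo, hi = xyz.min(axis=0), xyz.max(axis=0)
            centre = (lo + hi) / 2.0
            span = (hi - lo).max()        # one scale for all axes avoids distorting shapes
            return (xyz - centre) / span  # widest axis now spans exactly [-0.5, 0.5]

        # xyz = np.loadtxt("my_particles.txt")   # hypothetical input file
        # xyz = rescale_to_unit_cube(xyz)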
    The particles.dat file is in binary form, but is accessible through this code:
    [Screenshot: Fortran code for reading particles.dat]

    And then I saw that the P3D code is from 1998!? And PGPLOT originates from 1994!? This is old stuff?? (...and I’m not talking about the universe... :smile:)
    [Screenshot: file listing with dates]

    I’m confused... :confused: In the video at 15:10 George Smoot says: "So I’m going to show you one that can be run on a desktop in two days". Two days...?? Could this really be accurate?

    In the demos Kravtsov sets "parameter ( npmax = 300000 )", i.e. max 300 000 particles... And a $100 mid-range graphics card can do ~40 gigaflops / 4 billion texels / 40 million polygons per second!

    What is taking all this time?? Is it the 'gravitational calculations'? The main loop for demo1.f is this:
    [Screenshot: the main loop of demo1.f]

    And the only 'gravitational' thing I can find in the code is the (not that big) subroutine PutRedshift:
    [Screenshot: subroutine PutRedshift]

    Can this really take two days to calculate for 300 000 particles...

    (...or is George Smoot talking about 1998 hardware performance...?)

    My guess is that this will run fast as h**l on a fresh computer with a Core 2 Duo (or better) and a decent video card with OpenGL or DirectX... maybe even real-time rendering... or are the gravitational 'particle-particle interactions' going to spoil all my hope...

    It would be pretty cool if one could do a conversion and update of Andrey Kravtsov’s program, with a slick GUI and options to change the parameters for DM etc.! I know that OpenGL/DirectX handles rendering fast and effectively. (In the early nineties I wrote a simple '3D editor' for 'rotating points', with connecting lines, in DOS/Borland C++/assembler. The rendering was of course in real-time, and nowadays the hardware is vastly faster...)

    There are three 'reservations':
    • Getting the data out of particles.dat. I guess this data is crucial. Any random coordinates will not do; it has to be smooth and random, but not too smooth, right?

    • Translating the 'N-body gravitational routines' correctly. As far as I can see the 'magic' runs in the PutRedshift subroutine, but I could be wrong...? What method has Andrey Kravtsov used? The 'tree method', the 'particle mesh method', or some other? Is it Newtonian gravity or GR?

    • Maybe I have heavily underestimated the complexity of the 'particle-particle interactions' and the CPU/time it takes to calculate/process, and it’s a dead end not worth the time... or...?
    But then I just realized something about particles.dat: "the CDM model using Adaptive Refinement Tree N-body code"!? The N-body gravitational routines (tree method) have already been run!? Weird? But maybe very good for the 'real-time speed'...

    Well, I’ll wait for some clever thoughts on this, hopefully. :smile:
     
    Last edited by a moderator: May 4, 2017
  23. Jan 15, 2010 #22

    Wallace

    Science Advisor

    You're in luck, I use a lot of these bits of software on a daily basis, so I think I can help out.

    So, P3D is a package for visualising the outputs of N-body simulations, not actually doing the simulations themselves. I'm not sure what format the input files are in, but they would be the data dumps (basically a list of particle positions and any other relevant info) produced by a much larger code, which actually does the simulation.

    PGPLOT is great. Not very pretty by today's standards, but solid nonetheless.



    This is because the simulations use millions (or even billions) of particles, requiring many linked parallel processors just to keep them in memory. If you made that into a graphics file it would be far too big. Therefore, when doing visualisations, you first thin the distribution down to a manageable number of particles which still traces the same mass distribution.
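
    A minimal sketch of that thinning step (my own illustration, not P3D's actual code), keeping a 1% random subsample of the particle positions:

        import numpy as np

        rng = np.random.default_rng(seed=42)
        xyz = rng.random((300_000, 3)) - 0.5      # stand-in for a real snapshot's positions
        keep = rng.choice(len(xyz), size=len(xyz) // 100, replace=False)
        subsample = xyz[keep]                     # 1% of the particles, same large-scale distribution
        print(subsample.shape)                    # (3000, 3)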

    Yep, simulations have gotten bigger, but they still just produce a distribution of particles, so a tool like this doesn't go out of date. That being said, there is a very nice 3D package called S2PLOT (google it) which uses modern graphics capabilities to make much sexier pictures. The old stuff like P3D still works fine for research purposes (i.e. looking at your simulation to get a quick visual check that all is well).

    Again, I think you are missing a step. I'm not sure what code they are using to do the simulations, but it's not P3D; that just does visualisations of data files made by a simulation, it doesn't actually do any physics.

    As for run time, think of it like this. If you have N particles, then to know the gravitational interactions you need to evaluate the vector components of Newton's law of gravity between every pair of particles, which is N(N-1)/2, i.e. of order N², operations every time step. For large N, this becomes a very, very big number. In practice, cosmological codes use clever algorithms that reduce the required computation time enormously (at the cost of some accuracy) compared to this simplest 'direct force' approach, but it is still hard work. Compared to, say, computer games that use graphics cards to implement physics for particle tracks (to model explosions etc.), the reason those are so much faster is that there is no self-gravity between the particles; they are all just being accelerated by a uniform background field. It's much harder when the gravity field itself changes as the particles move!
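
    To make that N² pair cost concrete, here is a toy direct-summation step in Python (a sketch only, emphatically not how GADGET-2 or any production code works, since those use tree or particle-mesh algorithms instead of this brute-force loop):

        import numpy as np

        def accelerations(pos, mass, G=1.0, soft=1e-2):
            """Direct-summation O(N^2) pairwise Newtonian accelerations (Plummer softening)."""
            d = pos[None, :, :] - pos[:, None, :]          # d[i, j] = pos[j] - pos[i]
            r2 = (d ** 2).sum(axis=-1) + soft ** 2         # softened squared separations
            np.fill_diagonal(r2, np.inf)                   # exclude self-interaction
            return G * (mass[None, :, None] * d / r2[:, :, None] ** 1.5).sum(axis=1)

        # One leapfrog (kick-drift-kick) step for a small random toy system:
        rng = np.random.default_rng(0)
        n = 512
        pos = rng.random((n, 3)) - 0.5                     # positions in a unit box
        vel = np.zeros_like(pos)
        mass = np.full(n, 1.0 / n)                         # equal-mass particles
        dt = 1e-3
        vel += 0.5 * dt * accelerations(pos, mass)         # half kick
        pos += dt * vel                                    # drift
        vel += 0.5 * dt * accelerations(pos, mass)         # half kick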

    Exactly how long a simulation takes depends on more than the number of particles, because of how these algorithms work. I can believe that a 300,000 particle sim could take 2 days on a high-end modern desktop. A typical simulation I run uses about 16 million particles, runs on 8 or 16 parallel linked processors (effectively that many high-end desktops) and takes about a day. The biggest I've run used about half a billion particles, ran over 256 processors and took about 3 days. That's still small fry compared to the really big end of town though, where they run simulations using many billions of particles that can run for weeks or months.

    You could probably do something in real time like this, but the number of particles would be much smaller. Cosmological simulations would probably take too long, but you could do galaxy collisions like this. I remember someone (another N-body guy) telling me at a conference a few months ago that they set up something like this for a uni open day. They managed to hook up a Nintendo Wii so that you could use the 'wand' control to grab a galaxy, throw it at another one and see the result (i.e. a big cosmic train wreck!) in real time. Very cool!

    Remember that you would need to get an actual simulation code to do this. The one I use is GADGET-2, which is available free (google it, the author is Volker Springel). It is written in C.

    This is an important point. All cosmological codes use only Newtonian gravity. The background FRW solution is put in 'by hand', which is the only place that GR comes into it. Some people insist that this is a real problem, and that a full GR solution could be different, possibly so different as to actually 'explain away' the need for dark energy in the model. I don't think they are right, but I don't fully discount the possibility. It does remain an important caveat to be aware of.
     
    Last edited by a moderator: May 4, 2017
  24. Jan 15, 2010 #23

    DevilsAvocado

    Gold Member

    Oh man! Thaaaanks! I send U a 'virtual' Single Malt!
    [Image: a bottle of single malt]
    This explains a lot! And I must laugh at myself! The subroutine PutRedshift is NOT a 'gravitational function', it’s the redshift (Z) counter at the top-left of the animation!! Hahaha LOL!!! :rofl:

    Okay, so this is how it works. You fill the file particles.dat with pre-calculated goodies, and when you run the animation you just pick up the coordinates {x, y, z, extra properties}, right?

    If I understand this correctly, there is not much you can alter once the data file is generated, right? One cool thing you could do is make the rendered particles respond to user input in terms of (Z) time (forward/back/speed) and overall rotation & size, right?

    There’s still one thing that puzzles me (and shows that I don’t understand this 100%). The file particles.dat is 480 024 bytes in size... and if we have 300 000 particles in there, with at least 3 values each {x, y, z}, and each value needs to be at least a (2-byte) integer(?), that’s 2 x 3 x 300 000 = 1 800 000 bytes!? Do they use heavy compression on the data file, or what?

    (Edit: The data type is of course REAL, which is usually 4 bytes long, making things even 'worse': 4 x 3 x 300 000 = 3 600 000 bytes...)
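
    Quick arithmetic on some plausible layouts, assuming 3 coordinates stored as 4-byte REALs per particle (the 24 bytes of overhead is pure speculation on my part about Fortran record markers or a small header):

        # Expected size for n particles, 3 coordinates x 4 bytes each:
        for n in (3_000, 40_000, 300_000):
            print(n, "particles ->", n * 3 * 4, "bytes")
        # 3000 -> 36000; 40000 -> 480000 (+ 24 bytes of overhead = 480024,
        # the observed size); 300000 -> 3600000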

    Another proof that I don’t understand: if we look at the 'life' of one particle from Z=28.62 to Z=0, there are a lot more {x, y, z} positions before it reaches its 'final destination'... where is that data stored...?

    Of course Nobel laureate George Smoot IS right! Shame on me!! "Demo programs use a particle distribution (particles.dat) drawn randomly (1% of all the particles in the simulation is shown)", i.e. 300 000 x 100 = 30 million particles ≈ a 2-day simulation!! Sorry!!

    I have to check out your tips, and come back. Thanks again! Cheers!
     
    Last edited: Jan 15, 2010
  25. Jan 16, 2010 #24

    Wallace

    Science Advisor

    A few points that should clear some of your confusion up.

    Simulation snapshots like this tend to be at a single redshift, rather than containing the whole history of every particle. So you would normally have, say, one file for z=0, another for z=1, etc. Actually, you usually have multiple files for each redshift when doing actual research sims (as opposed to something for a demo), since there are so many particles that the files would be too big. Plus, when you do a parallel simulation, it is much faster for each processor to write its own 'part' of the simulation volume to a file, rather than trying to get all of them to communicate and channel gigabytes of info through a head node into a single file.
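
    A sketch of stitching such per-processor files back together (the 'snapshot_z0.00.<rank>' naming and the raw float32 layout are invented for illustration; every code has its own format):

        import glob
        import numpy as np

        parts = sorted(glob.glob("snapshot_z0.00.*"))   # one file per processor
        xyz = np.concatenate([np.fromfile(p, dtype=np.float32).reshape(-1, 3)
                              for p in parts])          # assumes files exist and hold x,y,z triples
        print(xyz.shape)                                # all particles at z = 0, reassembled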

    As for the file size, I would suspect it is more like 300,000 particles in the sim, of which 1%, so 3,000, are present in the file used for visualisation. That to me would be consistent with the file size you mentioned, and 2 days to run a 300,000 particle sim on a modern desktop doesn't sound too wrong, but it really depends on other factors that aren't mentioned (such as the simulation volume, initial redshift etc.). The smoother the particle distribution, the faster a sim will run, and how inhomogeneous your box is depends on the box size (small boxes will be less smooth than if you are modelling larger scales).
     