Giant ropes of dark matter found in new sky survey

AI Thread Summary
Huge filaments of dark matter have been discovered in a survey of thousands of distant galaxies, reinforcing the theory that dark matter is crucial for galaxy formation and cosmic structure development. This finding addresses previous inconsistencies regarding the universe's dark matter content. The ongoing survey has covered an area of sky approximately 300 times the size of the Full Moon, using data from the MegaCam on the Canada-France-Hawaii Telescope. Led by Liping Fu of the Institute of Astrophysics in Paris, the research highlights the significance of dark matter in understanding the universe. The discussion in this thread also touches on computational simulations and the resemblance of the cosmic web to neural-network-like structures.
Huge filaments of dark matter have been detected in a survey of thousands of distant galaxies. The discovery supports the idea that dark matter drove the formation of galaxies and larger cosmic structures and resolves a discrepancy in previous studies about how much dark matter the universe contains.

The survey, which is still ongoing, has already covered an area of the sky around 300 times the size of the Full Moon. Astronomers led by Liping Fu of the Institute of Astrophysics in Paris, France, have analysed data gathered by the 340-megapixel MegaCam – the largest astronomical camera in the world – attached to the 3.6-metre Canada-France-Hawaii Telescope (CFHT) in Hawaii, US.

http://space.newscientist.com/artic...s-of-dark-matter-found-in-new-sky-survey.html
 
I was over at the Max Planck Institute website reading about this and their calculated luminosity function. Elsewhere, quite a few people have independently made the same speculation about the resemblance of this structure to a neural network.

From a computational simulation point of view, neural-network-like structures are often generated by an underlying cellular automata machine architecture.
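As an illustration of that general idea (a minimal sketch only, with an arbitrary rule, grid size and seed chosen here for brevity, and not taken from any simulation mentioned in this thread): a simple 2-D cellular automaton whose local majority rule organises random noise into connected, clumpy structure after a few updates.

[CODE=python]
# Minimal illustrative sketch: a 2-D cellular automaton with a "majority"
# rule. Starting from random noise, repeated application of the purely
# local rule produces coherent larger-scale clumps.
import numpy as np

def step(grid):
    """One update: a cell becomes occupied when at least 5 of the 9 cells
    in its 3x3 neighbourhood (itself included) are occupied."""
    neighbours = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)   # periodic boundaries
        for dy in (-1, 0, 1)
        for dx in (-1, 0, 1)
    )
    return (neighbours >= 5).astype(np.uint8)

rng = np.random.default_rng(seed=0)
grid = (rng.random((128, 128)) < 0.5).astype(np.uint8)   # 50% random seed density

for _ in range(10):        # a few iterations are enough for clumps to form
    grid = step(grid)

print("occupied fraction after smoothing:", grid.mean())
[/CODE]

Running more iterations mostly just smooths the clump boundaries; the point is only that a purely local update rule, applied everywhere at once, is enough to produce coherent structure on scales much larger than a single cell.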
 
https://en.wikipedia.org/wiki/Recombination_(cosmology)

Was the matter density right after decoupling low enough to treat space as an actual vacuum, and not as a medium through which light propagates at a speed lower than ##(\epsilon_0\mu_0)^{-1/2}##? I'm asking this in the context of calculating the radius of the observable universe, where the time integral of the inverse of the scale factor is multiplied by the constant speed of light ##c##.
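For reference, the standard calculation that question refers to (assuming light travels at the constant vacuum speed ##c## throughout, and normalising the scale factor so that ##a(t_0)=1##) is the comoving distance to the particle horizon,
$$ D_\text{hor} = c \int_0^{t_0} \frac{dt}{a(t)} \approx 46 \text{ billion light-years}, $$
so the question amounts to asking whether the low matter density after decoupling justifies using ##c = (\epsilon_0\mu_0)^{-1/2}## over essentially the whole integration range.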
Why was the Hubble constant assumed to be decreasing, slowing down (decelerating) the expansion rate of the Universe, while at the same time Dark Energy is presumably accelerating the expansion? And to thicken the plot, recent news from NASA indicates that the Hubble constant is now increasing. Can you clarify this enigma? Also, if the Hubble constant eventually decreases, why is there a lower limit to its value?
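For context on the apparent contradiction in that question: with scale factor ##a(t)## the Hubble parameter is ##H = \dot a/a##, so
$$ \dot H = \frac{\ddot a}{a} - H^2 , $$
and accelerating expansion (##\ddot a > 0##) only requires ##\dot H > -H^2##, not a growing ##H##. In the standard ##\Lambda##CDM picture ##H(t)## decreases toward the nonzero floor ##H_\infty = H_0\sqrt{\Omega_\Lambda}##, which is the lower limit the question mentions.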
