# LIGO discovery of gravitational waves

1. Feb 12, 2016

### Saanchi

Could someone please describe the recent LIGO project, gravitational waves and their detection in simple terms. PLEASE.

2. Feb 12, 2016

### e.bar.goum

There are quite literally hundreds of news articles aiming to explain the detection to people with no knowledge of the subject. Here are a few randomly selected ones. Searching "Gravitational Waves" will bring up many more:

http://www.abc.net.au/news/2016-02-12/first-direct-evidence-of-gravitational-waves-detected/7140750
http://www.bbc.com/news/science-environment-35524440
http://www.nytimes.com/2016/02/12/science/ligo-gravitational-waves-black-holes-einstein.html

You should read a few and come back with some more focused questions.

3. Feb 12, 2016

### fizzy

http://news.cnrs.fr/articles/gravitational-waves-detected
> Benoît Mours, scientific director at the LAPP and principal investigator of the Virgo project in France: “According to our verifications, random noise in the form of GW150914 is so unlikely that it would only happen once every 200,000 years.”

Those kinds of odds don't really mean anything in isolation.

How many such annihilation events were happening 1.5 bn years ago? What are the odds of us looking at just the right time to witness it as the wave shot through the Earth?

To look at the odds the other way around: the chance that we would start looking just as such an event flew past us is so small that it must be noise.

4. Feb 12, 2016

### Staff: Mentor

That sounds like the lottery fallacy: you should not ask what is the probability that someone in particular won the lottery, but what is the probability that anyone won the lottery. You have to compare the probability given with the probability of detecting any kind of gravitational wave event.

5. Feb 12, 2016

### Orodruin

Staff Emeritus
But this is the beauty. We do not necessarily know the rate of these events, but as we have discovered one of them we can use it to estimate the rate. Since the statistics are low it will be a rather bad estimate, but it will exclude a zero rate at high confidence.

6. Feb 12, 2016

### fizzy

Circular logic. We know we've detected one since the chances of this being random are very small ... therefore we can calculate how many there are ... therefore we can work out what the chances of it happening by chance are ...

7. Feb 12, 2016

### fizzy

The lotto fallacy is that everyone has a gravity wave detector and someone detects a gravity wave every Saturday night when the lucky draw is held !

8. Feb 12, 2016

### Orodruin

Staff Emeritus
No it is not. We know that there is a rate, just not what it is. We can use the experiment to constrain that rate. This is how science works.

What you do not seem to understand is that it is possible to compute the rate of such an event occurring by chance without knowing the actual signal rate - all you need to know is the noise rate. The 200,000 years refers to the time you would have to run the experiment before you would expect a random fluctuation such as this one, even with no signal rate at all. This is why we can now rule out a zero signal rate.
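The false-alarm argument can be sketched numerically. A minimal Poisson calculation, assuming the quoted background rate of one such noise event per 200,000 years and roughly 16 days of coincident data (illustrative figures, not the official LIGO analysis):

```python
import math

# Assumed inputs (illustrative, not LIGO's actual numbers):
# one noise event this loud per 200,000 years, ~16 days of data.
background_rate = 1.0 / 200_000   # false alarms per year
obs_time = 16 / 365.25            # observation time in years

# Poisson probability of at least one background event in obs_time,
# under the hypothesis that there is no signal rate at all.
expected_background = background_rate * obs_time
p_noise_only = 1.0 - math.exp(-expected_background)

print(f"Expected background events: {expected_background:.2e}")
print(f"P(>=1 noise event | no signal): {p_noise_only:.2e}")
```

The key point matches Orodruin's: this number needs only the noise rate, so seeing an event when the noise-only probability is this tiny is what lets you reject a zero signal rate.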

9. Feb 12, 2016

### amjc

The first scientist who predicted gravitational waves was not Albert Einstein (in 1916), as most journalists have reported. The French mathematician Jules-Henri Poincaré mentioned these waves in June 1905, for the first time, and he called them “ondes gravifiques”. See page 1507 of the paper titled “Sur la dynamique de l’électron”, in http://web.archive.org/web/20050127...hysis/HistoricPaper/Poincare/Poincare1905.pdf

Poincaré had created the special theory of relativity before Einstein, during the period 1895-1905, and he also investigated some other concepts that can be found in general relativity. This theory was generally known as the “Relativity of Poincaré and Lorentz” in those years. See http://www.brera.unimi.it/sisfa/atti/1998/Giannetto.pdf

10. Feb 12, 2016

### Staff: Mentor

Or at least we can be highly confident.

@fizzy: if you don't like the frequentist approach, you can also take the Bayesian approach. "There are no binary black hole mergers" is still possible (and we can never rule it out with absolute certainty...), but compared to "on average there is one binary black hole merger in the detectable range per year" it got less likely by a factor of 200,000, and compared to "on average there are 20 mergers per year" the factor is about 4 million. If LIGO finds another event over the course of a year, the factor of 200,000 will increase to roughly 20 billion.
You would need massive prior evidence that mergers should be extremely rare to compete against those numbers.

No circular logic involved. The estimate of the merger rate does not tell you the probability that you actually observe mergers, but the probability that noise fakes one signal in such a short time is tiny, and it will get even smaller with more observations.
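The factors above can be roughly checked with a Poisson likelihood ratio. This is a sketch with assumed inputs (16 days of data, one background event per 200,000 years, exactly one observed event); it reproduces the quoted factors to within an order of magnitude, with the exact values depending on the observation time one plugs in:

```python
import math

T = 16 / 365.25     # assumed observation time, in years
b = 1.0 / 200_000   # assumed background rate, events per year

def likelihood(rate, k=1):
    """Poisson probability of observing k events given a true
    signal rate plus the background rate."""
    mu = (rate + b) * T
    return mu**k * math.exp(-mu) / math.factorial(k)

# How much more likely is one observed event under a nonzero merger
# rate than under "there are no mergers at all"?
for signal_rate in (1.0, 20.0):
    ratio = likelihood(signal_rate) / likelihood(0.0)
    print(f"rate {signal_rate:>4} /yr vs no mergers: factor ~{ratio:,.0f}")
```

Because the background expectation is so tiny, the ratio is enormous for any plausible merger rate, which is the point: no prior knowledge of the signal rate is needed to see that "pure noise" loses badly.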

11. Feb 12, 2016

### fizzy

Life on Earth has taken about 1.5 bn years to get to the point of being able to detect these events, and in the first couple of months of running the thing we find one. So either the universe is full of black holes continually annihilating each other and this is to be expected, or the odds of it happening are about as improbable as the small chance this was noise.

So we have one very unlikely occurrence to weigh against another very unlikely occurrence. I don't see that this proves either to be the more probable explanation without more information.

12. Feb 12, 2016

### Orodruin

Staff Emeritus
I would say this is how we use "rule out" in colloquial speech.

13. Feb 12, 2016

### Orodruin

Staff Emeritus
Which also rules out the possibility that the universe has a low rate of black hole mergers in favour of a higher rate. There is no inconsistency here.

14. Feb 12, 2016

### fizzy

That would seem like a more logical conclusion. Perhaps 95% of the mass of the universe is black holes popping each other off all the time.

The short observation time has to affect both probabilities. If it makes it very unlikely that it's noise, it also has to make it very unlikely we were looking at just the right time to see a rare event. So either it is a statistical fluke (either one) or these events are happening all the time.

My initial point is that just stating one of the improbable events out of context of the other gives a false impression.

That leaves:
1. It was a very unlikely noise event
2. It was a very unlikely time to reach scientific capability and WOW they got lucky straight away
3. The universe has a very high rate of such events.

If it's the latter we should know pretty soon.

Last edited: Feb 12, 2016
15. Feb 12, 2016

### Orodruin

Staff Emeritus
There is only one probability at work here. The noise rate is well known as it can be computed from what we know about the detector systems and therefore we can make deductions about the signal rate, which we did not know previously.

Nobody has ever claimed the signal to be improbable; it is only improbable if the signal rate is low. The only claim is that a background event is highly improbable - once every 200,000 years. Since you have an event, this lets you constrain the signal rate, as the total rate is signal+background. Nobody has stated anything different - I do not think there are any false impressions given unless you misinterpret the quote.

16. Feb 12, 2016

### fizzy

OK, so either they got very lucky or the event rate is high, ie we are in agreement with 1,2,3 above.

17. Feb 12, 2016

### Staff: Mentor

Well, a rate of a few mergers per year is not much, if you consider that of the order of 100 supernovae per second (!) happen in the observable universe.
No, certainly not. Where does that number of 95% come from? Please don't make up numbers.

One event in 16 days makes an event rate of at least 1 per year very probable - the most likely rate (frequentist approach, using only the direct observation) would be about 22 per year. We can rule out rates of several hundred per year (because LIGO would probably have seen more events), and rates of less than one per decade (because then the observation of an event in 16 days would be very unlikely). Anything in between is up to the confidence level you want to consider.
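The frequentist estimate above can be reproduced with the same Poisson model, again assuming one event in 16 days of data (the rate thresholds of 0.1/yr and 300/yr below are illustrative choices):

```python
import math

T = 16 / 365.25   # assumed observation time, in years

# Maximum-likelihood rate for one observed event: one event per T.
mle_rate = 1 / T
print(f"Most likely rate: {mle_rate:.1f} per year")

def p_at_least_one(rate):
    """Poisson probability of seeing one or more events at the given rate."""
    return 1.0 - math.exp(-rate * T)

def p_at_most_one(rate):
    """Poisson probability of seeing no more than one event at the given rate."""
    mu = rate * T
    return math.exp(-mu) * (1.0 + mu)

# Very low rates make one detection in 16 days unlikely...
print(f"P(>=1 event) at 0.1/yr: {p_at_least_one(0.1):.4f}")
# ...and very high rates make seeing only one event unlikely.
print(f"P(<=1 event) at 300/yr: {p_at_most_one(300):.2e}")
```

The MLE comes out near the 22 per year quoted, and both extremes are suppressed, which is the "anything in between, up to your confidence level" statement in numbers.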

18. Feb 12, 2016

### fizzy

Thanks. The 95% was a tongue-in-cheek reference to the amount of missing mass, not a serious number.

The 22 per year to one per decade range you suggest seems reasonably argued, thanks.

19. Feb 17, 2016

### the original Bulk

G-waves are not part of the EM spectrum. BHs, by definition, do not allow ANY electromagnetic waves to exit the black hole. The only way to detect BHs is through their effects on other bodies or through the stretching and squeezing of spacetime itself. Gravity and G-waves are a spacetime phenomenon, and as such will reveal a tremendous amount of information about, for example, the first moments of the big bang, before atomic particles and forces differentiated into the ones we know today. We may be able to push back our knowledge of the early universe to 1x10^-43 second, the instant the "universal atom" and spacetime itself began their expansion.

Last edited: Feb 17, 2016