# Why not fewer visible stars in far away galaxies?

1. Jun 18, 2008

### helix_angle

Hello,
I have a question concerning general relativity. (go easy - I have only knowledge from general consumption physics books). Something curious I have been thinking about requires an explanation as to where I am incorrect in my thinking:

My understanding is that, as something moves relative to an observer, its speed relative to that observer contributes to its energy, and therefore to its mass. Since mass curves space-time, the amount of curvature an object causes (i.e. the gravitational pull due to its mass) must also be relative to the observer. This seems to me absolutely bizarre - but pressing on:

Assuming the above is correct: at some point, as an object approaches the speed of light, its mass relative to an observer must reach the point where photons reflected from its surface can no longer escape, causing it to appear to that observer as a black hole (invisible), whereas to another observer moving with the object it would appear perfectly normal.

Would it not also be true, then, that what we indirectly detect as black holes in far-away galaxies rushing away from us (the further, the faster, with accelerating space-time expansion) may actually not be black holes at all, but normal (if large) stars to an observer in the same galaxy? The number of "visible" stars in galaxies far enough away should therefore appear fewer and fewer, but I haven't read anywhere that this is the case. Why?

Thanks,
Helix_Angle

2. Jun 18, 2008

### Staff: Mentor

No - when objects recede from us, they emit the same number of photons; the only change is in their frequency (they are red-shifted). Due to the inverse-square law we see more distant objects as dimmer, but only as a function of distance, not as a function of recession speed.
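The distinction above can be made concrete with a toy calculation. In this minimal Python sketch (the luminosity, emitted frequency, and the distance/redshift pairs are all made-up illustrative numbers, not real astronomical data), the received flux follows the inverse-square law in distance alone, while the recession only shifts the observed frequency:

```python
import math

L_emit = 3.8e26   # emitted luminosity in watts (illustrative, roughly solar)
f_emit = 5.5e14   # emitted frequency in Hz (illustrative, visible light)

# (distance in meters, redshift z) - hypothetical pairs for the example
sources = [(1.0e22, 0.001), (1.0e24, 0.1), (1.0e26, 5.0)]

for d, z in sources:
    flux = L_emit / (4 * math.pi * d**2)   # inverse-square dimming: depends on distance only
    f_obs = f_emit / (1 + z)               # cosmological redshift: depends on recession only
    print(f"d={d:.1e} m  z={z}  flux={flux:.3e} W/m^2  f_obs={f_obs:.3e} Hz")
```

Note that `z` never appears in the flux line and `d` never appears in the frequency line: a fast-receding star gets redder, not fainter (beyond its distance), and it never "switches off".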

3. Jun 18, 2008

### helix_angle

You didn't read my question properly. I was asking whether general relativity implies that, as an object approaches the speed of light, its mass relative to an observer must reach the point where photons reflected from its surface can no longer escape, causing it to appear to the observer to be a black hole.
If this is not true, can someone explain why?

Thanks,
Helix_angle

4. Jun 18, 2008

### Mentz114

Helix_angle,
if this were to happen it would mean that two observers could disagree about what they are seeing: one saying black hole, the other saying it's a normal mass. The apparent increase in mass due to relative velocity does not increase the perceived gravitational pull. Ignoring black holes, it would mean that one observer might see tall buildings collapse under their own weight while another sees them still standing.

I can't prove it but I would say nature rules out such chaos.

M

5. Jun 18, 2008

### Staff: Mentor

Ok, I think I mixed up SR and GR -- because your question seems to. Recession velocity due to the expansion of the universe (a consequence of GR) has no associated special relativistic effects like mass increase. It isn't a "real" speed - it just looks like objects are moving because the space between them is increasing.

Now on the related question of whether you could give an object enough energy to turn it into a black hole, I'm not so sure -- someone with a deeper understanding of SR and GR might be able to take that.

Last edited: Jun 19, 2008
6. Jun 18, 2008

### helix_angle

Ah - of course. The recession due to space-time expansion is not a relativistic speed as I was imagining it. That makes sense.
I'd be interested if someone has an answer to the black hole question, though. It was mentioned that they may be creating 'mini black holes' in the soon-to-be-online LHC. I wonder if these are caused simply by the enormous velocity of the accelerated particles, as I described above, or by the energy involved in the collision?

Thanks for your help,
Helix_Angle

7. Jun 19, 2008

### yuiop

Imagine a planet with about the same size and mass as the Earth whizzing past you at a relativistic velocity such that clocks on that planet are slowed down by a factor of 2 by time dilation. On Earth an object dropped from a height of about 10 meters takes about one second to hit the ground. If gravity were unaffected by the motion, a person standing on that planet dropping a similar object from the same height would measure the fall as taking half a second by his clock (length contraction is not involved, since the drop is orthogonal to the motion). By his calculation the acceleration due to gravity would then be 4 times that on Earth. But SR tells us the laws of physics are the same in all inertial reference frames, so from our point of view the object must actually fall slower. That tells you that the gravitational effect is reduced, not increased, by relative motion.

The inertial mass of the object increases, but not its gravitational mass. SR tells us that force perpendicular to the motion is reduced by the factor gamma, and this applies to gravity too. So the gravitational force acting on the object is halved while the inertial mass of the object is doubled. Since F = ma, we get a' = F'/m' = (F/2)/(2m) = a/4, which agrees with the value I gave earlier. Hopefully this argument will convince you that while the inertial mass can be viewed as increasing with relative motion, the gravitational mass is not - in fact it is reducing.
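The scaling in that argument can be checked numerically. A minimal Python sketch, using the illustrative values from the post (gamma = 2, and 9.8 m/s² as the rest-frame surface gravity) and assuming the SR transverse-force transformation it describes:

```python
# Transverse acceleration of a dropped object, as seen from the frame
# in which the planet moves with Lorentz factor gamma.
# Assumption (from the post): transverse force is reduced by a factor
# gamma, while the inertial (relativistic) mass grows by gamma.

def transverse_acceleration(a_rest, gamma):
    """Acceleration measured in the frame where the planet moves.

    a_rest: acceleration in the planet's rest frame (m/s^2)
    gamma:  Lorentz factor of the relative motion
    """
    force_scale = 1.0 / gamma   # transverse force reduced by gamma
    mass_scale = gamma          # inertial mass increased by gamma
    return a_rest * force_scale / mass_scale  # net effect: a_rest / gamma**2

g = 9.8       # rest-frame surface gravity, m/s^2 (illustrative)
gamma = 2.0   # time-dilation factor used in the post

print(transverse_acceleration(g, gamma))  # 2.45, i.e. g/4
```

With gamma = 2 the result is a quarter of the rest-frame value, matching the (F/2)/(2m) = a/4 arithmetic: higher relative speed makes the apparent pull weaker, the opposite of what would be needed to turn a passing star into a black hole.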

So the answer to your last question about creating mini black holes in a collider is that they would be created by the energy of the collisions rather than by the apparent increase in relativistic mass.

Last edited: Jun 19, 2008
8. Jun 19, 2008

### D H

Staff Emeritus
9. Jun 19, 2008

### helix_angle

Thanks Kev - that explains things very well.
Cheers,
Helix_Angle