- #1
travis51
So I am trying to understand some of the ideas of relativity, but there is one thing I don't get. I understand that if one object were sent at .99c, and from that object another were launched at .99c, the second object would appear to move at .99c relative to the first object, but for an outside observer it would appear above .99c yet still below the speed of light, because time is slowed down for the first object.

The thing I do not understand is why, if two objects were moving in opposite directions near the speed of light, an observer on one of the objects would see the other going slower than the speed of light. Shouldn't it appear much faster, because time is slowed on the fast object? Or does it have to do with the time dilation factor on the object? Also, if objects A and B were heading toward each other at .99c, why would an observer on one see the other moving below the speed of light? I just want an explanation; no math or questioning the speeds of the objects, please, because that is the easy way out.
I've heard that time would be slowed so much on the two intersecting objects that it would be impossible to measure; does this make sense? It sounds wrong to me. What if the same scenario happened with two objects at speeds just above .5c?
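Setting the no-math request aside for a moment, the scenario in the question can be sanity-checked numerically with the standard relativistic velocity-addition formula, w = (u + v) / (1 + uv/c²). This is only a sketch (the function name is mine); it shows why an observer on either object always measures the other below c:

```python
def add_velocities(u, v):
    """Combine two speeds, given as fractions of c, relativistically.

    Classically the closing speed would be u + v; the denominator
    (1 + u*v) is the relativistic correction that keeps the result
    below 1 (i.e. below c) for any u, v < 1.
    """
    return (u + v) / (1 + u * v)

# Two objects approaching each other, each at 0.99c in the ground frame:
print(add_velocities(0.99, 0.99))  # ~0.99995 -- just under c, not 1.98c

# The "just above .5c" version of the same scenario:
print(add_velocities(0.51, 0.51))  # ~0.81 -- noticeably less than 1.02c
```

The key point the numbers illustrate: the correction is tiny at everyday speeds (the denominator is nearly 1) and only becomes large as the speeds approach c, which is why the classical "just add them" intuition fails here.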