Consider this, and tell me where my thinking goes wrong, because I'm confused. There is a person observing a spaceship launch from Earth, and another person on the spaceship. There are asteroids and space rocks, stationary with respect to Earth, for the spaceship to observe as it flies.

The person on Earth watches the spaceship quickly accelerate to 0.95 times the speed of light (0.95c) and fly to Alpha Centauri, which we'll say is 4 light-years away. In the Earth frame it takes 4 light-years / 0.95c ≈ 4.2 years for the spaceship to get there. However, on the spaceship going that fraction of the speed of light, time goes slower (the time-dilation factor is γ ≈ 3.2), so only about 4.2 / 3.2 ≈ 1.3 years pass in the traveler's reference frame.

So, does that mean he'll feel like he's going 4 light-years / 1.3 years ≈ 3 times the speed of light? For example, will he see adjacent objects like the asteroids and space rocks fly past him at 3 times the speed of light? He knows that Alpha Centauri is 4 light-years away, but he gets there in 1.3 years. Surely I'd be confused if I calculated that I went faster than the speed of light... Where is my thought process going wrong? Or is this what would actually happen to someone travelling that quickly?
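For reference, here is the arithmetic I'm doing, as a quick sanity check (a minimal sketch in units where c = 1, so distances are in light-years, times in years, and speeds in fractions of c; γ is the standard Lorentz factor):

```python
import math

v = 0.95                             # ship speed as a fraction of c
d = 4.0                              # Earth-frame distance to Alpha Centauri, light-years

gamma = 1.0 / math.sqrt(1.0 - v**2)  # Lorentz factor at 0.95c, ~3.2
t_earth = d / v                      # travel time in Earth's frame, ~4.21 years
t_ship = t_earth / gamma             # elapsed time on the ship's clock, ~1.31 years
apparent_speed = d / t_ship          # Earth-frame distance / ship clock, ~3.04 "c"

print(f"gamma = {gamma:.2f}")
print(f"Earth time = {t_earth:.2f} years")
print(f"Ship time = {t_ship:.2f} years")
print(f"Apparent speed = {apparent_speed:.2f} c")
```

That last number, dividing the Earth-frame distance by the ship-frame time, is the faster-than-light figure that confuses me.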