1. The problem statement, all variables and given/known data

How fast (as a percentage of light speed) would a star have to be moving so that the frequency of the light we receive from it is 11.0% higher than the frequency of the light it is emitting?

2. Relevant equations

f_L = f_s * (v + v_L) / (v + v_s)

3. The attempt at a solution

Okay, so if the frequency of the light we receive is 11% higher than the frequency emitted, that means f_L / f_s = 1.11, right? I know that for the emitted frequency of the star (the source in this case) to be lower than the frequency we receive, the star has to be moving toward us, so the sign on v_s is negative. The Earth's speed is really, really small compared to the speed of the star, so the listener's speed v_L is effectively 0 in this case, right? That left me with:

1.11 = v / (v - v_s)

I used the speed of light for v (c = 3.00*10^8 m/s). Solving for v_s gives v_s = v * (1 - 1/1.11), which came out to v_s = 0.099c. (There's a quick numerical check of this rearrangement at the end of the post.)

The thing is, the homework system tells me that my answer is close, but not close enough to count as correct. So my thoughts were:

a) Do we have to take the Earth's speed into account (and if so, how)?
b) Is there some facet that I'm missing (like some sort of reflection of the wave that I have to take into account)?

Thank you!
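In case it helps anyone spot where I went wrong, here is a minimal sketch (Python; the variable names are my own, not from the textbook) that just redoes the arithmetic of the rearrangement above:

```python
# Sanity check of 1.11 = v / (v - v_s), with v_L = 0 and v = c.
# This only reproduces my classical-formula arithmetic; it doesn't
# assume anything beyond the rearrangement shown in the post.

c = 3.00e8            # speed of light, m/s
ratio = 1.11          # f_L / f_s: received frequency is 11.0% higher

# 1.11 = v / (v - v_s)  =>  v - v_s = v / ratio  =>  v_s = v * (1 - 1/ratio)
v_s = c * (1 - 1 / ratio)

print(f"v_s = {v_s:.3e} m/s = {v_s / c:.4f} c")   # ~0.0991 c, i.e. my 0.099c
```

Running it gives v_s ≈ 2.97*10^7 m/s, about 9.9% of c, so at least the algebra matches what I got by hand.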