1. The problem statement, all variables and given/known data

Light with a frequency of 6.1×10^14 Hz is measured from a star known to produce light with a frequency of 6.0×10^14 Hz. How fast is the star moving toward or away from Earth?

2. Relevant equations

f = f0(v + v_o)/(v − v_s)

f = observed frequency
f0 = original frequency
v = wave speed
v_o = velocity of observer
v_s = velocity of source

3. The attempt at a solution

6.1×10^14 = 6.0×10^14 (3.0×10^8 + 0)/(3.0×10^8 + x)
→ (3.0×10^8 + x) = 6.0×10^14 (3.0×10^8)/(6.1×10^14)
→ x = (6.0/6.1)(3.0×10^8) − 3.0×10^8
→ x ≈ −4,918,032.787 m/s

Moving away at 4,918,032.787 m/s.

I feel that I'm doing it wrong, since shouldn't the star be moving toward Earth if the observed frequency is higher?
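As a quick numeric check of the attempt above (a sketch, assuming the same sound-style Doppler formula with a stationary observer, v_o = 0, and x standing in for the unknown in the denominator as written):

```python
# Doppler check: solve f = f0 * (v + 0) / (v + x) for x,
# matching the substitution made in the attempt above.
f_obs = 6.1e14   # observed frequency, Hz
f_src = 6.0e14   # emitted frequency, Hz
v = 3.0e8        # wave speed (speed of light), m/s

x = f_src * v / f_obs - v
print(x)  # ≈ -4.918e6 m/s
```

The magnitude agrees with the hand calculation; note that a negative x makes the denominator (v + x) smaller than v, which is what produces an observed frequency higher than the emitted one.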