Hello! I have a small question. My textbook, and everywhere I look online, says that to find the redshift of a receding galaxy from a spectrograph you divide the change in wavelength by the original (rest) wavelength you are measuring from.

The problem I have is this: if there is, say, a 20 nm shift across the whole spectrum, then you get different results depending on which peaks you take the measurement from. For instance, if a line at 420 nm shifts to 440 nm, the redshift would be 0.048 (3 d.p.), but if you used another line at 620 nm shifting to 640 nm, you would get a redshift of 0.032 (3 d.p.). Completely different figures, which would alter any later calculations such as recession speed, for what should be the same redshift.

I'm sure that, as usual, I'm missing something obvious, but after much searching I still can't work out what it is. Can anyone help please? Thanks!
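For concreteness, here is a quick sketch of the two calculations from the question, using the standard definition z = Δλ / λ_rest (the wavelength values are the hypothetical ones above, not real spectral lines):

```python
def redshift(rest_nm, observed_nm):
    """Redshift z = (observed - rest) / rest, wavelengths in nm."""
    return (observed_nm - rest_nm) / rest_nm

# Same additive 20 nm shift applied to two different peaks
z1 = redshift(420, 440)
z2 = redshift(620, 640)

print(round(z1, 3), round(z2, 3))  # prints: 0.048 0.032
```

This reproduces the discrepancy described: a constant additive shift gives a different z at each wavelength, since z is a fractional (multiplicative) shift, not an absolute one.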