A meter stick moves with velocity 0.60c relative to an observer. The observer measures the length of the meter stick to be L. The problem states that 0.80 m < L < 1.0 m must always be true. So far, I have determined that gamma = 1/sqrt(1 - 0.60^2 c^2/c^2) = 1/sqrt(0.64) = 1.25, so the contracted length should be L = (1 m)/gamma = 0.80 m. What I don't understand is why there is a range. If the meter stick is 1 m when it is not moving and 0.80 m when it is moving at 0.60c, why is there a range for the length?
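For reference, here is a quick numerical check of the length-contraction arithmetic, written as a small Python sketch (working in units where c = 1; variable names are my own):

```python
import math

v = 0.60   # stick's speed as a fraction of c
L0 = 1.0   # proper (rest) length in meters

# Lorentz factor: gamma = 1 / sqrt(1 - v^2/c^2)
gamma = 1.0 / math.sqrt(1.0 - v**2)

# Length measured by the observer, for a stick aligned with the motion
L = L0 / gamma

print(f"gamma = {gamma:.4f}")  # 1.2500
print(f"L     = {L:.4f} m")    # 0.8000 m
```

This confirms that gamma is 1.25 (not 0.8) and that 0.80 m is the length the observer measures when the stick lies along the direction of motion.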