Something has me puzzled about the theory of relativity.

At time 0, a photon is emitted from the origin of a rectangular coordinate system. At time t, the photon is at position x on the positive x axis. Therefore, in an amount of time t - 0 = t, the photon has traveled a distance x. The speed of the photon in this frame is, by definition, x/t. Let c denote the speed of the photon in this frame, so that

c = x/t

Now consider things from the photon's point of view. Let time in this system be represented by t', and let the origins of both frames coincide when t' = 0. At t' = 0, let the origin of the other frame begin to move along this system's positive x' axis. The speed of the origin of the other system in this frame is therefore x'/t'. And speed is relative, hence

x/t = x'/t'

Suppose that the Lorentz transformation is correct. Then x' = x(1 - v^2/c^2)^(1/2). Substituting this into the equation above leads to the following result:

1/t = (1 - v^2/c^2)^(1/2) / t'

But since v = c, it follows that (1 - v^2/c^2)^(1/2) = 0, from which it follows that 1/t = 0, and hence (multiplying both sides by t) that 1 = 0, which is false. Thus, the Lorentz transformations are invalid.

Where is my mistake?
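The numerical behavior of the factor (1 - v^2/c^2)^(1/2) that the argument turns on can be checked directly. A minimal sketch in Python (the function name `contraction_factor` is mine, not standard):

```python
import math

def contraction_factor(v, c=1.0):
    """Return sqrt(1 - v^2/c^2), the factor appearing in the argument above."""
    return math.sqrt(1.0 - (v * v) / (c * c))

# The factor shrinks toward 0 as v approaches c, and equals 0 exactly at v = c:
for v in (0.5, 0.9, 0.99, 1.0):
    print(v, contraction_factor(v))
```

This only confirms the arithmetic step that the factor vanishes at v = c; it says nothing about whether applying the formula to a photon's frame is legitimate, which is the point in question.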