Hi, my question stems from Zwiebach's book, "A First Course in String Theory", specifically the chapter on "Special Relativity and Extra Dimensions". He introduces light-cone coordinates as follows:

x+ = (1/sqrt(2)) (x0 + x1)    (1)
x- = (1/sqrt(2)) (x0 - x1)    (2)

Here x^mu = (x0, x1) = (ct, x).

He goes on to say:

"For a beam of light moving in the positive x1 direction, we have x1 = ct = x0, and thus x- = 0. The line x- = 0 is, by definition, the x+ axis. For a beam of light moving in the negative x1 direction, we have x1 = -ct = -x0, and thus x+ = 0. This corresponds to the x- axis. The x± axes are lines at 45° with respect to the x0, x1 axes."

What I don't understand is that his explanation treats equations (1) and (2) as purely scalar equations: e.g. for the x1 = ct = x0 case, we get x- = 0 from eq. (2). But then (1) and (2) cannot by themselves represent a coordinate system, since there is no inherent directionality presented; these are just scalar quantities. So I begin to think that (1) and (2) should instead represent unit-vector quantities (note the 1/sqrt(2) normalization). In that case you cannot follow his argument and get x- = 0 by setting x1 = ct = x0; it would instead result in a non-zero vector. I am confused about which of the two he means here: scalars or vectors?

Any help most appreciated.

V.
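To make the scalar reading concrete, here is a quick numerical sketch (my own, not from the book) that applies (1) and (2) as plain scalar transformations to points on a right-moving light ray, where x1 = x0:

```python
import math

def lightcone(x0, x1):
    """Light-cone coordinates per eqs. (1) and (2), read as scalar equations."""
    xp = (x0 + x1) / math.sqrt(2)  # x+
    xm = (x0 - x1) / math.sqrt(2)  # x-
    return xp, xm

# Points on a right-moving light ray satisfy x1 = x0 (= ct)
for x0 in [0.0, 1.0, 2.0]:
    xp, xm = lightcone(x0, x0)
    print(xp, xm)  # x- comes out 0 for every point on the ray
```

On this reading, x- = 0 holds identically along the ray, which is what lets Zwiebach call that whole line the x+ axis.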