Finding a vector orthogonal to others

  1. I'm sure there's a simple way of doing this but I just can't think of it.

    In R^n, given k<n linearly independent vectors, how do you find a vector that is orthogonal to all given vectors?

    I know the cross product works in R^3, but what about R^n?
     
  2. Hurkyl

    Staff Emeritus
    Science Advisor
    Gold Member

    I assume you're taking a linear algebra course. Why not apply the techniques you've learned to solve the equation? Or... is it that you haven't even tried to convert the problem into an algebraic equation?
     
  3. Actually I have tried that.

    Suppose we're working in R^4. And we're given two vectors (1,0,1,0) and (0,-2,-1,1). Now I want to find (a,b,c,d) that is perpendicular to both.

    So the equations I have are
    <(1,0,1,0)|(a,b,c,d)>=0
    <(0,-2,-1,1)|(a,b,c,d)>=0

    The inner product here is the dot product.

    This gives me 2 linear equations, but I need to solve for 4 variables, which isn't enough. Any help in finding another 2 equations so the system can be solved?
     
  4. Hurkyl

    Staff Emeritus
    Science Advisor
    Gold Member

    What do you mean? You know full well how to find the complete solution to any system of linear equations, no matter how many variables and equations there happen to be.

    If, for whatever reason, you really and truly must have a square coefficient matrix, then you can always add the equations 0 = 0 and 0 = 0. :tongue: (Or replicate your other equations)
     
    Last edited: Nov 1, 2008
  5. HallsofIvy

    Staff Emeritus
    Science Advisor

    Yes, if you are working in 4 dimensional space, the set of all vectors orthogonal to 2 given vectors is a 2 dimensional subspace, just as in 3 dimensions, the set of all vectors orthogonal to a single given vector is a 2 dimensional subspace. In n dimensions you have to give n-1 vectors before you can find a single vector such that all orthogonal vectors are multiples of it (a one-dimensional subspace).

    In this example, you have equations a+ c= 0 and -2b- c+ d= 0. We can solve two (independent) equations in four variables for 2 of the variables in terms of the other two. Here, from the first equation, c= -a. Putting that into the second equation -2b+ a+ d= 0 so d= -a+ 2b. Choose whatever numbers you like for a and b and you will get a vector orthogonal to the original two. In particular, if you take a=1, b= 0, you get c= -1, d= -1: the vector (1, 0, -1, -1) is orthogonal to the two given vectors. If, instead, you take a= 0, b= 1, you get c= 0 and d= 2: the vector (0, 1, 0, 2) is also orthogonal to the two given vectors. In fact, the set of all vectors orthogonal to the two given vectors is the subspace spanned by {(1,0,-1,-1), (0,1,0,2)}.
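    As a quick numerical cross-check (a sketch added in editing, not part of the original post): the same orthogonal complement can be read off with numpy's SVD, since the rows of Vt beyond rank(A) span the null space of A.

    [code]
    import numpy as np

    # The two given vectors in R^4, stacked as the rows of A.
    A = np.array([[1,  0,  1, 0],
                  [0, -2, -1, 1]], dtype=float)

    # Vectors orthogonal to both rows are exactly the solutions of A x = 0,
    # i.e. the null space of A.  In the SVD A = U S Vt, the rows of Vt
    # beyond rank(A) form an orthonormal basis of that null space.
    _, _, Vt = np.linalg.svd(A)
    rank = np.linalg.matrix_rank(A)
    null_basis = Vt[rank:]          # 2 x 4 basis of the orthogonal complement

    print(np.allclose(A @ null_basis.T, 0))     # True

    # The two hand-computed solutions above lie in the same space.
    for x in ([1, 0, -1, -1], [0, 1, 0, 2]):
        print(np.allclose(A @ np.array(x), 0))  # True, True
    [/code]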
     
  6. I have a way to get an orthogonal vector for any two vectors in any dimension larger than or equal to 3. It actually does not involve finding the entire space of possible orthogonal vectors.

    It must be that this method (nD cross product) gives you a special member of the possible orthogonal vectors, because the product satisfies the requirements for a Lie algebra and most of the other properties of the 3D cross product.

    I will post it somewhere (please suggest where to).:bugeye:
     
  7. HallsofIvy

    Staff Emeritus
    Science Advisor

    Do you really want us to tell you where to put it?:rofl:

    Is this the usual product of n vectors with the alternating tensor?
     
  8. Yes, because I cannot retype it in TeX on the forum while online. Alternatively, please advise where I can get a TeX editor.

    No, it is not a product of n vectors, just of 2, in any dimension >= 3.

    I'm not sure about the alternating tensor definition. Please direct me to the definition. I read Algebraic Geometry and it fits in with the determinant expansion of the cross product.
     
  9. HallsofIvy

    Staff Emeritus
    Science Advisor

    The alternating tensor (which is not, I think, a true tensor) is [itex]\epsilon_{i_1 i_2 i_3 \cdots i_n}[/itex], equal to 1 if [itex]i_1 i_2 i_3 \cdots i_n[/itex] is an even permutation of [itex]1 2 3 \cdots n[/itex], -1 if an odd permutation, and 0 in all other cases (basically, if any index appears twice). In three dimensions the cross product can be written as
    [tex](u\times v)_k= \sum_{i=1}^n\sum_{j=1}^n\epsilon_{ijk}u_iv_j[/tex]
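    (A small Python/numpy sketch of this formula, added in editing; the function names are mine. It builds the alternating symbol explicitly and contracts it with u and v, reproducing the ordinary 3D cross product.)

    [code]
    import numpy as np
    from itertools import permutations

    def levi_civita(n):
        """Dense n-index alternating symbol eps[i1, ..., in]."""
        eps = np.zeros((n,) * n)
        for perm in permutations(range(n)):
            # Sign from the parity of the permutation (inversion count).
            inversions = sum(p > q for k, p in enumerate(perm) for q in perm[k + 1:])
            eps[perm] = 1 if inversions % 2 == 0 else -1
        return eps

    def cross_via_epsilon(u, v):
        """(u x v)_k = sum_{i,j} eps_{ijk} u_i v_j, in three dimensions."""
        return np.einsum('ijk,i,j->k', levi_civita(3), u, v)

    u = np.array([1.0, 2.0, 3.0])
    v = np.array([4.0, 5.0, 6.0])
    print(cross_via_epsilon(u, v))   # [-3.  6. -3.]
    print(np.cross(u, v))            # identical
    [/code]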
     
    Last edited: Nov 7, 2008
  10. I don't understand the [itex]\sum_j[/itex]. What is the [itex]n[/itex] in [itex]\sum_{i=1}^{n}[/itex]?

    That definition cannot be extended directly to 4D and larger because it will look like:

    [tex](u \times v)_{k} = (-1)^{k} \epsilon_{ijmk} u_{i} v_{j}[/tex]

    in 4D. In which case you don't know what to do with the m index. I'm not sure about the [itex](-1)^{k}[/itex] now because the tensor goes to (-1) or 1.

    My definition in the 4D case fits in with this. You make the tables of even permutations of three indexes and select adjacent pairs. For k = 1 this gives:

    23 | 42
    __| 34

    which are the pairs that go into i, j (in this order) in [itex]u_{i} v_{j}[/itex] and, in reverse order, into [itex]- u_{i} v_{j}[/itex]. Then you get a sum of 6 terms for k = 1.

    For k = 2 you need odd permutations, which are even permutations followed by one transposition. Thus for k = 2 you need transpositions of:

    13 | 41
    __| 34

    Note this is the previous triangle with 1 replacing 2.

    Transpose this:

    31 | 14
    __| 43

    This goes into i, j like above.

    Similarly for k = 3:

    12 | 41
    __| 24

    For k = 4 transpositions of:

    12 | 31
    ___| 23

    equals:

    21 | 13
    __| 32

    The nD version uses extensions of these triangles.

    I will attach a doc file soon.
     
  11. Here is one version of the nD cross product of two vectors (see attachment).

    The properties are not proved in this version; the proofs will be posted later.
     

    Attached Files:

    Last edited: Nov 7, 2008
  12. HallsofIvy

    Staff Emeritus
    Science Advisor

    That was a typo. I have edited it.

    Of course it can't. The crucial property of the cross product of two vectors, in 3 dimensions, is that it is orthogonal to both vectors and spans the space of all vectors orthogonal to both. In 4 dimensions, the space of all vectors orthogonal to two given vectors has dimension 2 and cannot be spanned by a single vector. To get a single vector, you need to "cross multiply" THREE vectors:
    [tex](\vec{u}\times\vec{v}\times\vec{w})_h= \sum_i\sum_j\sum_k \epsilon_{ijkh}u_iv_jw_k[/tex]
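    (A sketch of this product in Python/numpy, added in editing; the function name and the determinant formulation are mine. Contracting n-1 vectors with the alternating symbol is a signed cofactor expansion, so component h is, up to sign, the determinant of the (n-1) x (n-1) matrix obtained by deleting column h.)

    [code]
    import numpy as np

    def generalized_cross(*vectors):
        """Product of n-1 vectors in R^n against the alternating symbol:
        w_h = sum_{i,j,k,...} eps_{ijk...h} u_i v_j w_k ...,
        computed via the equivalent cofactor (determinant) expansion."""
        M = np.array(vectors, dtype=float)   # shape (n-1, n)
        n = M.shape[1]
        assert M.shape[0] == n - 1, "need exactly n-1 vectors from R^n"
        # (-1)^(n-1) moves the free index h from last to first place;
        # (-1)^h is the usual cofactor sign along a formal top row.
        return np.array([(-1) ** (n - 1 + h) * np.linalg.det(np.delete(M, h, axis=1))
                         for h in range(n)])

    u = np.array([1, 0, 1, 0])
    v = np.array([0, -2, -1, 1])
    w = np.array([1, 0, -1, -1])     # one of the solutions found earlier
    x = generalized_cross(u, v, w)
    print(x @ u, x @ v, x @ w)       # all (numerically) zero
    [/code]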

     
  13. I stated that it gives a special vector orthogonal to both. I have the proof of this. I will post it soon.

    It doesn't span the 2-dimensional (or larger) orthogonal complement, but the product isn't useless or obsolete, since it leads to non-square determinants - look at the attachment and you'll see. Call it something else then, like "nD Shift Permutation Vector Product."
     
  14. Actually it looks like the table does not work (as the construction specifies) in odd dimensions, because you have [1]x[2] not equal to -[2]x[1] when using the table in 5D:

    [1]x[2] = +
    [1]x[3] = -
    [1]x[4] = +
    [1]x[5] = -
    [2]x[1] = +

    But the number triangle and formula do satisfy orthogonality and anticommutativity. In my terminology this vector product is:

    u x v = [k] (-1)^(k+1) { uIO_nm u_n v_m } ( 1 )

    with the same number triangles and Einstein summation assumed. The orthogonality property is proved using the 3xn determinant expression for the vector product, using the "constant multiple of rows", "sum of binomials" and "proportionality of rows" properties (in this order).
     
  15. daniel_i_l

    Gold Member

    In R^n, any set of n vectors {v_1,...,v_n} can be used to obtain an orthogonal set {u_1,...,u_n} so that for all k, Sp{u_1,...,u_k} = Sp{v_1,...,v_k}. One algorithm to do this is the Gram-Schmidt process:
    http://en.wikipedia.org/wiki/Gram–Schmidt_process
    It should be apparent from the process how to solve your problem.
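    For concreteness, here is a minimal Gram-Schmidt sketch in Python/numpy (added in editing; it assumes the inputs are linearly independent):

    [code]
    import numpy as np

    def gram_schmidt(vectors):
        """Orthonormalize linearly independent vectors so that
        span{u_1,...,u_k} = span{v_1,...,v_k} for every k."""
        basis = []
        for v in vectors:
            w = np.array(v, dtype=float)
            for u in basis:
                w -= (w @ u) * u     # remove the component along u
            basis.append(w / np.linalg.norm(w))
        return basis
    [/code]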
     
  16. HallsofIvy

    Staff Emeritus
    Science Advisor

    That's not true. The original set of vectors {v_1, ..., v_n} has to be a basis for R^n.

    What problem are you talking about?

    The original problem was "In R^n, given k<n linearly independent vectors, how do you find a vector that is orthogonal to all given vectors?"

    If you add n-k independent vectors to the given k vectors to make a basis for R^n and then apply Gram-Schmidt, the vectors {u_1,..., u_n} are not necessarily orthogonal to the original set of k vectors, so this does NOT solve the problem.
     
    Last edited: Nov 18, 2008
  17. Why not? The given set of k vectors and the first k vectors obtained from Gram-Schmidt have the same span, so each of the later vectors u_{k+1}, ..., u_n, being orthogonal to u_1, ..., u_k, is orthogonal to that whole span and hence to each of the original k vectors.
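    A quick numerical check of this point (a sketch added in editing, not from the thread), using numpy's QR factorization, which orthonormalizes the columns in order just as Gram-Schmidt does:

    [code]
    import numpy as np

    # Columns of A: the two given vectors, then two extra vectors chosen
    # (arbitrarily) to complete a basis of R^4.
    A = np.column_stack([[1.0,  0.0,  1.0, 0.0],   # v1
                         [0.0, -2.0, -1.0, 1.0],   # v2
                         [1.0,  0.0,  0.0, 0.0],   # e1
                         [0.0,  1.0,  0.0, 0.0]])  # e2
    Q, _ = np.linalg.qr(A)

    v1, v2 = A[:, 0], A[:, 1]
    u3, u4 = Q[:, 2], Q[:, 3]
    # span{q1, q2} = span{v1, v2}, so the later columns are orthogonal
    # to that whole subspace, in particular to v1 and v2 themselves.
    print(np.allclose([u3 @ v1, u3 @ v2, u4 @ v1, u4 @ v2], 0))   # True
    [/code]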
     
  18. Error in the attachment above:

    The reduction rule was left out. It states that a 2xp matrix with deleted columns, in which two columns remain, can be reduced by leaving out the deleted columns, once the columns have been transposed so that all the deleted entries are rightmost and the remaining indices are still in increasing (<) order from left to right (the term being multiplied by -1 if an odd number of transpositions was required).

    For px2 matrices with deleted rows, the deleted entries must be moved to the bottom (and a similar sign rule applies).
     
  19. Fredrik

    Staff Emeritus
    Science Advisor
    Gold Member

    The relevance of Gram-Schmidt here is that it's based on the same trick that the OP needs to use. E.g. when you have found two of the vectors in the orthonormal basis you're constructing, Gram-Schmidt tells you how to take a vector that isn't in the subspace spanned by the first two basis vectors and use the three vectors to construct a third basis vector that's orthogonal to the first two.

    If you want to find a vector that's orthogonal to the subspace U of the vector space V, then pick any vector x in V that isn't in U, and let y be its projection onto U. Then x-y is a nonzero vector orthogonal to U.
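    A short numpy sketch of this trick (added in editing; the choice of x and the least-squares formulation of the projection are mine):

    [code]
    import numpy as np

    # A basis of the subspace U (the thread's two vectors), as columns of B.
    B = np.array([[1,  0],
                  [0, -2],
                  [1, -1],
                  [0,  1]], dtype=float)

    x = np.array([1.0, 0.0, 0.0, 0.0])   # any vector of R^4 not in U

    # Projection of x onto U: minimize ||B c - x|| over c, then y = B c.
    c, *_ = np.linalg.lstsq(B, x, rcond=None)
    y = B @ c

    # x - y is orthogonal to U, i.e. to every column of B.
    print(np.allclose(B.T @ (x - y), 0))   # True
    print(x - y)                           # a concrete orthogonal vector
    [/code]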
     