Finding a vector orthogonal to others

In summary, the conversation discusses how to find a vector orthogonal to a set of given vectors in n-dimensional space. Suggested methods include solving the homogeneous linear system of orthogonality conditions, the Gram-Schmidt process, and generalizations of the cross product; an n-dimensional cross product is proposed and its properties are discussed. The conversation also covers the alternating (Levi-Civita) tensor and its relationship to the cross product in 3-dimensional space.
  • #1
logarithmic
I'm sure there's a simple way of doing this but I just can't think of it.

In R^n, given k < n linearly independent vectors, how do you find a vector that is orthogonal to all of the given vectors?

I know cross product works for R^3, but what about R^n?
 
  • #2
I assume you're taking a linear algebra course. Why not apply the techniques you've learned to solve the equation? Or... is it that you haven't even tried to convert the problem into an algebraic equation?
 
  • #3
Hurkyl said:
I assume you're taking a linear algebra course. Why not apply the techniques you've learned to solve the equation? Or... is it that you haven't even tried to convert the problem into an algebraic equation?
Actually I have tried that.

Suppose we're working in R^4. And we're given two vectors (1,0,1,0) and (0,-2,-1,1). Now I want to find (a,b,c,d) that is perpendicular to both.

So the equations I have are
<(1,0,1,0)|(a,b,c,d)>=0
<(0,-2,-1,1)|(a,b,c,d)>=0

The inner product here is the dot product.

This gives me 2 linear equations, but I need to solve for 4 variables, which isn't enough. Any help in finding another 2 equations so the system can be solved?
 
  • #4
logarithmic said:
This gives me 2 linear equations, but I need to solve for 4 variables, which isn't enough.
What do you mean? You know full well how to find the complete solution to any system of linear equations, no matter how many variables and equations there happen to be.

If, for whatever reason, you really and truly must have a square coefficient matrix, then you can always add the equations 0 = 0 and 0 = 0. :tongue: (Or replicate your other equations)
 
  • #5
Yes, if you are working in 4-dimensional space, the set of all vectors orthogonal to 2 given vectors is a 2-dimensional subspace, just as in 3 dimensions the set of all vectors orthogonal to a single given vector is a 2-dimensional subspace. In n dimensions you have to give n-1 vectors before you can find a single vector such that all orthogonal vectors are multiples of it (a one-dimensional subspace).

In this example, you have the equations a + c = 0 and -2b - c + d = 0. We can solve two (independent) equations in four variables for 2 of the variables in terms of the other two. Here, from the first equation, c = -a. Putting that into the second equation gives -2b + a + d = 0, so d = -a + 2b. Choose whatever numbers you like for a and b and you will get a vector orthogonal to the original two. In particular, if you take a = 1, b = 0, you get c = -1, d = -1: the vector (1, 0, -1, -1) is orthogonal to the two given vectors. If, instead, you take a = 0, b = 1, you get c = 0 and d = 2: the vector (0, 1, 0, 2) is also orthogonal to the two given vectors. In fact, the set of all vectors orthogonal to the two given vectors is the subspace spanned by {(1, 0, -1, -1), (0, 1, 0, 2)}.
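
A minimal numpy sketch of the same computation (illustrative only, not part of the original post): the vectors orthogonal to both given vectors form the null space of the coefficient matrix, and the SVD route with a 1e-12 tolerance is just one convenient choice.

[code]
import numpy as np

# Rows of A are the given vectors; the vectors orthogonal to both
# are exactly the solutions of A @ x = 0, i.e. the null space of A.
A = np.array([[1.0,  0.0,  1.0, 0.0],
              [0.0, -2.0, -1.0, 1.0]])

# The rows of Vt corresponding to (numerically) zero singular values
# form an orthonormal basis of the null space.
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-12))
null_basis = Vt[rank:]

print(null_basis)        # two basis vectors, spanning the same plane
                         # as {(1,0,-1,-1), (0,1,0,2)} above
print(A @ null_basis.T)  # ~ 0: orthogonal to both given vectors
[/code]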
 
  • #6
I have a way to get an orthogonal vector for any two vectors in any dimension greater than or equal to 3. It actually does not involve finding the entire space of possible orthogonal vectors.

It must be that this method (nD cross product) gives you a special member of the possible orthogonal vectors, because the product satisfies the requirements for a Lie algebra and most of the other properties of the 3D cross product.

I will post it somewhere (please suggest where to).:bugeye:
 
  • #7
Do you really want us to tell you where to put it?:rofl:

Is this the usual product of n vectors with the alternating tensor?
 
  • #8
Yes, because I cannot retype it using TeX on the forum while online. Alternatively, please advise where I can get a TeX editor.

No, it is not a product of n vectors, just of 2, in any dimension >= 3.

I'm not sure about the alternating tensor definition. Please direct me to the definition. I read Algebraic Geometry and it fits in with the determinant expansion of the cross product.
 
  • #9
The alternating tensor (which is not, I think, a true tensor) is [itex]\epsilon_{i_1 i_2 i_3 \ldots i_n}[/itex], equal to 1 if [itex]i_1 i_2 i_3 \ldots i_n[/itex] is an even permutation of [itex]1 2 3 \ldots n[/itex], -1 if it is an odd permutation, and 0 in all other cases (basically, if any index appears twice). In three dimensions the cross product can be written as
[tex](u\times v)_k= \sum_{i=1}^3\sum_{j=1}^3\epsilon_{ijk}u_iv_j[/tex]
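
A quick numerical sanity check of this formula (an added Python/numpy sketch, not part of the original post; the product-based sign computation is just one standard trick):

[code]
import numpy as np
from itertools import permutations

# Build the 3D Levi-Civita symbol eps[i, j, k] (0-based indices).
eps = np.zeros((3, 3, 3))
for i, j, k in permutations(range(3)):
    # Sign of the permutation (i, j, k) of (0, 1, 2).
    eps[i, j, k] = np.sign((j - i) * (k - i) * (k - j))

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# (u x v)_k = sum_{i,j} eps_{ijk} u_i v_j
w = np.einsum('ijk,i,j->k', eps, u, v)
print(w)               # [-3.  6. -3.]
print(np.cross(u, v))  # same result
[/code]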
 
  • #10
I don't understand the [itex]\sum_j[/itex]. What is the [itex]1^{n}[/itex]?

That definition cannot be extended directly to 4D and larger, because it will look like:

[tex](u \times v)_k = (-1)^{k} \epsilon_{ijmk} u_i v_j[/tex]

in 4D. In which case you don't know what to do with the m index. I'm not sure about the [itex](-1)^{k}[/itex] now, because the tensor goes to -1 or 1.

My definition in the 4D case fits in with this. You make the tables of even permutations of three indices and select adjacent pairs. For k = 1 this gives:

23 | 42
__| 34

which are the pairs that go into i, j (in this order) in [tex]u_{i} v_{j}[/tex] and in reverse order into [tex]- u_{i} v_{j}[/tex]. Then you get a sum of 6 terms for k = 1.

For k = 2 you need odd permutations, which are even permutations followed by one transposition. Thus for k = 2 you need transpositions of:

13 | 41
__| 34

Note this is: 1 replaces 2 in the previous triangle.

Transpose this:

31 | 14
__| 43

This goes into i, j like above.

Similarly for k = 3:

12 | 41
__| 24

For k = 4 transpositions of:

12 | 31
___| 23

equals:

21 | 13
__| 32

The nD version uses extensions of these triangles.

I will attach a doc file soon.
 
  • #11
Here is one version of the nD cross product of two vectors (see attachment).

The properties are not proved in this version; the proofs will be posted later.
 

Attachments

  • Vector Product, Product Reduction.doc
  • #12
talanum1 said:
I don't understand the [itex]\sum_j[/itex]. What is the [itex]1^{n}[/itex]?
That was a typo. I have edited it.

That definition cannot be extended directly to 4D and larger, because it will look like:

[tex](u \times v)_k = (-1)^{k} \epsilon_{ijmk} u_i v_j[/tex]

in 4D. In which case you don't know what to do with the m index. I'm not sure about the [itex](-1)^{k}[/itex] now, because the tensor goes to -1 or 1.
Of course it can't. The crucial property of the cross product of two vectors, in 3 dimensions, is that it is orthogonal to both vectors and spans the space of all vectors orthogonal to both. In 4 dimensions, the space of all vectors orthogonal to two given vectors has dimension 2 and cannot be spanned by a single vector. To get a single vector, you need to "cross multiply" THREE vectors:
[tex](\vec{u}\times\vec{v}\times\vec{w})_h= \sum_i\sum_j\sum_k \epsilon_{ijkh}u_iv_jw_k[/tex]
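
An illustrative numpy sketch of this three-vector product in R^4 (not from the thread; the Levi-Civita helper built from permutation parity is my own construction):

[code]
import numpy as np
from itertools import permutations

def levi_civita(n):
    """n-dimensional Levi-Civita symbol as a dense array."""
    eps = np.zeros((n,) * n)
    for perm in permutations(range(n)):
        # Parity via counting inversions.
        inv = sum(perm[a] > perm[b]
                  for a in range(n) for b in range(a + 1, n))
        eps[perm] = -1.0 if inv % 2 else 1.0
    return eps

eps4 = levi_civita(4)
u = np.array([1.0,  0.0,  1.0, 0.0])
v = np.array([0.0, -2.0, -1.0, 1.0])
w = np.array([0.0,  0.0,  1.0, 0.0])

# (u x v x w)_h = sum_{i,j,k} eps_{ijkh} u_i v_j w_k
x = np.einsum('ijkh,i,j,k->h', eps4, u, v, w)
print(x @ u, x @ v, x @ w)  # 0.0 0.0 0.0: orthogonal to all three
[/code]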

 
  • #13
I stated that it gives a special vector orthogonal to both. I have the proof of this. I will post it soon.

It doesn't span the 2-dimensional (or larger) orthogonal complement, but the product isn't useless or obsolete, since it leads to non-square determinants - look at the attachment and you'll see. Call it something else then, like "nD Shift Permutation Vector Product".
 
  • #14
Actually it looks like the table does not work (as the construction specifies) for odd dimensions, because you have [1]x[2] not equal to -[2]x[1] when using the table in 5D.

[1]x[2] = +
[1]x[3] = -
[1]x[4] = +
[1]x[5] = -
[2]x[1] = +

But the number triangle and formula do satisfy orthogonality and anti-commutativity. In my terminology this vector product is:

u x v = [k] (-1)^(k+1) { uIO_nm u_n v_m }    (1)

with the same number triangles and Einstein summation assumed. The orthogonality property is proved using the 3xn determinant expression for the vector product, using the "constant multiple of rows", "sum of binomials" and "proportionality of rows" properties (in this order).
 
  • #15
In R^n, any set of n vectors {v_1,...,v_n} can be used to obtain an orthogonal set {u_1,...,u_n} so that for all k, Sp{u_1,...,u_k} = Sp{v_1,...,v_k}. One algorithm to do this is the Gram-Schmidt process:
http://en.wikipedia.org/wiki/Gram–Schmidt_process
It should be apparent from the process how to solve your problem.
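
For reference (an added Python/numpy sketch, not part of the original post), a minimal classical Gram-Schmidt; it assumes the input rows are linearly independent, and a robust implementation would check for near-zero norms:

[code]
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: linearly independent rows in,
    orthonormal rows with the same partial spans out."""
    basis = []
    for v in vectors:
        # Subtract the projection of v onto each earlier basis vector.
        for u in basis:
            v = v - (v @ u) * u
        basis.append(v / np.linalg.norm(v))
    return np.array(basis)

V = np.array([[1.0,  0.0,  1.0, 0.0],
              [0.0, -2.0, -1.0, 1.0]])
Q = gram_schmidt(V)
print(Q @ Q.T)  # ~ identity: the rows are orthonormal
[/code]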
 
  • #16
daniel_i_l said:
In R^n, any set of n vectors {v_1,...,v_n} can be used to obtain an orthogonal set {u_1,...,u_n} so that for all k Sp{u_1,...,u_k} = Sp{v_1,...,v_k}.
That's not true. The original set of vectors {v_1, ..., v_n} has to be a basis for R^n.

One algorithm to to this is the Gram Schmidt process:
http://en.wikipedia.org/wiki/Gram–Schmidt_process
It should be apparent from the process how to solve your problem.
What problem are you talking about?

The original problem was "In R^n, given k < n linearly independent vectors, how do you find a vector that is orthogonal to all of the given vectors?"

If you add n-k independent vectors to the given k vectors to make a basis for R^n and then apply Gram-Schmidt, the vectors {u_(k+1),..., u_n} are not necessarily orthogonal to the original set of k vectors, so this does NOT solve the problem.
 
  • #17
HallsofIvy said:
If you add n-k independent vectors to the given k vectors to make a basis for R^n and then apply Gram-Schmidt, the vectors {u_(k+1),..., u_n} are not necessarily orthogonal to the original set of k vectors so does NOT solve the problem.

Why not? The given set of k vectors and the first k vectors obtained by applying Gram-Schmidt have the same span, so u_(k+1), ..., u_n, being orthogonal to u_1, ..., u_k, are orthogonal to that span and hence to the original k vectors.
 
  • #18
Error in attachment above:

The reduction rule was left out. It states that a 2xp matrix with deleted columns, such that two columns remain, can be reduced by leaving out the deleted columns, after the columns have been transposed so that all the deleted entries are rightmost and the remaining indices are still in increasing order from left to right (with the term multiplied by -1 if an odd number of transpositions was required).

For px2 matrices with deleted rows, the deleted entries must be bottommost (and a similar multiplication is required).
 
  • #19
The relevance of Gram-Schmidt here is that it's based on the same trick that the OP needs to use. E.g. when you have found two of the vectors in the orthonormal basis you're constructing, Gram-Schmidt tells you how to take a vector that isn't in the subspace spanned by the first two basis vectors and use the three vectors to construct a third basis vector that's orthogonal to the first two.

If you want to find a vector that's orthogonal to a subspace U of the vector space V, pick any vector x in V that isn't in U, and let y be the projection of x onto U. Then x - y is a nonzero vector orthogonal to U.
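
A numpy sketch of exactly this projection trick (an added illustration, not from the original post; using SVD to get an orthonormal basis of the span is just one convenient choice):

[code]
import numpy as np

def orthogonal_to(U_rows, x):
    """Return x minus its projection onto span(U_rows)."""
    rank = np.linalg.matrix_rank(U_rows)
    Q = np.linalg.svd(U_rows)[2][:rank]  # orthonormal basis of the span
    y = Q.T @ (Q @ x)                    # projection of x onto the span
    return x - y                         # orthogonal to every row

U = np.array([[1.0,  0.0,  1.0, 0.0],
              [0.0, -2.0, -1.0, 1.0]])
x = np.array([1.0, 0.0, 0.0, 0.0])  # any vector not in span(U)
z = orthogonal_to(U, x)
print(U @ z)  # ~ [0, 0]: z is orthogonal to both given vectors
[/code]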
 

What does it mean for a vector to be orthogonal?

An orthogonal vector is one that is perpendicular to another vector: the two vectors form a 90 degree angle with each other, or equivalently, their dot product is zero.

How do I find a vector that is orthogonal to two given vectors?

To find a vector that is orthogonal to two given vectors in R^3, you can use the cross product. Take the cross product of the two vectors, and the resulting vector will be perpendicular to both of them. (In higher dimensions, solve the linear system of orthogonality conditions instead, as discussed in the thread above.)
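
For concreteness, a quick numpy check (added here, not part of the original FAQ):

[code]
import numpy as np

a = np.array([3.0, -3.0, 1.0])
b = np.array([4.0,  9.0, 2.0])
n = np.cross(a, b)
print(n, n @ a, n @ b)  # [-15.  -2.  39.]  0.0  0.0
[/code]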

Can a vector be orthogonal to more than two vectors?

Yes, a vector can be orthogonal to more than two vectors; it just needs to have a zero dot product with each of them. Note, however, that in R^n at most n nonzero vectors can be mutually orthogonal.

Why is finding orthogonal vectors important in mathematics and science?

Finding orthogonal vectors is important in mathematics and science because orthogonal directions can be treated independently, which simplifies problems involving multiple vectors. It also has many applications in fields such as physics, engineering, and computer graphics.

Are there any real-life examples of orthogonal vectors?

Yes, there are many real-life examples of orthogonal vectors. For example, near the surface of the Earth the force of gravity acts approximately perpendicular to level ground. In computer graphics, orthogonal basis vectors are used to set up cameras and build 3D images and animations.
