Finding a vector orthogonal to others


by logarithmic
Tags: orthogonal, vector
logarithmic
#1
Nov1-08, 07:30 AM
P: 108
I'm sure there's a simple way of doing this but I just can't think of it.

In R^n, given k<n linearly independent vectors, how do you find a vector that is orthogonal to all of the given vectors?

I know cross product works for R^3, but what about R^n?
Hurkyl
#2
Nov1-08, 07:32 AM
Emeritus
Sci Advisor
PF Gold
P: 16,101
I assume you're taking a linear algebra course. Why not apply the techniques you've learned to solve the equation? Or... is it that you haven't even tried to convert the problem into an algebraic equation?
logarithmic
#3
Nov1-08, 07:44 AM
P: 108
Quote Quote by Hurkyl View Post
I assume you're taking a linear algebra course. Why not apply the techniques you've learned to solve the equation? Or... is it that you haven't even tried to convert the problem into an algebraic equation?
Actually I have tried that.

Suppose we're working in R^4. And we're given two vectors (1,0,1,0) and (0,-2,-1,1). Now I want to find (a,b,c,d) that is perpendicular to both.

So the equations I have are
<(1,0,1,0)|(a,b,c,d)>=0
<(0,-2,-1,1)|(a,b,c,d)>=0

The inner product here is the dot product.

This gives me 2 linear equations, but I need to solve for 4 variables, which isn't enough. Any help in finding another 2 equations so the system can be solved?
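(For what it's worth, the two equations already pin the answer down to a 2-dimensional solution space, which can be computed as the null space of the coefficient matrix. A minimal numerical sketch of my own, using numpy's SVD; this is not from the thread:)

```python
import numpy as np

# Rows of A are the two given vectors; solving A x = 0 yields
# every vector orthogonal to both of them.
A = np.array([[1, 0, 1, 0],
              [0, -2, -1, 1]], dtype=float)

# The rows of vt beyond rank(A) form an orthonormal basis of the null space.
_, s, vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-12))
ortho_basis = vt[rank:]            # shape (2, 4): two independent solutions

print(np.allclose(A @ ortho_basis.T, 0))  # True: each row solves both equations
```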

Hurkyl
#4
Nov1-08, 08:44 AM
Emeritus
Sci Advisor
PF Gold
P: 16,101



Quote Quote by logarithmic View Post
This gives me 2 linear equations, but I need to solve for 4 variables, which isn't enough.
What do you mean? You know full well how to find the complete solution to any system of linear equations, no matter how many variables and equations there happen to be.

If, for whatever reason, you really and truly must have a square coefficient matrix, then you can always add the equations 0 = 0 and 0 = 0. (Or replicate your other equations)
HallsofIvy
#5
Nov1-08, 08:51 AM
Math
Emeritus
Sci Advisor
Thanks
PF Gold
P: 38,881
Yes, if you are working in 4 dimensional space, the set of all vectors orthogonal to 2 given vectors is a 2 dimensional subspace, just as in 3 dimensions, the set of all vectors orthogonal to a single given vector is a 2 dimensional subspace. In n dimensions you have to give n-1 vectors before you can find a single vector such that all orthogonal vectors are multiples of it (a one-dimensional subspace).

In this example, you have equations a+ c= 0 and -2b- c+ d= 0. We can solve two (independent) equations in four variables for 2 of the variables in terms of the other two. Here, from the first equation, c= -a. Putting that into the second equation -2b+ a+ d= 0 so d= -a+ 2b. Choose whatever numbers you like for a and b and you will get a vector orthogonal to the original two. In particular, if you take a=1, b= 0, you get c= -1, d= -1: the vector (1, 0, -1, -1) is orthogonal to the two given vectors. If, instead, you take a= 0, b= 1, you get c= 0 and d= 2: the vector (0, 1, 0, 2) is also orthogonal to the two given vectors. In fact, the set of all vectors orthogonal to the two given vectors is the subspace spanned by {(1,0,-1,-1), (0,1,0,2)}.
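(A quick numerical check of the two particular solutions above; this sketch is mine, not part of the original post:)

```python
import numpy as np

v1 = np.array([1, 0, 1, 0])
v2 = np.array([0, -2, -1, 1])

u1 = np.array([1, 0, -1, -1])   # from a = 1, b = 0
u2 = np.array([0, 1, 0, 2])     # from a = 0, b = 1

# Both dot products vanish for each candidate, confirming orthogonality.
for u in (u1, u2):
    print(v1 @ u, v2 @ u)       # prints: 0 0
```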
talanum1
#6
Nov5-08, 11:14 AM
P: 25
I have a way to get an orthogonal vector for any two vectors in any dimension greater than or equal to 3. It actually does not involve finding the entire space of possible orthogonal vectors.

It must be that this method (nD cross product) gives you a special member of the possible orthogonal vectors, because the product satisfies the requirements for a Lie algebra and most of the other properties of the 3D cross product.

I will post it somewhere (please suggest where to).
HallsofIvy
#7
Nov5-08, 06:05 PM
Math
Emeritus
Sci Advisor
Thanks
PF Gold
P: 38,881
Do you really want us to tell you where to put it?

Is this the usual product of n-1 vectors with the alternating tensor?
talanum1
#8
Nov6-08, 11:40 AM
P: 25
Yes, because I cannot retype it using TeX on the forum while online. Alternatively, please advise where I can get a TeX editor.

No, it is not a product of n vectors, just of 2, in any dimension >= 3.

I'm not sure about the alternating tensor definition. Please direct me to the definition. I read Algebraic Geometry and it fits in with the determinant expansion of the cross product.
HallsofIvy
#9
Nov6-08, 05:01 PM
Math
Emeritus
Sci Advisor
Thanks
PF Gold
P: 38,881
The alternating tensor (which is not, I think, a true tensor) is [itex]\epsilon_{i_1 i_2 i_3 \cdots i_n}[/itex], equal to 1 if [itex]i_1 i_2 i_3 \cdots i_n[/itex] is an even permutation of [itex]123\cdots n[/itex], -1 if an odd permutation, and 0 in all other cases (basically, if any index appears twice). In three dimensions the cross product can be written as
[tex](u\times v)_k= \sum_{i=1}^3\sum_{j=1}^3\epsilon_{ijk}u_iv_j[/tex]
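(The epsilon-sum formula can be checked directly against the ordinary cross product. A small sketch of mine; the function names are made up for illustration:)

```python
import numpy as np

def perm_sign(idx):
    """Sign of the permutation idx of (0, 1, 2); 0 if any index repeats."""
    idx = list(idx)
    if len(set(idx)) != len(idx):
        return 0
    sign = 1
    for a in range(len(idx)):
        for b in range(a + 1, len(idx)):
            if idx[a] > idx[b]:
                sign = -sign     # count inversions to get the parity
    return sign

def cross_via_epsilon(u, v):
    """(u x v)_k = sum_{i,j} eps_{ijk} u_i v_j, as in the formula above."""
    w = np.zeros(3)
    for k in range(3):
        for i in range(3):
            for j in range(3):
                w[k] += perm_sign((i, j, k)) * u[i] * v[j]
    return w

u, v = np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0])
print(cross_via_epsilon(u, v))   # agrees with np.cross(u, v)
```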
talanum1
#10
Nov7-08, 02:43 AM
P: 25
I don't understand the [tex]\sum[/tex] over j.

What is the [tex]1^{n}[/tex]?

That definition cannot be extended directly to 4D and larger because it will look like:

[tex](u \times v)_{k} = (-1)^{k} \epsilon_{ijmk} u_{i} v_{j}[/tex]

in 4D, in which case you don't know what to do with the m index. I'm not sure about the [tex](-1)^{k}[/tex] now, because the tensor is already -1 or 1.

My definition in the 4D case fits in with this. You make the tables of even permutations of three indexes and select adjacent pairs. For k = 1 this gives:

23 | 42
__| 34

which are the pairs that go into i, j (in this order) in [tex]u_{i} v_{j}[/tex] and in reverse order into [tex]- u_{i} v_{j}[/tex]. Then you get a sum of 6 terms for k = 1.

For k = 2 you need odd permutations, which are even permutations followed by one transposition. Thus for k = 2 you need transpositions of:

13 | 41
__| 34

Note this is the previous triangle with 1 replacing 2.

Transpose this:

31 | 14
__| 43

This goes into i, j like above.

Similarly for k = 3:

12 | 41
__| 24

For k = 4 transpositions of:

12 | 31
___| 23

equals:

21 | 13
__| 32

The nD version uses extensions of these triangles.

I will attach a doc file soon.
talanum1
#11
Nov7-08, 05:29 AM
P: 25
Here is one version of the nD cross product of two vectors (see attachment).

The properties are not proved in this version; they will be posted later.
Attached Files
File Type: doc Vector Product, Product Reduction.doc (138.0 KB, 16 views)
HallsofIvy
#12
Nov7-08, 06:02 AM
Math
Emeritus
Sci Advisor
Thanks
PF Gold
P: 38,881
Quote Quote by talanum1 View Post
I don't understand the [tex]\sum[/tex] over j.

What is the [tex]1^{n}[/tex]?
That was a typo. I have edited it.

That definition cannot be extended directly to 4D and larger because it will look like:

[tex](u \times v)_{k} = (-1)^{k} \epsilon_{ijmk} u_{i} v_{j}[/tex]

in 4D, in which case you don't know what to do with the m index. I'm not sure about the [tex](-1)^{k}[/tex] now, because the tensor is already -1 or 1.
Of course it can't. The crucial property of the cross product of two vectors, in 3 dimensions, is that it is orthogonal to both vectors, and spans the space of all vectors orthogonal to both. In 4 dimensions, the space of all vectors orthogonal to two given vectors has dimension 2 and cannot be spanned by a single vector. To get a single vector, you need to "cross multiply" THREE vectors:
[tex](\vec{u}\times\vec{v}\times\vec{w})_h= \sum_{i=1}^4\sum_{j=1}^4\sum_{k=1}^4\epsilon_{ijkh}u_iv_jw_k[/tex]
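(To illustrate the point, the three-vector product in 4D really does return a single vector orthogonal to all three inputs. This sketch is my own, not from the thread; perm_sign computes the Levi-Civita symbol by counting inversions, and the third vector w is my arbitrary choice:)

```python
import numpy as np

def perm_sign(idx):
    """Levi-Civita symbol: parity of the index tuple, or 0 if an index repeats."""
    idx = list(idx)
    if len(set(idx)) != len(idx):
        return 0
    sign = 1
    for a in range(len(idx)):
        for b in range(a + 1, len(idx)):
            if idx[a] > idx[b]:
                sign = -sign
    return sign

def triple_cross_4d(u, v, w):
    """(u x v x w)_h = sum_{i,j,k} eps_{ijkh} u_i v_j w_k."""
    r = np.zeros(4)
    for h in range(4):
        for i in range(4):
            for j in range(4):
                for k in range(4):
                    r[h] += perm_sign((i, j, k, h)) * u[i] * v[j] * w[k]
    return r

u = np.array([1.0, 0.0, 1.0, 0.0])     # the thread's example vectors
v = np.array([0.0, -2.0, -1.0, 1.0])
w = np.array([1.0, 1.0, 0.0, 0.0])     # a third, independent vector (my choice)
r = triple_cross_4d(u, v, w)
print(r @ u, r @ v, r @ w)             # all three dot products are 0
```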

talanum1
#13
Nov7-08, 07:26 AM
P: 25
I stated that it gives a special vector orthogonal to both. I have the proof of this. I will post it soon.

It doesn't span 2D or larger, but the product isn't useless or obsolete, since it leads to non-square determinants - look at the attachment and you'll see. Call it something else then, like: "nD Shift Permutation Vector Product."
talanum1
#14
Nov18-08, 11:15 AM
P: 25
Actually it looks like the table does not work (as the construction specifies) for odd dimensions, because you get [1]x[2] ≠ -[2]x[1] when using the table in 5D:

[1]x[2] = +
[1]x[3] = -
[1]x[4] = +
[1]x[5] = -
[2]x[1] = +

But the number triangle and formula do satisfy orthogonality and anti-commutativity. In my terminology this vector product is:

u x v = [k] (-1)^(k+1) { uIO_nm u_n v_m } ( 1 )

with the same number triangles and Einstein summation assumed. The orthogonality property is proved using the 3xn determinant expression for the vector product, using the "constant multiple of rows", "sum of binomials" and "proportionality of rows" properties (in this order).
daniel_i_l
#15
Nov18-08, 01:18 PM
PF Gold
P: 867
In R^n, any set of n vectors {v_1,...,v_n} can be used to obtain an orthogonal set {u_1,...,u_n} so that for all k, Sp{u_1,...,u_k} = Sp{v_1,...,v_k}. One algorithm to do this is the Gram-Schmidt process:
http://en.wikipedia.org/wiki/Gram%E2...chmidt_process
It should be apparent from the process how to solve your problem.
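(For reference, a bare-bones sketch of the classical Gram-Schmidt step; this is my own code, and in practice one would prefer the modified variant or a QR factorization for numerical stability:)

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthogonalize in order, preserving the nested spans
    Sp{u_1..u_k} = Sp{v_1..v_k} for every k."""
    ortho = []
    for v in vectors:
        u = np.array(v, dtype=float)
        for w in ortho:
            u -= (u @ w) / (w @ w) * w   # remove the component along w
        ortho.append(u)
    return ortho

# A basis of R^4 starting with the thread's two example vectors.
vs = [np.array([1.0, 0.0, 1.0, 0.0]),
      np.array([0.0, -2.0, -1.0, 1.0]),
      np.array([1.0, 0.0, 0.0, 0.0]),
      np.array([0.0, 1.0, 0.0, 0.0])]
us = gram_schmidt(vs)
print(all(abs(us[i] @ us[j]) < 1e-10
          for i in range(4) for j in range(i + 1, 4)))  # True: pairwise orthogonal
```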
HallsofIvy
#16
Nov18-08, 01:50 PM
Math
Emeritus
Sci Advisor
Thanks
PF Gold
P: 38,881
Quote Quote by daniel_i_l View Post
In R^n, any set of n vectors {v_1,...,v_n} can be used to obtain an orthogonal set {u_1,...,u_n} so that for all k Sp{u_1,...,u_k} = Sp{v_1,...,v_k}.
That's not true. The original set of vectors {v_1, ..., v_n} has to be a basis for R^n.

One algorithm to do this is the Gram-Schmidt process:
http://en.wikipedia.org/wiki/Gram%E2...chmidt_process
It should be apparent from the process how to solve your problem.
What problem are you talking about?

The original problem was "In R^n, given k<n linearly independent vectors, how do you find a vector that is orthogonal to all given vectors."

If you add n-k independent vectors to the given k vectors to make a basis for R^n and then apply Gram-Schmidt, the vectors {u_1,..., u_n} are not necessarily orthogonal to the original set of k vectors so does NOT solve the problem.
adriank
#17
Nov21-08, 03:39 AM
P: 534
Quote Quote by HallsofIvy View Post
If you add n-k independent vectors to the given k vectors to make a basis for R^n and then apply Gram-Schmidt, the vectors {u_(k+1),..., u_n} are not necessarily orthogonal to the original set of k vectors so does NOT solve the problem.
Why not? The given set of k vectors and the set obtained by applying Gram-Schmidt have the same span.
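(This observation can be checked numerically: because Gram-Schmidt preserves the nested spans, u_3 and u_4 come out orthogonal not just to u_1 and u_2 but to the original v_1 and v_2 as well. A sketch of mine, reusing the thread's example vectors:)

```python
import numpy as np

def gram_schmidt(vectors):
    ortho = []
    for v in vectors:
        u = np.array(v, dtype=float)
        for w in ortho:
            u -= (u @ w) / (w @ w) * w
        ortho.append(u)
    return ortho

v1 = np.array([1.0, 0.0, 1.0, 0.0])
v2 = np.array([0.0, -2.0, -1.0, 1.0])
# Extend to a basis of R^4 with two standard basis vectors (my choice).
us = gram_schmidt([v1, v2,
                   np.array([1.0, 0.0, 0.0, 0.0]),
                   np.array([0.0, 1.0, 0.0, 0.0])])

# u_3 and u_4 are orthogonal to the ORIGINAL v1 and v2, because
# Sp{u_1, u_2} = Sp{v1, v2} and u_3, u_4 are orthogonal to that span.
print(all(abs(u @ v) < 1e-10 for u in us[2:] for v in (v1, v2)))  # True
```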
talanum1
#18
Dec10-08, 02:36 AM
P: 25
Error in attachment above:

The reduction rule was left out. It states that the 2xp matrices with columns deleted so that two columns remain can be reduced by leaving out the deleted columns, after the columns have been transposed so that all the deleted entries are rightmost and the remaining indices are still in < order from left to right (with the term multiplied by -1 if an odd number of transpositions was required).

For px2 matrices with deleted rows, the deleted entries must be bottommost (and a similar multiplication is required).

