How to show u x v in R^n space is orthogonal to u and v?

  • #1
I had a problem where I showed that [itex]u \times v[/itex] in [itex]R^3[/itex] was orthogonal to [itex]u[/itex] and [itex]v[/itex]. I was wondering how I could show it in [itex]R^n[/itex]. What is the notation/expression for a cross product in [itex]R^n[/itex], and how would I show that all [itex]n[/itex] coordinates cancel out?

Thank you
 
  • #2
How do you define ##u \times v## in R^n?
With reasonable generalizations of the cross-product, you can show it in exactly the same way you do in R^3.
 
  • #3
In ##R^n## one defines the cross product of ##n-1## vectors. It is nonzero if and only if they are independent, and then it is the vector spanning the orthogonal complement of their linear span, with orientation determined by theirs and length given by the volume of the block (parallelepiped) they span. Thus if ##e_1, \dots, e_n## is an oriented orthonormal basis, the cross product of ##e_1, \dots, e_{n-1}## is ##e_n##.
 
  • #4
Notice, however, that this generalization has orthogonality built into its definition. So it probably doesn't help the OP, except to give the trivial answer that it is so by definition.
 
  • #5
It seemed to me he asked two questions: 1) define the n-dimensional cross product, and 2) prove orthogonality. There are of course two possible definitions: a conceptual one, which explains the meaning of the construction and in which orthogonality is part of the definition, and a purely numerical one, which conceals the meaning until one checks orthogonality by a dot product computation. I always prefer the conceptual definition as more helpful to understanding, but of course one should include a proof that the vector exists, which in this case seemed clear.

Nonetheless, if you prefer a numerical expression, you may use the usual determinant expression for the cross product: given ##n-1## vectors ##v_1, \dots, v_{n-1}##, define a linear function of a vector ##w## as the determinant of the matrix with rows ##v_1, \dots, v_{n-1}, w##. This function equals the dot product of ##w## with a unique vector, called the cross product of the ##v##'s. The orthogonality then follows from the usual properties of the determinant: it is zero here if and only if ##w## depends linearly on the ##v##'s, in particular whenever ##w## is one of the ##v##'s.
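
In symbols, a minimal sketch of that argument (just restating the definition above):
$$(v_1 \times \cdots \times v_{n-1}) \cdot w \;=\; \det\begin{pmatrix} v_1 \\ \vdots \\ v_{n-1} \\ w \end{pmatrix},$$
and setting ##w = v_i## produces a matrix with a repeated row, so the determinant, and hence the dot product, vanishes.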

(A purely numerical expression for the components of the cross product in terms of the components of the ##v##'s is obtained by formally expanding the determinant with the ##v##'s in the first ##n-1## rows and the unit vectors ##e_1, \dots, e_n## as the entries of the last row. Only a masochist would prove the orthogonality by then taking the ##n-1## dot products by hand.)
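
A computer, on the other hand, doesn't mind the masochistic part. Below is a rough numerical sketch of that cofactor expansion (the function name cross_n and the use of numpy are my own choices for illustration, not anything standard), which also lets one spot-check the orthogonality:

[code]
import numpy as np

def cross_n(vectors):
    # Generalized cross product of n-1 vectors in R^n.
    # Component k is the cofactor obtained by deleting column k from the
    # (n-1) x n matrix of row vectors, with the sign coming from formal
    # expansion along the last row of unit vectors e_1, ..., e_n.
    A = np.asarray(vectors, dtype=float)   # shape (n-1, n)
    m, n = A.shape
    assert m == n - 1, "need exactly n-1 vectors in R^n"
    result = np.empty(n)
    for k in range(n):
        minor = np.delete(A, k, axis=1)    # drop column k
        result[k] = (-1) ** (n - 1 + k) * np.linalg.det(minor)
    return result

# Example in R^4: the cross product of e1, e2, e3 is e4, as in post #3.
vs = [[1, 0, 0, 0],
      [0, 1, 0, 0],
      [0, 0, 1, 0]]
w = cross_n(vs)
print(w)                                   # [0. 0. 0. 1.]
print([float(np.dot(w, v)) for v in vs])   # all (numerically) zero
[/code]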
 
