- #1

Thank-you

- Thread starter logan3

- #2

mfb

Mentor

With reasonable generalizations of the cross-product, you can show it in exactly the same way you do in R^3.
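A quick numerical illustration of the R^3 case being generalized (a sketch using NumPy, not part of the thread): the ordinary cross product is orthogonal to both of its factors, which is the component computation one redoes in higher dimensions.

```python
import numpy as np

# two arbitrary vectors in R^3
rng = np.random.default_rng(1)
a, b = rng.standard_normal(3), rng.standard_normal(3)

c = np.cross(a, b)
# a . (a x b) = 0 and b . (a x b) = 0: the orthogonality to be generalized
assert np.isclose(a @ c, 0) and np.isclose(b @ c, 0)
```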

- #3

mathwonk

Science Advisor

Homework Helper

2020 Award

In n-space one defines the cross product of n-1 vectors. It is non-zero if and only if they are independent, and then it is the oriented orthogonal complementary vector to their linear span, with length given by the volume of the block they span. Thus if e1,...,en is an oriented orthonormal basis, the cross product of e1,...,en-1 is en.

- #4

Matterwave

Science Advisor

Gold Member

Notice, however, that this generalization has orthogonality explicitly as part of its definition. So it probably doesn't help the OP except to give him the trivial answer (that it is so by definition).
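The n-dimensional cross product described above can be computed by the cofactor expansion mentioned later in the thread. A small sketch (mine, using NumPy; the function name `cross_nd` is hypothetical) that also confirms the example `e1 x ... x e(n-1) = en`:

```python
import numpy as np

def cross_nd(vectors):
    """Cross product of n-1 vectors in R^n, via cofactor expansion
    along a formal last row of unit vectors e1,...,en."""
    vs = np.asarray(vectors, dtype=float)   # shape (n-1, n)
    n = vs.shape[1]
    assert vs.shape == (n - 1, n), "need n-1 vectors in R^n"
    result = np.empty(n)
    for j in range(n):
        # cofactor of the e_j entry in the last row: delete column j,
        # with the alternating sign (-1)^((n-1)+j) (0-indexed row/column)
        minor = np.delete(vs, j, axis=1)
        result[j] = (-1) ** (n - 1 + j) * np.linalg.det(minor)
    return result

# e1 x e2 x e3 = e4 in R^4, as in the post above
print(cross_nd([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]))  # [0. 0. 0. 1.]
```

For n = 3 this reduces to the familiar formula, so it agrees with `np.cross` on pairs of 3-vectors.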

- #5

mathwonk

Science Advisor

Homework Helper

2020 Award

It seemed to me he asked two questions: 1) define the n-dimensional cross product, and 2) prove orthogonality. There are of course two possible definitions: a conceptual one, which explains the meaning of the construction and in which orthogonality is part of the definition, and a purely numerical one, which conceals the meaning until one verifies orthogonality by a dot-product computation. I always prefer the conceptual definition as more helpful to understanding, but of course one should include a proof that the object exists, which in this case seemed clear.

Nonetheless, if you prefer a numerical expression, you may use the usual determinant expression for the cross product: given n-1 vectors v1,...,vn-1, define a linear function of a vector w by the determinant of the matrix with rows v1,...,vn-1,w. This function equals the dot product of w with a unique vector, called the cross product of the v's. Orthogonality then follows from the usual properties of the determinant: the determinant is zero in this case if and only if w depends linearly on the v's.

(A purely numerical expression for the coefficients of the cross product in terms of the coefficients of the v's is obtained by formally expanding the determinant with the v's in the first n-1 rows and the unit vectors e1,...,en as entries in the last row. Only a masochist would then prove the orthogonality by taking the n-1 dot products.)
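The two facts in this post can be checked numerically. A small sketch (mine, using NumPy): the determinant with w in the last row equals the dot product of w with the cross product, and that determinant vanishes whenever w is one of the v's, which is exactly the orthogonality.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
vs = rng.standard_normal((n - 1, n))  # n-1 generic vectors in R^n

# cross product via cofactor expansion along the formal last row
cross = np.array([(-1) ** (n - 1 + j) * np.linalg.det(np.delete(vs, j, axis=1))
                  for j in range(n)])

w = rng.standard_normal(n)
det_val = np.linalg.det(np.vstack([vs, w]))

# the linear functional w -> det(v1,...,vn-1,w) is the dot product with cross
assert np.isclose(det_val, cross @ w)

# orthogonality: the determinant vanishes when w is any v_i, so cross . v_i = 0
assert np.allclose(vs @ cross, 0)
```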


