## Generalizing Cross Product

I'm taking multivariate calculus and my teacher just introduced the concept of cross products a week ago. Reading the Wikipedia page, I see that cross products only work in three and seven dimensions, which is puzzling.

One use of the cross product in our class is to find a vector orthogonal to two given vectors. My question is: can this be generalized to $n$ dimensions to find a vector orthogonal to $n-1$ given vectors? And what is the formal method/operation for doing this?

For example, given $u = \left(1,0,0,0\right)$, $v = \left(0,1,0,0\right)$, $w = \left(0,0,1,0\right)$, a vector orthogonal to $u$, $v$, and $w$ is given by:

$$\left|\begin{array}{cccc} e_{1} & e_{2} & e_{3} & e_{4} \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{array}\right| = -e_{4}$$

(Expanding along the first row, the $e_4$ cofactor carries the sign $(-1)^{1+4} = -1$; placing the $e_i$ row last instead gives $+e_4$. Either sign is orthogonal to $u$, $v$, and $w$.)

I have read a bit about Hodge duality, exterior products, and $k$-vectors. Much of it was confusing, so if you use them, could you explain the ideas involved? I have little background in linear algebra and tensor theory.
Mentor

Consider an $n \times n$ matrix whose first $n-1$ columns (or rows) are filled with the $n-1$ vectors. Now take each entry of the remaining column (or row) to be the cofactor of that position: the determinant of the $(n-1) \times (n-1)$ matrix you get by removing the last column (or row) and the row (or column) of the entry, multiplied by the alternating sign $(-1)^{i+n}$. This might look complicated, but it is easy to show for the conventional cross product: $$\begin{pmatrix} a_1 & b_1 &|& \color{red}{a_2b_3-b_2a_3}\\ \color{red}{a_2} & \color{red}{b_2} &|& a_3b_1-b_3a_1 \\ \color{red}{a_3} & \color{red}{b_3} &|& a_1b_2-b_1a_2 \end{pmatrix}$$ This gives a vector which is orthogonal to all of the other $n-1$ vectors.
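The cofactor recipe above can be sketched in code. Here is a minimal pure-Python version (the helper names `det` and `cross` are my own, not from any library), which builds component $i$ as the signed minor obtained by deleting row $i$:

```python
def det(m):
    # Determinant by Laplace expansion along the first row
    # (fine for the small matrices used here).
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def cross(*vectors):
    # Generalized cross product of n-1 vectors in R^n.
    # Component i is the cofactor of entry (i, n) of the n x n matrix whose
    # first n-1 columns are the given vectors: the determinant of the minor
    # times the alternating sign (-1)^(i+n), with i and n 1-indexed.
    n = len(vectors) + 1
    assert all(len(v) == n for v in vectors), "need n-1 vectors of length n"
    result = []
    for i in range(n):  # 0-indexed row, so the sign is (-1)^((i+1) + n)
        minor = [[v[r] for v in vectors] for r in range(n) if r != i]
        result.append((-1) ** (i + 1 + n) * det(minor))
    return result

print(cross((1, 0, 0), (0, 1, 0)))                       # [0, 0, 1]
print(cross((1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0)))   # [0, 0, 0, 1]
```

Orthogonality can be checked directly: `sum(a * b for a, b in zip(cross(u, v, w), u))` comes out to 0 for each input vector.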
Recognitions: Gold Member, Homework Help, Science Advisor

Here is an argument based on the fact that the scalar product is a nondegenerate bilinear form, meaning the map $\mathbb{R}^n \to (\mathbb{R}^n)^*: x \mapsto \langle x, \cdot \rangle$ is surjective. (Indeed, if $f: \mathbb{R}^n \to \mathbb{R}$ is any linear map, then as a matrix it is a $1 \times n$ matrix, i.e. a vector; call it $x$. Then $f = \langle x, \cdot \rangle$. Here $(\mathbb{R}^n)^*$ is the set of all linear maps $\mathbb{R}^n \to \mathbb{R}$.) Now consider the linear map $f: \mathbb{R}^n \to \mathbb{R}$ defined as "$w \mapsto$ determinant of the matrix whose first $n-1$ columns are the $n-1$ vectors $v_1, \dots, v_{n-1}$ you want the cross product of, and whose $n$th column is $w$". This is linear by properties of the determinant. So there exists a vector $x$ such that $f = \langle x, \cdot \rangle$. This $x$ is the cross product of $v_1, \dots, v_{n-1}$, in the sense that $\langle x, v_k \rangle = 0$ for all $k$ (by the property of the determinant that says that if the columns of $A$ are linearly dependent, then $\det(A) = 0$).
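This argument condenses into two displays (a sketch of the same reasoning, with the inner product written out):

```latex
% f is linear in w, so f = <x, .> for a unique x; that x is the cross product.
\[
  f(w) \;=\; \det\bigl(v_1 \,|\, v_2 \,|\, \cdots \,|\, v_{n-1} \,|\, w\bigr)
       \;=\; \langle x, w \rangle
  \quad \text{for a unique } x \in \mathbb{R}^n,
\]
\[
  \langle x, v_k \rangle
  \;=\; \det\bigl(v_1 \,|\, \cdots \,|\, v_{n-1} \,|\, v_k\bigr)
  \;=\; 0 \quad (k = 1, \dots, n-1),
\]
% since a determinant with a repeated column vanishes, x is orthogonal
% to every v_k.
```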
