Cross product in arbitrary field

SUMMARY

The discussion centers on proving that if the cross product of two vectors \( a \) and \( b \) over an arbitrary field \( \mathbb{F} \) is zero, then \( a \) and \( b \) are linearly dependent. An attempted counterexample using vectors in \( \mathbb{Z}_5^3 \) fails because the vectors turn out to be dependent after all. The conclusion is that the claim holds over arbitrary fields, proved via determinant properties from linear algebra after extending \( a \) and \( b \) to a basis.

PREREQUISITES
  • Understanding of vector spaces over arbitrary fields
  • Knowledge of the properties of the cross product in \( \mathbb{R}^3 \)
  • Familiarity with determinants and their role in linear algebra
  • Basic concepts of linear dependence and independence
NEXT STEPS
  • Study the properties of vector spaces over finite fields, particularly \( \mathbb{Z}_p \)
  • Learn about the geometric interpretation of the cross product in higher dimensions
  • Explore the relationship between determinants and linear transformations
  • Investigate the implications of linear dependence in various algebraic structures
USEFUL FOR

Mathematicians, students of linear algebra, and anyone interested in the properties of vector spaces and fields will benefit from this discussion.

jostpuur
Let \( \mathbb{F} \) be an arbitrary field, and let \( a,b\in\mathbb{F}^3 \) be vectors of the three-dimensional vector space. How do you prove that if \( a\times b=0 \), then \( a \) and \( b \) are linearly dependent?

Consider the following attempted counterexample. In \( \mathbb{R}^3 \),

\[
\left(\begin{array}{c} 1 \\ 4 \\ 2 \end{array}\right)
\times
\left(\begin{array}{c} 2 \\ 3 \\ 4 \end{array}\right)
=
\left(\begin{array}{c} 10 \\ 0 \\ -5 \end{array}\right)
\]

holds. Since 5 is a prime number, \( \mathbb{Z}_5 \) is a field. In \( (\mathbb{Z}_5)^3 \),

\[
\left(\begin{array}{c} [1] \\ [4] \\ [2] \end{array}\right)
\times
\left(\begin{array}{c} [2] \\ [3] \\ [4] \end{array}\right)
=
\left(\begin{array}{c} [0] \\ [0] \\ [0] \end{array}\right)
\]

This might look like a counterexample to the claim. One might suspect that the claim holds, say, when \( \mathbb{F} \) is a subfield of \( \mathbb{C} \), but fails in general.

A closer look reveals that the attempted counterexample does not work, because

\[
[2]\left(\begin{array}{c} [1] \\ [4] \\ [2] \end{array}\right)
=
\left(\begin{array}{c} [2] \\ [3] \\ [4] \end{array}\right)
\]

holds in \( (\mathbb{Z}_5)^3 \). Finding a counterexample is difficult, and it seems that the claim is true after all. I only know how to prove the claim using determinants when \( \mathbb{F} \) is a subfield of \( \mathbb{C} \).
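As a quick sanity check, the modular arithmetic above can be reproduced in a few lines of Python. (The helper `cross_mod` is hypothetical, not part of the original post.)

```python
def cross_mod(a, b, p):
    """Cross product of two 3-vectors with all arithmetic done mod p."""
    return [(a[1]*b[2] - a[2]*b[1]) % p,
            (a[2]*b[0] - a[0]*b[2]) % p,
            (a[0]*b[1] - a[1]*b[0]) % p]

a, b, p = [1, 4, 2], [2, 3, 4], 5

# The cross product vanishes in (Z_5)^3 ...
print(cross_mod(a, b, p))        # [0, 0, 0]
# ... but only because b = [2]a there, i.e. the vectors are dependent.
print([(2 * x) % p for x in a])  # [2, 3, 4]
```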
 
Assume that ##a## and ##b## are linearly independent in ##\mathbb{F}^3##; then you can extend them to a basis ##(c,a,b)##. By standard linear algebra, this means the matrix whose rows are ##c##, ##a##, and ##b## is invertible and thus has nonzero determinant. Expanding that determinant along the row of ##c## shows it equals the scalar triple product ##c\cdot (a\times b)##. Since this is nonzero, ##a\times b## must be nonzero.
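The determinant identity this proof relies on is easy to check by brute force. The sketch below (helper names `cross` and `det3` are assumptions, not from the thread) verifies ##\det(c,a,b) = c\cdot(a\times b)## for random triples over ##\mathbb{Z}_5##:

```python
import random

def cross(a, b):
    """Cross product of two integer 3-vectors."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def det3(r0, r1, r2):
    """Determinant of the 3x3 matrix with rows r0, r1, r2,
    via cofactor expansion along the first row."""
    return (r0[0]*(r1[1]*r2[2] - r1[2]*r2[1])
          - r0[1]*(r1[0]*r2[2] - r1[2]*r2[0])
          + r0[2]*(r1[0]*r2[1] - r1[1]*r2[0]))

p = 5
for _ in range(1000):
    a, b, c = ([random.randrange(p) for _ in range(3)] for _ in range(3))
    lhs = det3(c, a, b) % p
    rhs = sum(ci * xi for ci, xi in zip(c, cross(a, b))) % p
    assert lhs == rhs, (a, b, c)  # det(c,a,b) == c . (a x b) mod p
```

Since `cross` and `det3` compute over the integers and the comparison is taken mod ##p##, the same check works for any prime ##p##.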
 
