Confusion about basis vectors and the metric tensor

SUMMARY

The discussion clarifies what is actually required of basis vectors, specifically addressing the confusion between orthogonality and linear independence. It references "A Student's Guide to Vectors and Tensors" by Daniel Fleisch, emphasizing that while orthogonality (e_i · e_j = 0 for i ≠ j) is convenient, it is not a requirement for a basis. The dual basis is introduced as the tool that recovers this convenience with an arbitrary basis, via the relation e_i · e^j = δ_i^j, where δ_i^j is the Kronecker delta. The conversation concludes with a practical example demonstrating that linearly independent vectors can form a basis without being orthogonal.

PREREQUISITES
  • Understanding of linear independence in vector spaces
  • Familiarity with inner products and their properties
  • Knowledge of dual bases and their significance
  • Basic comprehension of the Kronecker delta symbol
NEXT STEPS
  • Study the properties of linear independence in vector spaces
  • Learn about the implications of orthogonality in basis vectors
  • Explore the concept of dual bases in more depth
  • Investigate the applications of the Kronecker delta in tensor analysis
USEFUL FOR

Students and professionals in mathematics, physics, and engineering who are studying vector spaces, tensor analysis, or linear algebra concepts, particularly those interested in the properties of basis vectors and dual bases.

nomadreid
In "A Student's Guide to Vectors and Tensors" by Daniel Fleisch, I read that the covariant metric tensor gij=ei°ei (I'm leaving out the → s above the e's) where ei and ei are coordinate basis vectors and ° denotes the inner product, and similarly for the contravariant metric tensor using dual base vectors. But I thought the definition of base vectors included that the inner product of two distinct ones of the same type was zero, and similarly for dual base vectors. (For example, the basis vectors of Rn, with the inner product = the dot product.) Where is my thinking wrong?
 
All you need for a basis is linear independence. If you also have orthogonality -- i.e., if e_i \cdot e_j = 0 when i \ne j -- that can be convenient, but it's not necessary.

The dual basis is the tool we use to get this same convenience (zero dot products) with an arbitrary basis. The main relation here is:
e_i \cdot e^j = \delta_i^j
where \delta_i^j is the Kronecker delta symbol (1 if i=j; 0 otherwise).
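A quick numerical sketch of that relation, reusing the illustrative basis from above and taking the dual vectors as the rows of the inverse basis matrix (a standard construction, though the specific numbers are only an example):

```python
import numpy as np

# Columns of E are the illustrative basis vectors e_1 = (2, 0), e_2 = (1, 1).
E = np.array([[2.0, 1.0],
              [0.0, 1.0]])

# The dual basis vectors e^j are the rows of E^{-1}, so that e_i . e^j = delta_i^j.
E_dual = np.linalg.inv(E)

print(E.T @ E_dual.T)        # entry (i, j) is e_i . e^j -> the identity (Kronecker delta)

# Contravariant metric g^ij = e^i . e^j, which is the matrix inverse of g_ij = e_i . e_j.
g = E.T @ E
print(E_dual @ E_dual.T)     # g^ij
print(np.linalg.inv(g))      # same matrix
```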
 
Ah, thanks, it did not occur to me that, although orthogonality implies linear independence, the converse is not true. This spurred me to try some examples, and I see that one can take the linearly independent vectors {(1,1), (1,0)} as a basis spanning R^2 even though they are not orthogonal, their dot product being 1. OK, case closed. Thanks again.
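For what it's worth, a two-line check of that example (using only the vectors already given in the post):

```python
import numpy as np

v1 = np.array([1.0, 1.0])
v2 = np.array([1.0, 0.0])

print(np.dot(v1, v2))                                     # 1.0 -> not orthogonal
print(np.linalg.matrix_rank(np.column_stack([v1, v2])))   # 2   -> linearly independent, so they span R^2
```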
 
