Exploring the Connection Between Dual Vectors and Covectors in Vector Spaces

In summary: A linear functional on a vector space V is a linear map from V to its field of scalars, and the set of all such functionals is itself a vector space, the dual space V*. "Covector", "dual vector" and "one-form" are just other names for an element of V*. Given a basis of V, a functional is completely determined by its values on the basis vectors, and those component values transform under a change of basis with the same matrix as the basis vectors themselves (while vector components transform with the inverse matrix). That transformation behaviour is exactly what the tensor-calculus definition of a covector requires, which is why the two notions coincide.
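As a minimal numerical sketch of that action (an illustration added here, not taken from the thread; assumes NumPy): if a vector is stored as its components ##v^k## and a covector as its components ##a_k##, the functional's value is just the contraction ##a_k v^k##.

import numpy as np

a = np.array([2.0, -1.0, 3.0])   # components a_k of a linear functional (covector)
v = np.array([1.0, 4.0, 0.5])    # components v^k of a vector

# The action of the functional on the vector is the single scalar a_k v^k:
print(a @ v)   # 2*1 + (-1)*4 + 3*0.5 = -0.5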
  • #1
Kaguro
TL;DR Summary
I want to know how linear functionals are the same as covectors.
I understand that a vector space is a set of objects closed under addition and scalar multiplication and satisfies several properties.

A functional is a map that takes a vector and produces a scalar. A functional is also called a dual vector.

A covector is an object whose components transform via the same matrix that the basis vectors use under a change of coordinates (while contravariant vectors use its inverse).

How are these two related?
 
  • #2
Kaguro said:
Summary:: I want to know how linear functionals are the same as covectors.

I understand that a vector space is a set of objects closed under addition and scalar multiplication and satisfies several properties.

A functional is a map that takes a vector and produces a scalar. A functional is also called a dual vector.

A covector is an object whose components transform via the same matrix that the basis vectors use under a change of coordinates (while contravariant vectors use its inverse).

How are these two related?
Is this not covered in your course or textbook? You need to find a source that presents this topic fully.
 
  • #3
I studied Linear Algebra last year from the book by S. Andrili.
This semester I have tensor algebra and calculus, with some introduction to vector spaces. We're studying this from a wide variety of books, notes and YT lectures. The only book I read formally is Arfken, Weber, Harris.

So I don't know of a good resource that describes this.
 
  • #4
Kaguro said:
I studied Linear Algebra last year from the book by S. Andrili.
This semester I have tensor algebra and calculus, with some introduction to vector spaces. We're studying this from a wide variety of books, notes and YT lectures. The only book I read formally is Arfken, Weber, Harris.

So I don't know of a good resource that describes this.
Is this for physics or mathematics?
 
  • #5
I am a student of Physics, and I have taken tensors as an elective. That course mainly deals with Cartesian tensors, and a very minimal amount of general tensors.
 
  • #6
PHYSICS-DSE: ADVANCED MATHEMATICAL PHYSICS-I
(Credits: Theory-04, Practicals-02)
Theory: 60 Lectures

The emphasis of the course is on applications in solving problems of interest to
physicists. Students are to be examined on the basis of problems, seen and unseen.

Linear Vector Spaces
Abstract Systems. Binary Operations and Relations. Introduction to Groups and Fields.
Vector Spaces and Subspaces. Linear Independence and Dependence of Vectors. Basis
and Dimensions of a Vector Space. Change of basis. Homomorphism and Isomorphism
of Vector Spaces. Linear Transformations. Algebra of Linear Transformations. Non-
singular Transformations. Representation of Linear Transformations by Matrices.
(12 Lectures)

Matrices
Addition and Multiplication of Matrices. Null Matrices. Diagonal, Scalar and Unit
Matrices. Upper-Triangular and Lower-Triangular Matrices. Transpose of a Matrix.
Symmetric and Skew-Symmetric Matrices. Conjugate of a Matrix. Hermitian and Skew-
Hermitian Matrices. Singular and Non-Singular matrices. Orthogonal and Unitary
Matrices. Trace of a Matrix. Inner Product.
(8 Lectures)
Eigenvalues and Eigenvectors. Cayley-Hamilton Theorem. Diagonalization of
Matrices. Solutions of Coupled Linear Ordinary Differential Equations. Functions of a
Matrix.
(10 Lectures)

Cartesian Tensors
Transformation of Co-ordinates. Einstein’s Summation Convention. Relation between
Direction Cosines. Tensors. Algebra of Tensors. Sum, Difference and Product of Two
Tensors. Contraction. Quotient Law of Tensors. Symmetric and Anti-symmetric
Tensors. Invariant Tensors: Kronecker and Alternating Tensors. Association of
Antisymmetric Tensor of Order Two and Vectors. Vector Algebra and Calculus using
Cartesian Tensors: Scalar and Vector Products, Scalar and Vector Triple Products.
Differentiation. Gradient, Divergence and Curl of Tensor Fields. Vector Identities.
(20 lectures)

General Tensors
Transformation of Co-ordinates. Minkowski Space. Contravariant & Covariant Vectors.
Contravariant, Covariant and Mixed Tensors. Kronecker Delta and Permutation Tensors.
Algebra of Tensors. Sum, Difference & Product of Two Tensors. Contraction. Quotient
Law of Tensors. Symmetric and Anti-symmetric Tensors. Metric Tensor. (10 Lectures)
This is my syllabus.
 
  • #7
Kaguro said:
Summary:: I want to know how linear functionals are the same as covectors.

I understand that a vector space is a set of objects closed under addition and scalar multiplication and satisfies several properties.

A functional is a map that takes a vector and produces a scalar. A functional is also called a dual vector.

A covector is an object whose components transform via the same matrix that the basis vectors use under a change of coordinates (while contravariant vectors use its inverse).

How are these two related?

The set of linear functionals (also called covectors, dual vectors or one-forms) is a vector space that can fairly simply be shown, directly from the definition, to have transformation rules that are related to the vector transformation rules.

Have you not seen that? It's a few lines of algebra to prove - although conceptually it may take some time and practice to digest.
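For reference, those few lines might go roughly like this (a paraphrase in the notation used later in the thread, not a quote): write the functional's value as ##A(v) = a_k v^k## and demand that this scalar be the same in every coordinate system. Since the vector components transform as $$v'^k = \frac{\partial x'^k}{\partial x^m} v^m,$$ requiring ##a'_k v'^k = a_m v^m## for every ##v## forces $$a'_k = \frac{\partial x^m}{\partial x'^k}\, a_m,$$ which is exactly the covariant transformation law quoted further down.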
 
  • #8
Yes, yes, the set of all linear functionals is itself a vector space. True. But why is a functional (a row vector) the same as a covector?
 
  • #9
Kaguro said:
Yes, yes, the set of all linear functionals is itself a vector space. True. But why is a functional (a row vector) the same as a covector?
By definition. They are just different names for the same thing.

One thing you need to prove, of course, is that not only do the linear functionals have the required transformation properties, but anything that has these transformation properties can be associated with a linear functional.
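A rough sketch of that converse direction (my own outline; it is not spelled out in the thread): suppose you are handed components ##a_k## in every coordinate system, transforming as ##a'_k = \frac{\partial x^m}{\partial x'^k} a_m##. Define the candidate functional ##A(v) = a_k v^k##. Then $$a'_k v'^k = \frac{\partial x^m}{\partial x'^k} a_m \,\frac{\partial x'^k}{\partial x^n} v^n = \delta^m_n\, a_m v^n = a_n v^n,$$ so ##A(v)## is the same number in every coordinate system and is linear in ##v##; the transforming components therefore define a genuine linear functional.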
 
  • #11
Okay... So everyone just says a covector is a linear functional... Great, we have two different words to describe the exact same thing. :mad:

So, tying this to what I understand a covector to be,
prove that:
$$v'_j=\frac{\partial x^i}{\partial x'^j} v_i$$

That is, prove that a functional satisfies this transformation law.
 
  • #12
PeroK said:
By definition. They are just different names for the same thing.

One thing you need to prove, of course, is that not only do the linear functionals have the required transformation properties, but anything that has these transformation properties can be associated with a linear functional.
Yes, just this! How to prove this?
 
  • #13
Kaguro said:
Okay... So everyone just says a covector is a linear functional... Great, we have two different words to describe the exact same thing. :mad:

So, tying this to what I understand a covector to be,
prove that:
$$v'_j=\frac{\partial x^i}{\partial x'^j} v_i$$

That is, prove that a functional satisfies this transformation law.
There are four words: linear functional, covector, dual vector and one-form.

There are a number of key elements here:

1) A linear functional is completely defined by its action on a basis. If we take any basis ##e_1, \dots e_n##, and define $$A(e_k) = a_k$$ then, by linearity $$A(v) = A(v^ke_k) = v^ka_k$$.

Note: we can already see that the action of ##A## "looks like" the action of an inner product of some vector with components ##v^k## and some "covector" with components ##a_k##.

2) Then we define the linear functional ##f^m## so that it maps the basis vector ##e_m## to ##1## and all other basis vectors to ##0##. I.e. $$f^m(e_k) = \delta^m_k$$ Then we see that the set ##\{f^m\}## forms a basis for the space of all linear functionals. Any linear functional ##A## can be written: $$A = a_kf^k$$

3) We use this to show that, under a change of coordinates, we have the required transformation properties for covector components and basis covectors. In general you have $$A(v) = a_kf^k(v^me_m) = a'_kf'^k(v'^me'_m)$$ Then you could take the action of a basis covector on a basis vector: $$\delta^k_m = f'^k(e'_m) = f'^k(\frac{\partial x^r}{\partial x'^m} e_r) = \frac{\partial x^r}{\partial x'^m} f'^k(e_r)$$ And this is satisfied by $$f'^k(e_r) = \frac{\partial x'^k}{\partial x^r}$$ hence $$f'^k = \frac{\partial x'^k}{\partial x^s}f^s$$ Which you can check out (see the small numerical sketch just after these steps).
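A small numerical sketch of step 3 (my own illustration with a made-up matrix, assuming NumPy): for a linear change of coordinates ##x' = Lx##, the Jacobians ##\partial x'^k/\partial x^s## and ##\partial x^r/\partial x'^m## are just ##L## and ##L^{-1}##, so one can check directly that ##f'^k(e'_m) = \delta^k_m## and that ##a_k v^k## is unchanged.

import numpy as np

# Hypothetical linear change of coordinates x' = L x, so dx'/dx = L and dx/dx' = L^{-1}.
L = np.array([[2.0, 1.0],
              [1.0, 1.0]])
Linv = np.linalg.inv(L)

v = np.array([3.0, -2.0])   # vector components v^m in the old coordinates
a = np.array([1.0, 4.0])    # covector components a_m in the old coordinates

v_new = L @ v               # v'^k = (dx'^k/dx^m) v^m   (contravariant)
a_new = Linv.T @ a          # a'_k = (dx^m/dx'^k) a_m   (covariant)

E_new = Linv                # columns are the new basis vectors e'_m
F_new = L                   # rows are the new basis covectors f'^k

print(np.allclose(F_new @ E_new, np.eye(2)))   # f'^k(e'_m) = delta^k_m  -> True
print(np.isclose(a_new @ v_new, a @ v))        # a'_k v'^k = a_k v^k     -> True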
Does that get you started?

You may want to post these things as homework problems if you can't get the results to come out.
 
  • #14
PeroK said:
hence $$f'^k = \frac{\partial x'^k}{\partial x^s}f^s$$ Which you can check out.
This looks like a contravariant transformation law...

So... The components of the functionals in this basis have to be covariant? Just like in ordinary space, where the basis vectors are covariant and the vector components in that basis are contravariant! Here, it's the exact opposite! The dual basis is contravariant, and so the functional components in that basis are covariant! Since that's the only way the functional itself can be invariant under a change of basis.
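Put in one line (a restatement of the above, not a quote): with ##a'_k = \frac{\partial x^m}{\partial x'^k} a_m## and ##f'^k = \frac{\partial x'^k}{\partial x^s} f^s##, $$A = a'_k f'^k = \frac{\partial x^m}{\partial x'^k}\,\frac{\partial x'^k}{\partial x^s}\, a_m f^s = \delta^m_s\, a_m f^s = a_s f^s,$$ so the two transformations cancel and the functional ##A## itself is unchanged by the change of basis.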
 
  • #15
Exactly, it all comes from the linear action of the functionals on vectors. The transformation laws follow as in my post above.
 
  • #16
Thank you very much PeroK! You have always been there for me!
 

1. What is the difference between a dual vector and a covector?

There is no real difference: "dual vector" and "covector" are two names for the same object, an element of the dual space, i.e. a linear functional that maps vectors to scalars. "One-form" is a third common name for the same thing.

2. How are dual vectors and covectors related to each other?

They are the same objects viewed from two directions: algebraically, a covector is a linear functional on the vector space; in tensor language, it is an object whose components transform with the same matrix as the basis vectors under a change of coordinates. The thread above shows that these two characterisations pick out exactly the same set of objects.

3. Can a dual vector be represented as a column vector?

In the dual basis ##f^k##, a dual vector ##A## is described by a list of components ##a_k##, since ##A = a_k f^k##. When vectors are written as column vectors, it is conventional to arrange these components as a row vector, so that the action ##A(v) = a_k v^k## becomes a row-times-column matrix product.

4. What is the significance of dual vectors and covectors in physics?

Dual vectors and covectors play a crucial role in physics, particularly in relativity, where quantities such as the gradient of a scalar field are naturally covectors and the metric tensor converts between vectors and covectors. They also help in formulating physical laws in a coordinate-independent manner.

5. How are dual vectors and covectors used in machine learning?

In machine learning, the weight vector of a linear model acts as a linear functional on feature vectors, so it plays the role of a covector. "Dual" formulations also appear in kernel methods, which are popular for handling non-linearly separable data, although there the word refers to optimisation duality rather than the dual vector space.
