The vector to which a dual vector corresponds

nomadreid
Gold Member
TL;DR
In a non-orthonormal vector space V with inner product and its dual space V*, I presume the two situations are the same: (i) to an f in V* there corresponds a v in V, by Riesz; (ii) to a v in V there corresponds its dual f in V*. There are some details here that confuse me.
I am a bit confused about dual space unit vectors in the case of non-orthonormal vector spaces.
(Reference: A Student's Guide to Vectors and Tensors, by Daniel Fleisch, Cambridge, 2012.)
I will be grateful for being corrected in the following:

Suppose V is a non-orthonormal vector space, and V* its dual space. Suppose ## \vec e_1 ## is in V and ## \vec e^1 ## is its corresponding dual unit vector in V*.

One of the consequences, if I understand it correctly, of the Riesz, or Riesz-Frechet, Representation Theorem is that to ## \vec e^1 ## there corresponds a u in V such that for every w in V, ## \vec e^1(w) = \langle u, w \rangle ##.

Also, starting from ## \vec e_1 ## in V, ## \vec e^1 ## is its dual.

The natural conclusion is that ## u = \vec e_1 ##. Yet the above reference seems to say that they are not equal.

[1] Let w be another unit vector in V, one which is neither equal nor orthogonal to ## \vec e_1 ##. That is, the inner product of ## \vec e_1 ## and w is nonzero. Yet on the other side, the above reference states (p. 114) that ## \vec e^i \cdot \vec e_j = \delta^i_j ##.

[2] In the drawings, the author draws (e.g., pp. 117, 119) both vectors and their duals in the same plane, which leads me to think that he is drawing not the actual duals, but the vectors in V which correspond to the duals. But he draws (the vector in V which, by Riesz, corresponds to) ## \vec e^i ## as being different from ## \vec e_i ##.

Thanks in advance for showing me where my understanding went awry.
 
What is a non-orthogonal vector space?
 
Ah, I guess I mean a non-orthonormal vector space; i.e., a vector space in which not all of the basis vectors are orthogonal to one another.
 
nomadreid said:
Ah, I guess I mean a non-orthonormal vector space; i.e., a vector space in which not all of the basis vectors are orthogonal to one another.
A vector space doesn't come with a basis. Do you mean a vector space with a fixed basis, which need not be orthonormal?
 
Read a decent book: Finite-Dimensional Vector Spaces by Paul R. Halmos.
 
martinbn said:
A vector space doesn't come with a basis. Do you mean a vector space with a fixed basis, which need not be orthonormal?
Yes
 
wrobel said:
Finite-Dimensional Vector Spaces by Paul R. Halmos
OK, downloaded.
 
The author (A Student's Guide to Vectors and Tensors, by Daniel Fleisch, Cambridge, 2012, Sections 4.5 and 4.6) does not mention a dual vector space. He does not speak in the context of a vector space and its dual space; he speaks in the context of the same vector space with two different bases. The first basis consists of the vectors ## \vec e_1 ## and ## \vec e_2 ##, and the second consists of the vectors ## \vec e^1 ## and ## \vec e^2 ##. Sections 4.5 and 4.6 are entirely about the same vector space and its two different bases.

The author calls basis vectors ## \vec e^1 ## and ## \vec e^2 ## “reciprocal” basis vectors because he defines them in a way that ## \vec e_i\circ \vec e^j=\delta_i^j ##, so they are reciprocal to original basis vectors ## \vec e_1 ## and ## \vec e_2 ##. “Dual basis vectors” is another term used for vectors that are reciprocal to original basis vectors.
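To make the reciprocal-basis relation concrete, here is a minimal numerical sketch. The basis vectors are made-up examples, and the standard dot product on R^2 is assumed; the rows of the inverse of the basis matrix give the reciprocal basis:

```python
import numpy as np

# A non-orthonormal basis of R^2 (hypothetical example vectors).
e1 = np.array([1.0, 0.0])
e2 = np.array([1.0, 1.0])              # not orthogonal to e1
E = np.column_stack([e1, e2])          # basis vectors as columns

# Reciprocal (dual) basis: row i of E^{-1} is the vector e^i satisfying
# e^i . e_j = delta^i_j under the standard dot product.
E_recip = np.linalg.inv(E)
delta = E_recip @ E                    # should be the identity matrix

print(np.allclose(delta, np.eye(2)))   # True
```

For this example the reciprocal vector ## \vec e^1 ## comes out as (1, -1), which is visibly different from ## \vec e_1 = (1, 0) ## — exactly the situation in the book's drawings.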

nomadreid said:
One of the consequences, if I understand it correctly, of the Riesz, or Riesz-Frechet, Representation Theorem is that to ## \vec e^1 ## there corresponds a u in V such that for every w in V, ## \vec e^1(w) = \langle u, w \rangle ##.

Also, starting from ## \vec e_1 ## in V, ## \vec e^1 ## is its dual.

The natural conclusion is that ## u = \vec e_1 ##. Yet the above reference seems to say that they are not equal.
If ## f^1 ## in ## V^\ast ## is the functional corresponding to the vector ## \vec e^1 ## in ## V ##, then ## f^1 ## maps a vector ## \vec a ## in ## V ## to ## \langle \vec e^1, \vec a \rangle ##.
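A small numerical check of this correspondence (same made-up basis as in the sketch above; the numbers are purely illustrative): the dual-basis functional ## f^1 ##, which by definition picks out the first component of a vector in the basis ## \{\vec e_1, \vec e_2\} ##, is realized under the dot product by the reciprocal vector ## \vec e^1 ##, not by ## \vec e_1 ##:

```python
import numpy as np

e1 = np.array([1.0, 0.0])
e2 = np.array([1.0, 1.0])          # non-orthonormal basis of R^2
E = np.column_stack([e1, e2])

e1_recip = np.linalg.inv(E)[0]     # reciprocal vector e^1

w = np.array([2.0, 3.0])           # arbitrary test vector
coeffs = np.linalg.solve(E, w)     # components of w in the basis {e1, e2}

# f^1(w) is by definition the first component of w;
# Riesz represents f^1 as the dot product against e^1.
print(np.isclose(coeffs[0], e1_recip @ w))   # True
```

So the vector that Riesz attaches to the functional ## f^1 ## is ## \vec e^1 ## itself, which resolves the apparent contradiction in the opening post.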
 
The book is pointless. The author yanked random bits from differential geometry, linear algebra, calculus, and physics. It’s the best way to make sure the reader ends up understanding absolutely nothing.
 
  • #10
Thanks, Gavran. Your post explains the issue very well and concisely, and clears up a whole lot of my confusion. Super!

As well, wrobel, I have been reading the book by Halmos that you recommended, and this also clears up a lot of my previous confusion. So many thanks for the recommendation.
 
  • #11
Given any (finite dimensional) vector space V, it has a dual space V* consisting of linear functions f:V-->k, where k is the field of scalars. (Take k = real numbers, so we can take square roots to define length.) The most natural feature of V associated to such an f, is the "kernel" of f, i.e. the subspace of vectors in V where f has value zero. If f is itself not identically zero, this subspace ker(f), is a "hyperplane" in V, i.e. a subspace of "codimension one", i.e. a subspace of dimension one less than V itself.

Thus to each non zero element f in V*, we have a naturally associated hyperplane in V, not a naturally associated vector. However, if V has an inner product, then there is a notion of perpendicularity, and length, for vectors in V. Every hyperplane has a unique perpendicular line, and the values of the function f are entirely determined by the value of f at any one non zero vector on that line. E.g. if we choose a vector u of length one on that line, i.e. with <u,u> = 1, then the function f is determined by the value f(u) = c.

To associate f to a single vector, we want to find a vector w on that line whose dot product with u also equals c. Then the value of f at any vector x in V will equal the dot product of w with x. This is easy, just take w = cu, and then <w,u> = <cu,u> = c<u,u> = c. Thus for any vector on that line, say tu, we have <w,tu> = <cu,tu> = c<u,tu> = tc<u,u> = tc = tf(u) = f(tu), as desired.

I.e. if u is any vector perpendicular to the hyperplane ker(f), and of length one, we associate f to the vector w = f(u)u. Then for every vector x in V, f(x) = <w,x>.

To prove that in detail, we can argue as follows. Note that every vector x in V can be expressed as a sum x = tu + y, where y lies in the hyperplane ker(f) and tu is perpendicular to that hyperplane (as is w). Then f(x) = f(y+tu) = f(y) + tf(u) = 0 + tf(u) = tf(u)<u,u> = <f(u)u,tu> = <w,tu>. Since w is perpendicular to the hyperplane containing y, we have <w,y> = 0, and so <w,tu> = <w,y> + <w,tu> = <w,y+tu> = <w,x>.
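The construction above can be sketched numerically. The functional and test vector here are arbitrary choices, not taken from any reference:

```python
import numpy as np

# Given a nonzero functional f on R^3, take the unit normal u to ker(f),
# set w = f(u) u, and check that f(x) = <w, x> for an arbitrary x.
a = np.array([2.0, -1.0, 3.0])     # hypothetical: f(x) = a . x
f = lambda x: a @ x

u = a / np.linalg.norm(a)          # unit vector perpendicular to ker(f)
w = f(u) * u                       # the vector associated to f

x = np.array([1.0, 4.0, -2.0])     # arbitrary test vector
print(np.isclose(f(x), w @ x))     # True
```

Here w recovers the defining vector a exactly, since f(u) = |a| and u = a/|a|, so w = a; this is the uniqueness in the Riesz representation.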
 
  • #12
Almost always, the field of scalars is the field of real or complex numbers.
 
  • #13
Yes, thank you. I probably should have said k = real numbers in my discussion above.
 
