Understanding Extension of Scalars in a Vector Space

In summary, the low-tech complexification procedure turns a real vector space $V$ into a complex vector space $W=V\times V$ by defining addition component-wise and scalar multiplication by $(a+ib)(u, v)=(au-bv, av+bu)$. This gives useful information about a linear transformation $T:V\to V$: the induced map $\bar T$ always has an eigenvalue $\lambda+i\mu$, which yields vectors $u, v\in V$, not both zero, satisfying $Tu=\lambda u-\mu v$ and $Tv=\lambda v+\mu u$.
  • #1
caffeinemachine
$\newcommand{\R}{\mathbf R}\newcommand{\C}{\mathbf C}$

Low-Tech Complexification:
Let $V$ be a finite dimensional vector space over $\R$. We can forcefully make $W:=V\times V$ into a complex vector space by defining addition component-wise and product $\C\times W\to W$ as
$$
(a+ib)(u, v)=(au-bv, av+bu)
$$
for all $a, b\in \R$ and $u, v\in V$.
The vector space axioms can be readily checked.
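For example, compatibility of scalar multiplication, $\bigl((a+ib)(c+id)\bigr)(u, v)=(a+ib)\bigl((c+id)(u, v)\bigr)$, is a short computation:
$$
(a+ib)\bigl((c+id)(u, v)\bigr)=(a+ib)(cu-dv,\ cv+du)=\bigl((ac-bd)u-(ad+bc)v,\ (ac-bd)v+(ad+bc)u\bigr),
$$
which is exactly $\bigl((ac-bd)+i(ad+bc)\bigr)(u, v)=\bigl((a+ib)(c+id)\bigr)(u, v)$.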

Also, a linear transformation $T:V\to V$ induces a linear map $\bar T:W\to W$ as $\bar T(u, v)= (Tu,Tv)$ for all $u, v\in V$.
From here, interesting information about $T$ can be obtained. For example, $\bar T$ has an eigenvalue, since $\C$ is algebraically closed and $W$ is finite dimensional. So there are $\lambda, \mu\in\R$ and $(0, 0)\neq (u, v)\in W$ such that
$$
(Tu, Tv)=\bar T(u, v)=(\lambda +i\mu)(u, v)=(\lambda u-\mu v,\ \lambda v+\mu u)
$$
which gives $Tu=\lambda u-\mu v$ and $Tv=\lambda v+\mu u$.
So there are vectors $u$ and $v$ in $V$, not both zero, such that the above equations hold. This wasn't obvious (to me) without making the above construction.
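For a concrete illustration (with a specific choice of $T$), take $V=\R^2$ and $T$ the rotation by $90^\circ$, so $Te_1=e_2$ and $Te_2=-e_1$. Then $T$ has no real eigenvectors, but $\bar T$ has the eigenvalue $i$ (that is, $\lambda=0$, $\mu=1$), with eigenvector $(u, v)=(e_1, -e_2)$:
$$
\bar T(e_1, -e_2)=(Te_1, -Te_2)=(e_2, e_1)=i\,(e_1, -e_2),
$$
and indeed $Tu=\lambda u-\mu v=-v$ and $Tv=\lambda v+\mu u=u$.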
___
I want to see how the above relates with the standard notion of extension of scalars.

The Wikipedia article on complexification (Complexification - Wikipedia, the free encyclopedia) defines the complexification as the tensor product $V\otimes_{\R} \C$, made into a complex vector space by setting $\alpha(v\otimes \beta)=v\otimes(\alpha\beta)$ for $\alpha, \beta\in \C$ and $v\in V$.

I can feebly see how the "low-tech" process described earlier is the same as the tensorial construction. The vector $(u, v)\in W$ corresponds to $u\otimes 1+v\otimes i$, and for $T:V\to V$ the induced map acts as $\bar T(v\otimes \alpha)=(Tv)\otimes \alpha$.

Admittedly, this transition is not obvious to me.

Can somebody please throw some light on how the two approaches are really the same? (Assuming they actually are the same.)

Thanks.
 
  • #2
caffeinemachine said:
I can feebly see how the "low-tech" process described earlier is the same as the tensorial construction. The vector $(u, v)\in W$ corresponds to $u\otimes 1+v\otimes i$, and for $T:V\to V$ the induced map acts as $\bar T(v\otimes \alpha)=(Tv)\otimes \alpha$.
I don't see anything "feeble" about your mapping $(u,v) \mapsto u\otimes 1+v\otimes i$. In fact, that seems to give a very simple and precise correspondence between the low-tech and high-tech (tensorial) constructions of the complexification procedure. If you like, you can think of both constructions as being ways to make sense of the expression $u+vi$ for an element of the complexified space, just as a complex number can be considered in either of the forms $(a,b)$ or $a+bi$ (where $a,b\in\mathbb{R}$).
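Concretely, your map $\Phi:(u, v)\mapsto u\otimes 1+v\otimes i$ is $\C$-linear:
$$
\Phi\bigl((a+ib)(u, v)\bigr)=(au-bv)\otimes 1+(av+bu)\otimes i=u\otimes(a+ib)+v\otimes(ia-b)=(a+ib)\,\Phi(u, v),
$$
and it intertwines the induced maps, since $\Phi\bigl(\bar T(u, v)\bigr)=Tu\otimes 1+Tv\otimes i=(T\otimes 1)\,\Phi(u, v)$.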
 
  • #3
Opalg said:
I don't see anything "feeble" about your mapping $(u,v) \mapsto u\otimes 1+v\otimes i$. In fact, that seems to give a very simple and precise correspondence between the low-tech and high-tech (tensorial) constructions of the complexification procedure. If you like, you can think of both constructions as being ways to make sense of the expression $u+vi$ for an element of the complexified space, just as a complex number can be considered in either of the forms $(a,b)$ or $a+bi$ (where $a,b\in\mathbb{R}$).

Hello opalg,

I did a poor job explaining my question.

The isomorphism between the low-tech and the high-tech constructions is clear. What I wanted to see was whether the idea of tensoring with $\mathbf C$ is somehow a natural one. To me it seems very unnatural as of now.
 
  • #4
caffeinemachine said:
What I wanted to see was whether the idea of tensoring with $\mathbf C$ is somehow a natural one. To me it seems very unnatural as of now.
The big advantage of having complex scalars is that $\mathbb{C}$ is algebraically closed. For example, in the finite-dimensional case, if you have a real $n\times n$ matrix $A$, you might think that it would "naturally" act on the real space $\mathbb{R}^n$. But then it need not have any eigenvectors. If you complexify $\mathbb{R}^n$ so that it becomes $\mathbb{C}^n$ then $A$ acts on the complexified space, where it is guaranteed to have eigenvectors.
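For example, the rotation matrix
$$
A=\begin{pmatrix}0&-1\\ 1&0\end{pmatrix}
$$
has characteristic polynomial $t^2+1$ and hence no eigenvectors in $\mathbb{R}^2$, but in $\mathbb{C}^2$ it has the eigenvectors $(1,\mp i)^{\mathsf T}$ with eigenvalues $\pm i$, e.g. $A(1,-i)^{\mathsf T}=(i,1)^{\mathsf T}=i\,(1,-i)^{\mathsf T}$.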

At a more advanced level, a Google search for "complexification" will bring up explanations of how the theory of real Lie groups and algebras depends on their complexifications.
 
  • #5
Opalg said:
The big advantage of having complex scalars is that $\mathbb{C}$ is algebraically closed. For example, in the finite-dimensional case, if you have a real $n\times n$ matrix $A$, you might think that it would "naturally" act on the real space $\mathbb{R}^n$. But then it need not have any eigenvectors. If you complexify $\mathbb{R}^n$ so that it becomes $\mathbb{C}^n$ then $A$ acts on the complexified space, where it is guaranteed to have eigenvectors.

At a more advanced level, a Google search for "complexification" will bring up explanations of how the theory of real Lie groups and algebras depends on their complexifications.
Thanks Opalg.

I guess I just need more time.

One unrelated thing: should the word "thereon" in your signature really be "thereof"?
 
  • #6
caffeinemachine said:
One unrelated thing: should the word "thereon" in your signature really be "thereof"?
The German original has wovon [whereof] ... darüber [thereon]. If it had been wovon ... davon (using the same preposition von, meaning of), then I would agree that it should be translated whereof ... thereof. But über means on (or upon), so I thought it would be better to render it as thereon (or thereupon if you prefer).

Good question though – it makes a nice change from mathematics!
 
  • #7
Opalg said:
The German original has wovon [whereof] ... darüber [thereon]. If it had been wovon ... davon (using the same preposition von, meaning of), then I would agree that it should be translated whereof ... thereof. But über means on (or upon), so I thought it would be better to render it as thereon (or thereupon if you prefer).

Good question though – it makes a nice change from mathematics!
I don't know any German. Haha. But Danke for this!
 

1. What is a vector space?

A vector space is a mathematical structure consisting of a set of vectors together with two operations, vector addition and scalar multiplication. These operations satisfy specific axioms, such as closure, associativity, and distributivity, and they allow vectors within the space to be combined and manipulated.
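For example, $\mathbb{R}^n$ with component-wise operations is a vector space over the real numbers:
$$
(x_1,\dots,x_n)+(y_1,\dots,y_n)=(x_1+y_1,\dots,x_n+y_n),\qquad c\,(x_1,\dots,x_n)=(cx_1,\dots,cx_n).
$$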

2. What is a scalar in a vector space?

A scalar in a vector space is an element of the underlying field, such as a real or complex number, that can be multiplied by a vector to produce a new vector. Scalars are typically used to scale or stretch a vector, and a real scalar can be positive, negative, or zero.

3. How do scalars affect vectors in a vector space?

Scalars affect vectors in a vector space by changing their magnitude and/or direction. When a nonzero real scalar multiplies a vector, the resulting vector is parallel to the original but its length is scaled by the absolute value of the scalar; if the scalar is negative, the resulting vector also points in the opposite direction, and multiplying by zero gives the zero vector.
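For example, in $\mathbb{R}^2$,
$$
2\,(3,4)=(6,8),\qquad -1\,(3,4)=(-3,-4),\qquad 0\,(3,4)=(0,0),
$$
so the length-$5$ vector $(3,4)$ is stretched to length $10$, reversed, or collapsed to the zero vector.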

4. What is the extension of scalars in a vector space?

The extension of scalars in a vector space refers to enlarging the field of scalars, for example from the real numbers to the complex numbers, typically via a tensor product such as $V\otimes_{\mathbb{R}}\mathbb{C}$. This allows more general and abstract vector spaces, such as complex vector spaces, to be studied and utilized in various fields of mathematics and science.
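For example, complexifying $\mathbb{R}^n$ gives
$$
\mathbb{R}^n\otimes_{\mathbb{R}}\mathbb{C}\;\cong\;\mathbb{C}^n,\qquad x\otimes\alpha\mapsto \alpha x,
$$
so a real basis of $\mathbb{R}^n$ becomes a complex basis of $\mathbb{C}^n$, and in general $\dim_{\mathbb{C}}(V\otimes_{\mathbb{R}}\mathbb{C})=\dim_{\mathbb{R}}V$.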

5. Why is understanding extension of scalars important in science?

Understanding extension of scalars is important in science because it allows more advanced and abstract mathematical tools to be used to analyze and model physical phenomena. For example, complex vector spaces are used in quantum mechanics to describe the behavior of particles, and quaternions are used in computer graphics to represent rotations in 3D space. By extending the concept of scalars, scientists can better understand and describe the world around us.
