
caffeinemachine


**Let $V$ be a finite dimensional vector space over $\R$. We can forcefully make $W:=V\times V$ into a complex vector space by defining addition component-wise and product $\C\times W\to W$ as**

Low-Tech Complexification:

$$
(a+ib)(u, v)=(au-bv,\ av+bu)
$$

for all $a, b\in \R$ and $u, v\in V$.

The vector space axioms can be readily checked.
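As a quick sanity check (my own numerical sketch, not part of the original post, with $V=\R^3$ modeled by NumPy arrays), one can verify a couple of the axioms, e.g. associativity of the scalar product and the fact that multiplying by $i$ twice is multiplication by $-1$:

```python
import numpy as np

def cmul(z, w):
    """Scalar product C x W -> W: (a+ib)(u, v) = (au - bv, av + bu)."""
    a, b = z.real, z.imag
    u, v = w
    return (a * u - b * v, a * v + b * u)

rng = np.random.default_rng(0)
u, v = rng.standard_normal(3), rng.standard_normal(3)
w = (u, v)

# Associativity of scalar multiplication: z1 * (z2 * w) == (z1 * z2) * w
z1, z2 = 2 + 3j, -1 + 0.5j
lhs = cmul(z1, cmul(z2, w))
rhs = cmul(z1 * z2, w)
assert np.allclose(lhs[0], rhs[0]) and np.allclose(lhs[1], rhs[1])

# Multiplication by i sends (u, v) to (-v, u), so i*(i*w) = -w
iw = cmul(1j, w)
assert np.allclose(iw[0], -v) and np.allclose(iw[1], u)
```

Note that $i\cdot(u,v)=(-v,u)$ is exactly the "rotation by 90 degrees" one expects from multiplication by $i$.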

Also, a linear transformation $T:V\to V$ induces a linear map $\bar T:W\to W$ as $\bar T(u, v)= (Tu,Tv)$ for all $u, v\in V$.

From here, interesting information about $T$ can be obtained. For example, $\bar T$ has an eigenvalue, since every operator on a nonzero finite dimensional complex vector space does. So there is $(0, 0)\neq (u, v)\in W$ such that

$$
(Tu, Tv)=\bar T(u, v)=(\lambda +i\mu)(u, v)=(\lambda u-\mu v,\ \lambda v+\mu u)
$$

which gives $Tu=\lambda u-\mu v$ and $Tv=\lambda v+\mu u$.

So there are vectors $u$ and $v$ in $V$, not both zero, such that the above equations hold. This wasn't obvious (to me) without making the above construction.
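To see this concretely (again my own sketch, not from the post): take $T$ a rotation of $\R^2$, which has no real eigenvalues at all, and recover $u$, $v$, $\lambda$, $\mu$ from a complex eigenpair of $T$:

```python
import numpy as np

# T = rotation by theta: no real eigenvalues, yet the complexification
# still produces u, v with Tu = lam*u - mu*v and Tv = lam*v + mu*u.
theta = 0.7
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# A complex eigenpair of T; write the eigenvector as w = u + i v
# and the eigenvalue as z = lam + i mu.
eigvals, eigvecs = np.linalg.eig(T)
z, w = eigvals[0], eigvecs[:, 0]
lam, mu = z.real, z.imag
u, v = w.real, w.imag

# Real and imaginary parts of T(u + iv) = (lam + i mu)(u + iv):
assert np.allclose(T @ u, lam * u - mu * v)
assert np.allclose(T @ v, lam * v + mu * u)
```

Here $(u,v)\in W$ plays the role of the eigenvector of $\bar T$, even though $T$ itself has no eigenvectors in $V$.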

___

**I want to see** how the above relates to the standard notion of *extension of scalars*.

The Wikipedia article on complexification defines the tensor product $V\otimes_{\R} \C$ and gives it a complex vector space structure by setting $\alpha(v\otimes \beta)=v\otimes(\alpha\beta)$ for $\alpha, \beta\in \C$ and $v\in V$.

I can feebly see how the "low-tech" process described earlier is the same as the tensorial construction. The vector $(u, v)\in W$ corresponds to $u\otimes 1+v\otimes i$, and for $T:V\to V$ the induced map is $\bar T(v\otimes \alpha)=(Tv)\otimes\alpha$.
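The key computation, spelled out (my own completion of the correspondence just sketched): the map $\varphi:W\to V\otimes_{\R}\C$, $\varphi(u,v)=u\otimes 1+v\otimes i$, is clearly $\R$-linear and bijective, so it suffices to check that it commutes with multiplication by $i$:

```latex
% On W, by the low-tech definition with a = 0, b = 1:
i\cdot(u, v) = (0\cdot u - 1\cdot v,\ 0\cdot v + 1\cdot u) = (-v,\ u).

% On V \otimes_{\R} \C, using alpha(v \otimes beta) = v \otimes (alpha beta):
i\cdot(u\otimes 1 + v\otimes i)
  = u\otimes i + v\otimes i^2
  = (-v)\otimes 1 + u\otimes i
  = \varphi(-v,\ u).
```

So $\varphi(i\cdot(u,v))=i\cdot\varphi(u,v)$, making $\varphi$ a $\C$-linear isomorphism $W\cong V\otimes_{\R}\C$; likewise $\varphi(\bar T(u,v))=Tu\otimes 1+Tv\otimes i=(T\otimes 1)\varphi(u,v)$, so the two induced maps agree.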

Admittedly, this transition is not obvious to me.

Can somebody please throw some light on how the two approaches are really the same? (Assuming they actually are the same.)

Thanks.