Confusion with Tensors: Understanding Finite-Dimensional Vector Spaces

  • Context: Graduate
  • Thread starter: JonnyG
  • Tags: Confusion, Tensors

Discussion Overview

The discussion revolves around the understanding of tensors, specifically in the context of finite-dimensional vector spaces and their dimensions. Participants explore the relationships between different sets of linear transformations and bilinear maps, addressing potential misunderstandings in dimensionality and definitions.

Discussion Character

  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant defines a k-tensor as a multilinear map from a k-fold product of a finite-dimensional vector space to the reals, noting that this definition is specific to their textbook.
  • Another participant asserts that the correspondence between the set of linear transformations from \(\mathbb{R}^n\) to \(\mathbb{R}^n\) and the set of \(n \times n\) matrices is bijective, implying equal dimensions.
  • A participant expresses doubt about their proof that the dimension of the set of linear transformations is \(n\), suggesting it may be incorrect.
  • One participant questions the target space of \(L^2(\mathbb{R}^n)\), indicating that if it maps into the base field \(\mathbb{R}\), the dimension would be \(n^2\), but if it maps into another vector space, the dimension could differ.
  • Another participant clarifies that the original question pertains to linear operators and bilinear maps, distinguishing between \(L(\mathbb{R}^n, \mathbb{R}^n)\) and \(L^2(\mathbb{R}^n)\).
  • A later reply confirms that \(L^2(\mathbb{R}^n)\) maps into the reals, leading to a realization of a mistake in the earlier proof regarding dimensionality.
  • A participant shares a corrected proof involving a basis for \(\mathbb{R}^n\) and the spanning of linear transformations, indicating that the previous misunderstanding stemmed from limiting the index in their reasoning.

Areas of Agreement / Disagreement

Participants initially express differing views on the dimensionality of the sets discussed. All agree on the bijective correspondence between linear transformations and matrices, but the implications of this relationship are questioned until the target space of the bilinear maps is pinned down. The disagreement is resolved once the original poster identifies the error in the spanning argument.

Contextual Notes

There are limitations in the discussion regarding the assumptions about the target spaces of the bilinear maps and the definitions of the sets involved, which may affect the conclusions drawn about their dimensions.

JonnyG
First let me give the definition of tensor that my book gives:

If [itex]V[/itex] is a finite-dimensional vector space with [itex]dim(V) = n[/itex], let [itex]V^{k}[/itex] denote the k-fold product. We define a k-tensor as a map [itex]T: V^{k} \longrightarrow \mathbb{R}[/itex] such that [itex]T[/itex] is multilinear, i.e. linear in each variable when all the other variables are held fixed. I know there are more general definitions, but since this is the one my book uses, let's stick with it.

Okay, now here is my problem. First off, assume from this point on that [itex]\mathbb{R}^n[/itex] has the usual basis. If [itex]L^{2}(\mathbb{R}^{n})[/itex] is the set of all 2-tensors on [itex]\mathbb{R}^n[/itex] then it has a dimension of [itex]n^2[/itex]. If [itex]M(n,n)[/itex] is the set of all n by n matrices with real entries then we have [itex]L^2(\mathbb{R}^n) \cong M(n,n)[/itex].
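To make that count concrete, here is a minimal NumPy sketch (with [itex]n = 3[/itex] purely as an illustrative size) of the fact that a 2-tensor on [itex]\mathbb{R}^n[/itex] is determined by the [itex]n^2[/itex] numbers [itex]T(e_i, e_j)[/itex], i.e. by an [itex]n \times n[/itex] matrix:

```python
import numpy as np

n = 3

# Any n x n matrix B defines a 2-tensor via T(u, v) = u^T B v;
# conversely T is recovered from its values on basis pairs,
# B[i, j] = T(e_i, e_j).  This is the isomorphism L^2(R^n) ~ M(n, n).
def tensor_from_matrix(B):
    return lambda u, v: u @ B @ v

B = np.arange(n * n, dtype=float).reshape(n, n)
T = tensor_from_matrix(B)

# Multilinearity check in the first slot (the second slot is analogous).
u1, u2, v = np.random.rand(n), np.random.rand(n), np.random.rand(n)
a, b = 2.0, -1.5
assert np.isclose(T(a * u1 + b * u2, v), a * T(u1, v) + b * T(u2, v))

# Recovering B from T on basis pairs: n^2 independent numbers,
# which is why dim L^2(R^n) = n^2.
e = np.eye(n)
B_rec = np.array([[T(e[i], e[j]) for j in range(n)] for i in range(n)])
assert np.allclose(B_rec, B)
```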

However, let [itex]L(\mathbb{R}^n, \mathbb{R}^n)[/itex] be the set of all linear transformations [itex]f: \mathbb{R}^n \longrightarrow \mathbb{R}^n[/itex]. It seems obvious to me that [itex]L(\mathbb{R}^n, \mathbb{R}^n) \cong M(n,n)[/itex]. But then this would imply that [itex]L^2(\mathbb{R}^n) \cong L(\mathbb{R}^n, \mathbb{R}^n)[/itex], which is impossible since [itex]dim(L^2(\mathbb{R}^n)) = n^2[/itex] and [itex]dim(L(\mathbb{R}^n, \mathbb{R}^n)) = n[/itex] (I proved its dimension is [itex]n[/itex] and I am sure the proof is correct).

Where have I gone wrong?
 
The correspondence between ##L(\mathbb R^n,\mathbb R^n)## and ##M(n,n)## is bijective. So these vector spaces must have the same dimension.
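A minimal sketch of that bijection, using [itex]n = 3[/itex] and a cyclic shift of coordinates as an illustrative linear map: the matrix of [itex]f[/itex] has columns [itex]f(e_i)[/itex], and this assignment identifies ##L(\mathbb R^n,\mathbb R^n)## with ##M(n,n)##.

```python
import numpy as np

n = 3
e = np.eye(n)

# The bijection L(R^n, R^n) <-> M(n, n): a linear map f is determined
# by its values on basis vectors; its matrix A has columns f(e_i).
def matrix_of(f):
    return np.column_stack([f(e[i]) for i in range(n)])

# Example linear map: cyclic shift of coordinates (clearly linear).
f = lambda x: np.roll(x, 1)
A = matrix_of(f)

# The matrix reproduces the map on arbitrary vectors, and the
# correspondence is invertible by construction, so
# dim L(R^n, R^n) = dim M(n, n) = n^2.
x = np.random.rand(n)
assert np.allclose(A @ x, f(x))
```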
 
That's what I thought as well. So this must just mean my "proof" that [itex]dim(L(\mathbb{R}^n, \mathbb{R}^n)) = n[/itex] is incorrect.
 
Yes, I agree.
 
JonnyG said:
Where have I gone wrong?

I think you need to specify the target space of ##L^2(\mathbb R^n, \mathbb R^n)##, i.e., bilinear maps into what space? If it is into the base field ##\mathbb R## (dimension 1 as a vector space over itself), then the dimension is ##n^2## as you said; if the target is another vector space ##V##, then ##\dim L^2(\mathbb R^n, \mathbb R^n; V)## is the product of the individual dimensions, i.e. ##n^2 \dim V##. Sorry if this is obvious and I misread what you meant.
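A quick numeric sketch of that count (the sizes [itex]n = 2[/itex] and [itex]\dim V = 3[/itex] are illustrative): a bilinear map into ##V## is pinned down by the vectors it assigns to basis pairs, which pack into an [itex]n \times n \times \dim V[/itex] array, so the dimension is the product.

```python
import numpy as np

n, m = 2, 3  # bilinear maps R^n x R^n -> V, with dim V = m

# Such a map is determined by the vectors C[i, j, :] = B(e_i, e_j),
# i.e. an n x n x m array; evaluating B is a contraction against u, v.
C = np.random.rand(n, n, m)
B = lambda u, v: np.einsum('ijk,i,j->k', C, u, v)

# Recover C from B on basis pairs: n * n * m free parameters,
# so dim = n^2 * dim V (and m = 1 gives back the n^2 case).
e = np.eye(n)
C_rec = np.array([[B(e[i], e[j]) for j in range(n)] for i in range(n)])
assert np.allclose(C_rec, C)
assert C.size == n * n * m
```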
 
He never mentioned ##L^2(\mathbb R^n,\mathbb R^n)##. The question was about ##L(\mathbb R^n,\mathbb R^n)## (linear operators on ##\mathbb R^n##) and ##L^2(\mathbb R^n)## (bilinear maps from ##\mathbb R^n\times\mathbb R^n## into ##\mathbb R##).
 
OK. Is ##L^2(\mathbb R^n)## supposed to be maps into ##\mathbb R##, or into any vector space ( as in the case of matrix multiplication)?
 
[itex]L^2(\mathbb{R}^n)[/itex] maps into the reals. I see the mistake in my "proof" that led to this confusion. It all makes sense now.
 
Just curious, what was it?
 
  • #10
WWGD said:
Just curious, what was it?

***Sorry about the crappy LaTeX***

Well, the correct proof is to fix the usual basis for [itex]\mathbb{R}^n[/itex]. Let [itex]T \in L(\mathbb{R}^n,\mathbb{R}^n)[/itex]. Define [itex]\phi_{j,i}[/itex] by [itex]\phi_{j,i} (e_i) = e_j[/itex] and [itex]\phi_{j,i} = 0[/itex] on all other basis vectors, where [itex]1 \le j \le n[/itex] and [itex]1 \le i \le n[/itex]. We show that the [itex]\phi_{j,i}[/itex] span [itex]L(\mathbb{R}^n,\mathbb{R}^n)[/itex]. We know that [itex]T(e_i) = \sum_{j = 1}^n d_{j,i} e_j[/itex] for some scalars [itex]d_{j,i} \in \mathbb{R}[/itex].

So [itex]T(e_1) = d_{1,1} \phi_{1,1} (e_1) + d_{2,1} \phi_{2,1} (e_1) + d_{3,1} \phi_{3,1} (e_1) + \cdots + d_{n,1} \phi_{n,1} (e_1)[/itex].

We do this for each [itex]e_i[/itex]. Clearly then, [itex]T[/itex] is a linear combination of the [itex]\phi_{j,i}[/itex]. It is also clear that there are [itex]n^2[/itex] of the [itex]\phi_{j,i}[/itex], and linear independence is obvious.

The mistake I was making before was only allowing one index, so I ended up with only [itex]n[/itex] of the [itex]\phi_{j,i}[/itex].
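The corrected argument can also be checked numerically. A small NumPy sketch (with [itex]n = 3[/itex] as an illustrative size), identifying [itex]\phi_{j,i}[/itex] with the elementary matrix [itex]e_j e_i^T[/itex]:

```python
import numpy as np

n = 3
e = np.eye(n)

# phi_{j,i} sends e_i to e_j and kills the other basis vectors;
# as a matrix it is the elementary matrix e_j e_i^T.
phis = [np.outer(e[j], e[i]) for j in range(n) for i in range(n)]
assert len(phis) == n * n  # n^2 of them, not n

# Spanning: any T (as a matrix A) is sum_{j,i} d_{j,i} phi_{j,i}
# with d_{j,i} = A[j, i].
A = np.random.rand(n, n)
recon = sum(A[j, i] * np.outer(e[j], e[i])
            for j in range(n) for i in range(n))
assert np.allclose(recon, A)

# Linear independence: flattened, the phi_{j,i} have full rank n^2,
# so they form a basis and dim L(R^n, R^n) = n^2.
M = np.stack([p.ravel() for p in phis])
assert np.linalg.matrix_rank(M) == n * n
```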
 
  • #11
Thanks; your LaTeX is fine.
 
