Uniquely Determined Linear Operator

In summary, the problem is that the author is trying to prove that a linear operator is bounded by first showing that it is uniquely determined by its values on a basis. However, it is not clear how these two claims are related. There is also a flaw in the proof: one cannot, in general, deduce that a vector is zero from the fact that its image under a linear transformation is zero.
  • #1
Sudharaka
Hi everyone, :)

Here's a problem for which I want to confirm my answer. Note that for the second part of the question it states, "prove that \(T\) is bounded by the above claim". I used a different method and couldn't find a way to use the first part to prove the second.

Problem:

Suppose \(X\) is an n-dimensional linear vector space. Prove that any linear operator \(T\) on \(X\) is uniquely determined by \(\{Tx_i\}_{i=1}^{n}\) with \(\{x_i\}_{i=1}^{n}\) a basis for \(X\). Moreover, prove that \(T\) is bounded by the above claim.

My Ideas:

Suppose there are two representations of \(Tv\) where \(v\in X\). That is,

\[Tv=a_1 Tx_1+\cdots+a_n Tx_n=b_1 Tx_1+\cdots+b_n Tx_n\]

\[(a_1 Tx_1+\cdots+a_n Tx_n)-(b_1 Tx_1+\cdots+b_n Tx_n)=0\]

Since \(T\) is linear,

\[T((a_1-b_1) x_1+(a_2-b_2) x_2+\cdots+(a_n-b_n) x_n)=0\]

\[(a_1-b_1) x_1+(a_2-b_2) x_2+\cdots+(a_n-b_n) x_n=0\]

Since \(\{x_i\}_{i=1}^{n}\) is linearly independent,

\[a_i=b_i\mbox{ for all }i=1,\,2,\,\cdots,\,n\]

That is, \(T\) is uniquely determined by \(\{Tx_i\}_{i=1}^{n}\).

Now we shall show that \(T\) is bounded. Since \(X\) is \(n\)-dimensional, \(X\) is topologically isomorphic. That is, there exist two positive constants \(c_1\) and \(c_2\) such that,

\[c_1\|x\|\leq \|Tx\|\leq c_2 \|x\|\]

for all \(x\in X\). Hence it's obvious that \(T\) is bounded.
 
  • #2
There is a problem with your proof:

One cannot, in general, deduce from $Tu = 0$ that $u = 0$.

I will give an example in $\Bbb R^3$, to illustrate:

Suppose $T(a,b,c) = (a,b,0)$. We have (for example), $T(1,1,1) = T(1,1,0)$, but clearly it is not the case that:

$(1,1,1) = (1,1,0)$.

In other words, $T(0,0,1) = (0,0,0)$, but $(0,0,1)$ is not the 0-vector.

What you need to show is not that $\{Tx_i\}$ is linearly independent (it may NOT be), but rather that it spans the image space. Thus we can choose some SUBSET of $\{Tx_i\}$ to be a basis for $T(X)$. We are unconcerned with elements of $X - T(X)$ since those HAVE no representation as linear combinations of the $Tx_i$.
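A minimal NumPy sketch of this point, using the projection above and the standard basis of $\Bbb R^3$ (the basis choice is just for illustration, not part of the argument):

```python
import numpy as np

# The projection T(a, b, c) = (a, b, 0), written as a matrix.
T = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])

u = np.array([0.0, 0.0, 1.0])
print(T @ u)                           # [0. 0. 0.] -- Tu = 0, yet u is not the 0-vector

# Images of the standard basis vectors: they span T(X), but they are
# NOT linearly independent (one of them is the zero vector).
images = T @ np.eye(3)                 # columns are T(e_1), T(e_2), T(e_3)
print(np.linalg.matrix_rank(images))   # 2, so only a SUBSET of them is a basis of T(X)
```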

The second half of your proof is also a bit confusing: what is it you are claiming $X$ is topologically isomorphic TO? It certainly is NOT $T(X)$, unless $T$ is bijective.
 
  • #3
Deveno said:
There is a problem with your proof:

One cannot, in general, deduce from $Tu = 0$ that $u = 0$.

I will give an example in $\Bbb R^3$, to illustrate:

Suppose $T(a,b,c) = (a,b,0)$. We have (for example), $T(1,1,1) = T(1,1,0)$, but clearly it is not the case that:

$(1,1,1) = (1,1,0)$.

In other words, $T(0,0,1) = (0,0,0)$, but $(0,0,1)$ is not the 0-vector.

What you need to show is not that $\{Tx_i\}$ is linearly independent (it may NOT be), but rather that it spans the image space. Thus we can choose some SUBSET of $\{Tx_i\}$ to be a basis for $T(X)$. We are unconcerned with elements of $X - T(X)$ since those HAVE no representation as linear combinations of the $Tx_i$.

Thanks so much for the reply. I now understand the fault in my proof perfectly. I was rather confused by the term "uniquely determined" and thought that I should show that \(\{Tx_i\}\) is linearly independent. But then why don't they just use the word "span" instead of "uniquely determined"? Is there a difference between the two? Hope you can explain this to me. :)

Then the first part of the problem becomes quite easy, I guess. Take any vector \(v\in X\). Since \(\{x_i\}_{i=1}^{n}\) is a basis of \(X\) we have,

\[T(v)=T(a_1 x_1+\cdots+a_n x_n)=a_1 T(x_1)+\cdots+a_n T(x_n)\]

Therefore \(\{Tx_i\}_{i=1}^{n}\) spans \(\mbox{Img }T\). Am I correct?

Deveno said:
The second half of your proof is also a bit confusing: what is it you are claiming $X$ is topologically isomorphic TO? It certainly is NOT $T(X)$, unless $T$ is bijective.

When writing the answer I took \(T\) to map \((X,\,\|\cdot\|_X)\) into \((Y,\,\|\cdot\|_Y)\). And since both are \(n\)-dimensional, \(X\) and \(Y\) are topologically isomorphic. But there is a mistake in my original post. The last inequality should be,

\[c_1\|x\|_X\leq \|Tx\|_Y\leq c_2 \|x\|_X\]

I also realize now that there's an obvious mistake here. We don't know whether \(Y\) is \(n\)-dimensional or not. I have to start from the beginning. :)

Edit: I think I got it. There's another theorem;

Let \((X,\,\|\cdot\|_X)\) and \((Y,\,\|\cdot\|_Y)\) be two normed linear spaces over \(F\). If \(X\) is finite-dimensional, then any linear transformation \(T:\, X\rightarrow Y\) is bounded.

We can use this theorem to show that \(T\) is bounded. Am I correct? :)
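For intuition, here is a minimal NumPy sketch of why finite-dimensionality gives a bound, assuming the standard basis of \(\Bbb R^3\), the coordinate 1-norm on the domain, and the Euclidean norm on the image (all of these choices are just for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.normal(size=(3, 3))       # an arbitrary linear map on R^3

# With ||v||_1 = sum |a_i|, linearity gives
# ||T v|| <= sum |a_i| ||T e_i|| <= (max_i ||T e_i||) * ||v||_1,
# so M = max_i ||T e_i|| is an explicit bound.
M = max(np.linalg.norm(T[:, i]) for i in range(3))

for _ in range(1000):
    v = rng.normal(size=3)
    assert np.linalg.norm(T @ v) <= M * np.linalg.norm(v, ord=1) + 1e-12
print("bound M =", M, "holds on all samples")
```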
 
  • #4
I'd like to address the first part of the problem, and then we can move on to the second part.

It's not generally true for a function $f:X \to Y$ that the value of $f$ on a finite subset of $X$ completely determines $f$. For example, when $X = Y = \Bbb R$ (and these are both vector spaces), there are an infinite number of functions with:

$f(0) = a$
$f(1) = b$

no matter how we choose $a$ and $b$.

So linear functions are very special in this regard: for a vector space of dimension 2, for example, we only need the value of a linear function at 2 (linearly independent) points to know which one we have. This is essentially saying that a linear function $T: X \to X$ is completely determined by its matrix relative to any basis for $X$.
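As a concrete sketch of that last remark (with an arbitrarily chosen basis of $\Bbb R^2$, purely for illustration): the matrix of $T$ relative to a basis is built entirely from the values $Tx_i$, so prescribing those values pins $T$ down.

```python
import numpy as np

# A basis {x_1, x_2} of R^2 (the columns of B), chosen arbitrarily.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Some linear map T on R^2 (in standard coordinates).
T = np.array([[2.0, -1.0],
              [1.0,  3.0]])

# The prescribed values T(x_1), T(x_2):
Tx = T @ B

# The matrix of T relative to the basis B: column i holds the
# B-coordinates of T(x_i).  It is computed purely from B and the Tx_i.
T_rel_B = np.linalg.solve(B, Tx)

# Reconstruct T in standard coordinates from the relative matrix:
T_reconstructed = B @ T_rel_B @ np.linalg.inv(B)
print(np.allclose(T_reconstructed, T))   # True
```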

You seem to be focusing on the values of $T$ and arguing these have only "one representation". This isn't what you want to do (and it's not TRUE). What you want to do is THIS:

Suppose that $S,T$ are two linear functions with:

$S(x_i) = T(x_i),\ i = 1,2,\dots,n$.

Then, for any $v \in X$:

$S(v) = S(c_1x_1 + \cdots + c_nx_n) = c_1S(x_1) + \cdots + c_nS(x_n)$

$= c_1T(x_1) + \cdots + c_nT(x_n) = T(c_1x_1 + \cdots + c_nx_n) = T(v)$

so we conclude $S = T$.
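Numerically, the same argument looks like this (a sketch with a randomly generated basis and map, both hypothetical): define $S$ only by its values on the basis, and it necessarily coincides with $T$ everywhere.

```python
import numpy as np

rng = np.random.default_rng(1)

B = rng.normal(size=(3, 3))          # columns are a (random) basis x_1..x_3 of R^3
T = rng.normal(size=(3, 3))          # some linear map T on R^3

Y = T @ B                            # the prescribed values T(x_i)

# S is *defined* only by S(x_i) = y_i and linearity; in matrix form S B = Y,
# so S = Y B^{-1}.
S = Y @ np.linalg.inv(B)

# S and T agree on the basis by construction, and therefore everywhere:
v = rng.normal(size=3)
print(np.allclose(S @ v, T @ v), np.allclose(S, T))   # True True
```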

Before we move on to part 2, make sure you understand this. Also, the notion of boundedness generally requires a norm...what norm are we using for $X$ (as it is not given by the problem)?
 
  • #5
Deveno said:
I'd like to address the first part of the problem, and then we can move on to the second part.

It's not generally true for a function $f:X \to Y$ that the value of $f$ on a finite subset of $X$ completely determines $f$. For example, when $X = Y = \Bbb R$ (and these are both vector spaces), there are an infinite number of functions with:

$f(0) = a$
$f(1) = b$

no matter how we choose $a$ and $b$.

So linear functions are very special in this regard: for a vector space of dimension 2, for example, we only need the value of a linear function at 2 (linearly independent) points to know which one we have. This is essentially saying that a linear function $T: X \to X$ is completely determined by its matrix relative to any basis for $X$.

You seem to be focusing on the values of $T$ and arguing these have only "one representation". This isn't what you want to do (and it's not TRUE). What you want to do is THIS:

Suppose that $S,T$ are two linear functions with:

$S(x_i) = T(x_i),\ i = 1,2,\dots,n$.

Then, for any $v \in X$:

$S(v) = S(c_1x_1 + \cdots + c_nx_n) = c_1S(x_1) + \cdots + c_nS(x_n)$

$= c_1T(x_1) + \cdots + c_nT(x_n) = T(c_1x_1 + \cdots + c_nx_n) = T(v)$

so we conclude $S = T$.

Thanks for the detailed explanation. I think I am getting the intuition behind the phrase "uniquely determined" after reading your post. However there's one little point I want to clarify.

So given the set of values \(\{Tx_i\}_{i=1}^{n}\), only one linear transformation \(T\) exists with these values as its images of the basis \(\{x_i\}_{i=1}^{n}\). Is this statement correct? :)

In your proof you have taken two linear transformations \(S\) and \(T\) and shown that they are the same. However, isn't it more appropriate to let \(\{T(x_i):\,i=1,\,\cdots,\,n\}=\{S(x_j):\, j=1,\,\cdots,\,n\}\)? I mean, for example, it could be that \(Tx_1=Sx_2,\,Tx_2=Sx_1\) and so on. Or is it necessary that \(Tx_i=Sx_i\) for all \(i\)? Did you get what I meant? :)

Deveno said:
Before we move on to part 2, make sure you understand this. Also, the notion of boundedness generally requires a norm...what norm are we using for $X$ (as it is not given by the problem)?

Yes, I thought about this too. That's why I took a general norm without specifying anything in particular. But after your question I tried to find a norm for this vector space and came up with the following.

\[\|x\|=\sum_{i=1}^{n}|a_i|\]

where \(x=\sum_{i=1}^{n}a_{i}x_i\). How about that, is it correct? :)
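A tiny sketch of how that candidate norm would be computed (the basis here is chosen arbitrarily, just for illustration): expand \(x\) in the basis and take the sum of the absolute coordinates.

```python
import numpy as np

def basis_norm(x, B):
    """||x|| = sum_i |a_i| where x = sum_i a_i x_i and the x_i are the columns of B."""
    a = np.linalg.solve(B, x)        # coordinates of x in the basis
    return np.abs(a).sum()

B = np.array([[1.0, 1.0],
              [0.0, 1.0]])           # an arbitrary basis of R^2
x = np.array([3.0, 2.0])             # = 1*x_1 + 2*x_2, so the norm should be 3
print(basis_norm(x, B))              # 3.0
```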
 
  • #6
Um, no...the indices have to match. Take a simple example:

Let $X = \Bbb R^2$ with:

$T = I = \text{id}_X$ (that is, $Tv = v$ for all $v \in \Bbb R^2$), and let:

$S(x,y) = (y,x)$. For the basis $B = \{(1,0),(0,1)\}$ we certainly have:

$T(1,0) = S(0,1)$
$T(0,1) = S(1,0)$, but these are clearly not the same linear transformation.
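The same check in NumPy (a throwaway sketch of exactly this example):

```python
import numpy as np

T = np.eye(2)                         # T = identity
S = np.array([[0.0, 1.0],
              [1.0, 0.0]])            # S(x, y) = (y, x)

e1, e2 = np.eye(2)

# The *sets* of values on the basis coincide...
print(np.allclose(T @ e1, S @ e2), np.allclose(T @ e2, S @ e1))   # True True
# ...but the maps differ as soon as the indices are matched up:
print(np.allclose(T, S))                                          # False
```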

As to the norm question, there are MANY possible norms on a given vector space. It's very likely that the "usual" or "$p = 2$" norm (aka the Euclidean norm) is what is intended:

For $x = (x_1,\dots,x_n)$, we have:

$\displaystyle \|x\| = \left(\sum_{i = 1}^n x_i^2 \right)^{\frac{1}{2}}$

The norm you have given is also known as the "$p = 1$" norm or "taxi-cab norm" (or sometimes the "Manhattan norm"). We also get a norm for any real number $p \geq 1$ by taking:

$\displaystyle \|x\| = \left(\sum_{i = 1}^n |x_i|^p \right)^{\frac{1}{p}}$

and if we take the limit as $p \to \infty$, we get the "maximum norm":

$\|x\| = \text{max}(|x_1|,\dots,|x_n|)$.
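All of these are easy to compare side by side; in NumPy, for instance, `numpy.linalg.norm` exposes them through its `ord` parameter (a quick sketch):

```python
import numpy as np

x = np.array([3.0, -4.0, 12.0])

print(np.linalg.norm(x, ord=1))       # taxi-cab norm: |3| + |-4| + |12| = 19
print(np.linalg.norm(x, ord=2))       # Euclidean norm: sqrt(9 + 16 + 144) = 13
print(np.linalg.norm(x, ord=np.inf))  # maximum norm: 12

p = 5
print((np.abs(x) ** p).sum() ** (1 / p))   # general p-norm, approaches 12 as p grows
```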

It is not clear which of these norms is intended, or if the problem is asking you to show that $T$ is (locally) bounded for ANY norm, which is a slightly different question.
 
  • #7
Deveno said:
Um, no...the indices have to match. Take a simple example:

Let $X = \Bbb R^2$ with:

$T = I = \text{id}_X$ (that is, $Tv = v$ for all $v \in \Bbb R^2$), and let:

$S(x,y) = (y,x)$. For the basis $B = \{(1,0),(0,1)\}$ we certainly have:

$T(1,0) = S(0,1)$
$T(0,1) = S(1,0)$, but these are clearly not the same linear transformation.

As to the norm question, there are MANY possible norms on a given vector space. It's very likely that the "usual" or "$p = 2$" norm (aka the Euclidean norm) is what is intended:

For $x = (x_1,\dots,x_n)$, we have:

$\displaystyle \|x\| = \left(\sum_{i = 1}^n x_i^2 \right)^{\frac{1}{2}}$

The norm you have given is also known as the "$p = 1$" norm or "taxi-cab norm" (or sometimes the "Manhattan norm"). We also get a norm for any real number $p \geq 1$ by taking:

$\displaystyle \|x\| = \left(\sum_{i = 1}^n |x_i|^p \right)^{\frac{1}{p}}$

and if we take the limit as $p \to \infty$, we get the "maximum norm":

$\|x\| = \text{max}(|x_1|,\dots,|x_n|)$.

It is not clear which of these norms is intended, or if the problem is asking you to show that $T$ is (locally) bounded for ANY norm, which is a slightly different question.

Thanks so much for all your help. :) I think I understand this completely now. I think the question expects us to show that the linear map is bounded for any norm which is in fact a theorem given in our recommended textbook. :)
 
  • #8
For FINITE-DIMENSIONAL vector spaces, (local) boundedness is a consequence of continuity (at 0, and thus everywhere). I remark in passing that the definition of continuity is also dependent upon the norm (one uses the norm to establish a metric, and then uses the definition of continuity in a metric space...otherwise you lack quantities to be less than your epsilons and deltas).

In particular, by choosing a delta-ball centered at the origin of small enough radius, we can ensure that $\|T(x)\| \leq 1$ for all $x$ with $\|x\| \leq \delta$ (in other words, we pick an epsilon equal to 1).

Then for any nonzero $v \in X$ (the case $v = 0$ is trivial), we have:

$\displaystyle \|T(v)\| = \left\|\frac{\|v\|}{\delta}T\left(\delta \frac{v}{\|v\|}\right)\right\|$

$\displaystyle = \frac{\|v\|}{\delta}\left\|T\left(\delta \frac{v}{\|v\|}\right)\right\|$

$\displaystyle \leq \frac{1}{\delta}\|v\|$

so we may take $M = \dfrac{1}{\delta}$, which establishes (local) boundedness, and depends only on the generic property every norm has, namely:

$\|\alpha v\| = |\alpha|\cdot \|v\|$
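A numerical illustration of that rescaling step (my own sketch; here $\delta$ is simply taken as the reciprocal of the spectral norm of a random matrix, which is an assumption of the example rather than part of the argument above):

```python
import numpy as np

rng = np.random.default_rng(2)
T = rng.normal(size=(3, 3))

# Choose delta so that ||T x|| <= 1 whenever ||x|| <= delta.
# For this sketch we simply take delta = 1 / ||T||_op (spectral norm).
delta = 1.0 / np.linalg.norm(T, ord=2)

for _ in range(1000):
    v = rng.normal(size=3)
    # The rescaling trick: v = (||v||/delta) * (delta * v/||v||), and the
    # vector in parentheses has norm delta, so its image has norm <= 1.
    assert np.linalg.norm(T @ v) <= (1.0 / delta) * np.linalg.norm(v) + 1e-9

print("||T v|| <= (1/delta) ||v|| with 1/delta =", 1.0 / delta)
```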
 

1. What is a Uniquely Determined Linear Operator?

In this context, a "uniquely determined" linear operator is simply a linear operator whose values on a basis pin it down completely. A linear operator on a vector space is a function that respects addition and scalar multiplication, and the claim discussed in the thread is that once \(Tx_i\) is prescribed for each vector \(x_i\) of a basis, there is exactly one linear operator taking those values.

2. How is a Uniquely Determined Linear Operator different from a general linear operator?

It is not really a separate class of operators. Every function, linear or not, assigns exactly one output to each input. The point of the phrase is that a linear operator, unlike an arbitrary function, is already fixed once its values on a basis are known, because linearity forces its value on every other vector.

3. What are some examples of Uniquely Determined Linear Operators?

Some examples of uniquely determined linear operators include the identity operator, which maps every vector to itself, and the zero operator, which maps every vector to the zero vector. Other examples include rotations and reflections in a vector space.

4. How is the invertibility of a Uniquely Determined Linear Operator determined?

A linear operator is invertible if and only if there exists a linear operator that "undoes" its action, mapping each output vector back to the original input vector. In finite dimensions this happens exactly when the operator maps a basis to a basis, or equivalently when its matrix with respect to any basis has nonzero determinant.

5. What are some applications of Uniquely Determined Linear Operators?

Uniquely determined linear operators have many applications in mathematics, physics, and engineering. They are used to model and solve linear systems, analyze geometric transformations, and study linear differential equations. Additionally, they are essential in quantum mechanics, where they represent physical observables and their corresponding operators.
