MHB Find Linear Mapping: Help Solving $V \setminus (W_1 \cup \cdots \cup W_m)$

evinda
Hello! (Wave)

Let $F$ be an infinite field, $V$ an $F$-linear space of dimension $n$, and $W_1, \dots, W_m$ subspaces of $V$ with $\dim W_i = n_i < n$ for $i=1, \dots, m$. We want to show that $V \setminus (W_1 \cup \cdots \cup W_m) \neq \varnothing$.

  1. Fix a basis $\{ v_1, \dots, v_n\}$ of $V$. Show that there is a non-zero linear mapping $\ell_k: V \to F$ such that $W_k \subset \ker(\ell_k)$ (i.e. $w \in W_k \Rightarrow \ell_k(w)=0$).
  2. Construct a non-zero polynomial $f_k(X_1, \dots, X_n) \in F[X_1, \dots, X_n]$ such that $x_1 v_1+\dots+ x_n v_n \in W_k \Rightarrow f_k(x_1, \dots, x_n)=0$.
  3. Consider as given that if $f \in F[X_1, \dots, X_n]$ is a non-zero polynomial, then there is a point $(a_1, \dots, a_n) \in F^n$ such that $f(a_1, \dots, a_n) \neq 0$. Show that there is a vector $v \in V \setminus (W_1 \cup \dots \cup W_m)$.

Could you help me to find the desired linear mapping $\ell_k$ ? (Thinking)
 
evinda said:
Hello! (Wave)

Let $F$ be an infinite field, $V$ an $F$-linear space of dimension $n$, and $W_1, \dots, W_m$ subspaces of $V$ with $\dim W_i = n_i < n$ for $i=1, \dots, m$. We want to show that $V \setminus (W_1 \cup \cdots \cup W_m) \neq \varnothing$.

  1. Fix a basis $\{ v_1, \dots, v_n\}$ of $V$. Show that there is a non-zero linear mapping $\ell_k: V \to F$ such that $W_k \subset \ker(\ell_k)$ (i.e. $w \in W_k \Rightarrow \ell_k(w)=0$).
  2. Construct a non-zero polynomial $f_k(X_1, \dots, X_n) \in F[X_1, \dots, X_n]$ such that $x_1 v_1+\dots+ x_n v_n \in W_k \Rightarrow f_k(x_1, \dots, x_n)=0$.
  3. Consider as given that if $f \in F[X_1, \dots, X_n]$ is a non-zero polynomial, then there is a point $(a_1, \dots, a_n) \in F^n$ such that $f(a_1, \dots, a_n) \neq 0$. Show that there is a vector $v \in V \setminus (W_1 \cup \dots \cup W_m)$.

Could you help me to find the desired linear mapping $\ell_k$ ? (Thinking)

Hey evinda!

How about we start with an example and see if we can generalize from there?
For instance, what happens if we pick $n=2,\ v_1=(1,0),\ v_2=(0,1)$ and $m=1,\ W_1=\langle (1,1)\rangle$?
Can we find $\ell_k$? (Wondering)
 
Klaas van Aarsen said:
Hey evinda!

How about we start with an example and see if we can generalize from there?
For instance, what happens if we pick $n=2,\ v_1=(1,0),\ v_2=(0,1)$ and $m=1,\ W_1=\langle (1,1)\rangle$?
Can we find $\ell_k$? (Wondering)

We have that every $w \in W_1$ is of the form $w=a(1,1)$, or not? We want $\ell_k(w)=0$ for every such $w$, and by linearity $\ell_k(a\cdot (1,1))=a\cdot \ell_k (1,1)$, so it suffices to have $\ell_k(1,1)=0$.

But how can we define $\ell_k$ ? (Thinking)
 
evinda said:
We have that every $w \in W_1$ is of the form $w=a(1,1)$, or not? We want $\ell_k(w)=0$ for every such $w$, and by linearity $\ell_k(a\cdot (1,1))=a\cdot \ell_k (1,1)$, so it suffices to have $\ell_k(1,1)=0$.

But how can we define $\ell_k$ ? (Thinking)

We can define a linear map between vector spaces by the images of a basis.
Or equivalently by defining a matrix.
Can we do that for the example? (Wondering)
 
Klaas van Aarsen said:
We can define a linear map between vector spaces by the images of a basis.
Or equivalently by defining a matrix.
Can we do that for the example? (Wondering)

Do you mean that $\ell_k$ is defined for example as $\ell_k(v)=v\cdot \begin{pmatrix}-1 \\ 1\end{pmatrix}$ ? (Thinking)
 
evinda said:
Do you mean that $\ell_k$ is defined for example as $\ell_k(v)=v\cdot \begin{pmatrix}-1 \\ 1\end{pmatrix}$ ? (Thinking)

Yep. (Nod)
 
Klaas van Aarsen said:
Yep. (Nod)

Ok... But how can we show that such a mapping exists in the general case? :confused:
 
evinda said:
Ok... But how can we show that such a mapping exists in the general case? :confused:

How did you find $\ell_k$ exactly just now? (Wondering)

Let's ramp it up.
Suppose we have $n=3,\ v_1=(1,0,0),\ v_2=(0,1,0),\ v_3=(0,0,1)$, and $m=2,\ W_1=\langle(1,1,0)\rangle,\ W_2=\langle(1,2,3),(0,0,1)\rangle$.
What will $\ell_1$ and $\ell_2$ be? (Wondering)
 
Klaas van Aarsen said:
How did you find $\ell_k$ exactly just now? (Wondering)

Let's ramp it up.
Suppose we have $n=3,\ v_1=(1,0,0),\ v_2=(0,1,0),\ v_3=(0,0,1)$, and $m=2,\ W_1=\langle(1,1,0)\rangle,\ W_2=\langle(1,2,3),(0,0,1)\rangle$.
What will $\ell_1$ and $\ell_2$ be? (Wondering)

In each case it is a matrix such that the result of the multiplication is equal to $0$, or not? :confused:
 
evinda said:
In each case it is a matrix such that the result of the multiplication is equal to $0$, or not? :confused:

The result of the multiplication with a vector from $W_k$ must be $0$, and there must be at least one vector that is not in $W_k$ with a non-zero result. (Thinking)
 
Klaas van Aarsen said:
The result of the multiplication with a vector from $W_k$ must be $0$, and there must be at least one vector that is not in $W_k$ with a non-zero result. (Thinking)

Ok... But how are we sure that such a matrix exists? :confused:
 
evinda said:
Ok... But how are we sure that such a matrix exists? :confused:

In the example $\mathbf w_1=\mathbf v_1 + \mathbf v_2=(1,1)$ is a basis for $W_1$.
We can extend it to a basis for $V$ by adding for instance $\mathbf w_2=\mathbf v_2=(0,1)$, which is not in $W_1$.
We can now define $\ell_1$ such that $\ell_1(\mathbf w_1)=0$ and $\ell_1(\mathbf w_2)=1$, can't we?
Then we have a non-zero linear map that maps $W_1$ to $0$.
This is already sufficient to show that a non-zero linear map with the required property exists, isn't it? (Wondering)

If we want, we can define a matrix as well.
Let $L$ be the matrix that represents $\ell_1$.
Then $L\mathbf w_1 = 0$ and $L\mathbf w_2=1$.
In matrix notation:
$$\begin{align*}L\begin{pmatrix}\mathbf w_1 & \mathbf w_2\end{pmatrix} = \begin{pmatrix}0 & 1\end{pmatrix}&\Rightarrow\quad
L = \begin{pmatrix}0 & 1\end{pmatrix}\begin{pmatrix}\mathbf w_1 & \mathbf w_2\end{pmatrix}^{-1}\quad\Rightarrow\quad
L = \begin{pmatrix}0 & 1\end{pmatrix}\begin{pmatrix}1 & 0\\ 1& 1\end{pmatrix}^{-1}\\ &\Rightarrow\quad
L = \begin{pmatrix}0 & 1\end{pmatrix}\begin{pmatrix}1 & 0\\ -1& 1\end{pmatrix}\quad\Rightarrow\quad
L = \begin{pmatrix}-1 & 1\end{pmatrix}
\end{align*}$$
We can always construct the matrix like this, so we can be sure it exists, can't we? (Wondering)
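Not part of the thread itself, but this construction is easy to check numerically. A minimal sketch with numpy, using exactly the example's vectors:

```python
import numpy as np

# Columns are the basis w1 = (1,1) of W_1 and the extension vector w2 = (0,1)
W = np.array([[1, 0],
              [1, 1]])

# We want L w1 = 0 and L w2 = 1, i.e. L (w1 w2) = (0 1)
L = np.array([[0, 1]]) @ np.linalg.inv(W)

print(L)                      # [[-1.  1.]]
print(L @ np.array([1, 1]))   # [0.]  -> w1 (and hence all of W_1) maps to 0
print(L @ np.array([0, 1]))   # [1.]  -> L is non-zero
```

This reproduces the matrix $L = \begin{pmatrix}-1 & 1\end{pmatrix}$ derived above.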
 
So for the general case do we take as $L$ the product of the $1\times n$ matrix that has $1$ in one position and $0$ everywhere else, and the inverse of the matrix whose columns are the vectors of $W_k$? (Thinking)
 
evinda said:
So for the general case do we take as $L$ the product of the $1\times n$ matrix that has $1$ in one position and $0$ everywhere else,

Yep. (Nod)

evinda said:
and the inverse of the matrix whose columns are the vectors of $W_k$?

We need to extend them to $n$ vectors.
Since $W_k$ has dimension $n_k < n$, there must be at least $n-n_k$ vectors in the basis of $V$ that are not in $W_k$, so we can use those.
Furthermore, the positions of the vectors of $W_k$ must correspond to zeros in the $1\times n$ matrix. (Thinking)
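As a sketch of this recipe on the earlier $n=3$ example: for $W_2=\langle(1,2,3),(0,0,1)\rangle$ we can extend the basis with the standard vector $(0,1,0)$ (just one possible choice) and read off $\ell_2$ from the inverse:

```python
import numpy as np

# Basis of W_2, extended to a basis of V with v2 = (0,1,0), which is not in W_2
w1, w2 = np.array([1, 2, 3]), np.array([0, 0, 1])
v2 = np.array([0, 1, 0])
B = np.column_stack([w1, w2, v2])   # invertible 3x3 matrix

# Zeros at the positions of the W_2 vectors, a 1 at the extension vector
L2 = np.array([[0, 0, 1]]) @ np.linalg.inv(B)

assert np.allclose(L2 @ w1, 0) and np.allclose(L2 @ w2, 0)  # W_2 maps to 0
assert not np.allclose(L2 @ v2, 0)                          # L2 is non-zero
print(L2)                           # approximately [[-2. 1. 0.]]
```

The resulting row vector is (up to rounding) $\begin{pmatrix}-2 & 1 & 0\end{pmatrix}$, i.e. $\ell_2(x,y,z) = -2x + y$.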
 
I think I understood. Could you maybe also give me a hint about the desired $f$ at (2)? (Thinking)
 
evinda said:
I think I understood. Could you maybe also give me a hint about the desired $f$ at (2)? (Thinking)

How far do we get with the $\ell_k$ that we have just found? (Wondering)
 
Klaas van Aarsen said:
How far do we get with the $\ell_k$ that we have just found? (Wondering)

Do we take $w = x_1 v_1+\dots+ x_n v_n$ as in (1), and then define $f_k(x_1, \dots, x_n)$ to be $\ell_k(w)$? :confused:
 
evinda said:
Do we take $w = x_1 v_1+\dots+ x_n v_n$ as in (1), and then define $f_k(x_1, \dots, x_n)$ to be $\ell_k(w)$?

Yep. (Nod)
 
At (3): if there is no $v \in V \setminus (W_1 \cup \dots \cup W_m)$, then every element is in some $W_i$, and then from (2) the non-zero function $f$ must be $0$ everywhere, so we get a contradiction with the assumption of (3).

Is the idea correct and complete? (Thinking)
 
evinda said:
At (3): if there is no $v \in V \setminus (W_1 \cup \dots \cup W_m)$, then every element is in some $W_i$, and then from (2) the non-zero function $f$ must be $0$ everywhere, so we get a contradiction with the assumption of (3).

Is the idea correct and complete?

In (2) we only had a non-zero function $f_k$ for $W_k$.
Using that function $f_k$ and the given statement from (3), we can only conclude that there is a $v \in V \setminus W_k$ for a specific $k$, can't we? (Thinking)

What we need is that we can find a $v$ such that none of the $\ell_k(v)$ is zero.
Or equivalently that we can find a point $(a_1,...,a_n)$, such that $f_k(a_1,...,a_n) \ne 0$ for each $k$.

Can we combine all the polynomials $f_k(x_1,...,x_n)$ into one polynomial such that, if even one of them is zero, the combined polynomial is also zero? (Wondering)
 
Klaas van Aarsen said:
In (2) we only had a non-zero function $f_k$ for $W_k$.
Using that function $f_k$ and the given statement from (3), we can only conclude that there is a $v \in V \setminus W_k$ for a specific $k$, can't we? (Thinking)

What we need is that we can find a $v$ such that none of the $\ell_k(v)$ is zero.
Or equivalently that we can find a point $(a_1,...,a_n)$, such that $f_k(a_1,...,a_n) \ne 0$ for each $k$.

Can we combine all the polynomials $f_k(x_1,...,x_n)$ into one polynomial such that, if even one of them is zero, the combined polynomial is also zero? (Wondering)

So we take the product of all the $f_k$, right? (Thinking)
 
evinda said:
So we take the product of all the $f_k$, right?

Indeed. (Cool)
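Not from the thread itself, but here is a small sketch of how steps (1)–(3) fit together in the $n=3$ example. The row vectors below are explicit (hypothetical) choices of $\ell_1$, vanishing on $W_1=\langle(1,1,0)\rangle$, and $\ell_2$, vanishing on $W_2=\langle(1,2,3),(0,0,1)\rangle$; we then search a small grid for a point where the product $f = f_1 f_2$ does not vanish:

```python
import numpy as np
from itertools import product

# One possible choice of the linear forms from step (1):
ell1 = np.array([-1, 1, 0])   # vanishes on W_1 = <(1,1,0)>
ell2 = np.array([-2, 1, 0])   # vanishes on W_2 = <(1,2,3),(0,0,1)>

def f(v):
    # The product polynomial f = f_1 * f_2 from step (3):
    # it is zero whenever v lies in W_1 or in W_2
    return (ell1 @ v) * (ell2 @ v)

# Search a small grid in F^3 for a point where f does not vanish
v = next(np.array(p) for p in product(range(3), repeat=3)
         if f(np.array(p)) != 0)
print(v)   # [0 1 0] -- a vector outside W_1 ∪ W_2
```

Any point the search returns corresponds to a vector $v = a_1 v_1 + a_2 v_2 + a_3 v_3$ in $V \setminus (W_1 \cup W_2)$, which is exactly what the exercise asks for.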
 
Klaas van Aarsen said:
Indeed. (Cool)

Nice, thank you... (Happy)
 