MHB Basis, dimension and vector spaces

Yankel
Hello all,

I have these two sets (I couldn't use the notation {} in LaTeX; I don't know how).

V is the span of the 3 matrices written below. W is the set of 2x3 matrices whose entries satisfy the rule a+e=c+f.

\[V=span(\begin{pmatrix} 1 &1 &1 \\ 1 &3 &7 \end{pmatrix},\begin{pmatrix} 0 &0 &0 \\ 1 &1 &1 \end{pmatrix},\begin{pmatrix} 0 &0 &0 \\ 0 &2 &3 \end{pmatrix})\]

\[W=(\begin{pmatrix} a &b &c \\ e &f &g \end{pmatrix}|a+e=c+f)\]

For each set, I need to determine whether it is a vector subspace, and I need to find a basis and the dimension.

For W, I think it is fairly easy to prove it is a vector space (a subspace, assuming $\mathbb{R}^{2\times 3}$ is a vector space, which I am allowed to assume in this example). However, I am not so sure about the dimension. Can I write a=c+f-e and say that dim(W)=5?

My bigger problem is V. I managed to show that a matrix in V has the form:

\[\begin{pmatrix} a &a &a \\ a+b &3a+b+2c &7a+b+3c \end{pmatrix}\]

but I am not sure how to show from here that it is a vector space (I am looking for the 3 rules: the 0 matrix, closure under + and under scalar *). In addition, I don't know what the dimension is (my intuition says dim(V)=3).

Can you assist, please?

Thank you in advance!
 
Yankel said:
...I couldn't use the notation {} in latex, don't know how...

Precede the curly braces with a backslash. For example:

\{A,B,C\} gives you $$\{A,B,C\}$$

and

\left\{\frac{1}{2},\frac{1}{3},\frac{1}{4} \right\} gives you $$\left\{\frac{1}{2},\frac{1}{3},\frac{1}{4} \right\}$$
 
Thanks, I'll try to remember it next time.

Any ideas about my question?
 
Some general things that apply:

For starters, the set $\text{Mat}_{m \times n}(F)$ of all $m \times n$ matrices with entries in a field $F$ is always a vector space of dimension $mn$.

A subset $U$ of a vector space $V$ is a subspace if the following 3 statements hold:

1) $u,u' \in U \implies u+u' \in U$ (closure under vector addition)
2) $u \in U, \alpha \in F \implies \alpha u \in U$ (closure under scalar multiplication)
3) $0 \in U$ (this implies $U$ is non-empty; conversely, if $U$ is non-empty, then (2) implies for any $u \in U$ that $-u = (-1)u \in U$, and then (1) implies $u + (-u) = 0 \in U$, so (3) is sometimes replaced with the condition $U \neq \emptyset$)

For your first set, the span of any set automatically guarantees (1) and (2) (that's what span MEANS). So the real question is: is this set linearly independent, and if not, what is a maximal linearly independent subset? It's easiest in this case to proceed backwards. The set consisting of just the 3rd matrix is clearly linearly independent, since it is a non-zero matrix.

The set consisting of the last TWO matrices is linearly independent if and only if the second matrix is not a scalar multiple of the third (or vice versa). Since the 2,1-entry of the second matrix is 1, and the 2,1-entry of the third matrix is 0, the only way the second matrix could be a scalar multiple of the third is if it was the third matrix multiplied by 0. But this would make the second matrix 0, and it is NOT a 0-matrix, so the last two matrices form a linearly independent set.

Now all three matrices form a linearly independent set if and only if the first matrix is NOT a linear combination of the last two.

But ANY linear combination of the last two would have a 0 row as its first row, and the first row of the first matrix is NOT 0. Hence it is a linearly independent set, and thus is a basis for the space it spans, that is: $\text{dim}_{\ F}(V) = 3$.
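
If you want a quick numerical sanity check of that conclusion (just a sketch assuming numpy is available; no substitute for the argument above), you can flatten each spanning matrix into a vector in $\Bbb R^6$ and compute the rank of the stack:

```python
# Numeric sanity check (not a proof): flatten each spanning matrix of V
# into a length-6 vector and compute the rank of the stacked rows.
import numpy as np

M1 = np.array([[1, 1, 1], [1, 3, 7]])
M2 = np.array([[0, 0, 0], [1, 1, 1]])
M3 = np.array([[0, 0, 0], [0, 2, 3]])

stack = np.vstack([M.flatten() for M in (M1, M2, M3)])
print(np.linalg.matrix_rank(stack))  # prints 3: the set is independent
```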

To check if $W$ forms a subspace of $\text{Mat}_{2 \times 3}(\Bbb R)$ (which has dimension 6), let:

$A = \begin{bmatrix}a&b&c\\e&f&g \end{bmatrix}$

$A' = \begin{bmatrix}a'&b'&c'\\e'&f'&g' \end{bmatrix}$

be any two matrices belonging to $W$.

Then:

$A + A' = \begin{bmatrix}a+a'&b+b'&c+c'\\e+e'&f+f'&g+g' \end{bmatrix}$

Now we are given that:

$a+e = c+f$ and $a'+e' = c'+f'$, and we must show that:

$(a+a') + (e+e') = (c+c') + (f+f')$.

If you can do that, then that proves (1). I think that is enough to show you how to proceed with (2) and (3).
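
(If you like, here is a quick numeric spot check of closure under addition, assuming numpy is available; random spot checks are of course no substitute for the algebraic proof:)

```python
# Numeric spot check of closure: build two random matrices
# [[a,b,c],[e,f,g]] with a + e = c + f, add them, and test
# whether the sum satisfies the same relation.
import numpy as np

def random_element_of_W():
    A = np.random.rand(2, 3)
    A[0, 0] = A[0, 2] + A[1, 1] - A[1, 0]  # force a = c + f - e
    return A

A, B = random_element_of_W(), random_element_of_W()
S = A + B
print(np.isclose(S[0, 0] + S[1, 0], S[0, 2] + S[1, 1]))  # True
```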

So if it IS a subspace we know that $1 \leq \text{dim}_{\ \Bbb R}(W) \leq 6$ (since it has non-zero matrices in it, for example the matrix consisting of all 1's).

One way to show it has a certain dimension, is to exhibit a basis with that number of elements. I'll get you started. Two linearly independent matrices in $W$ are:

$E_{12} = \begin{bmatrix}0&1&0\\0&0&0 \end{bmatrix}$

and:

$E_{23} = \begin{bmatrix}0&0&0\\0&0&1 \end{bmatrix}$

A third linearly independent element of $W$ is:

$E_{11} + E_{22} = \begin{bmatrix}1&0&0\\0&1&0 \end{bmatrix}$

This means that $3 \leq \text{dim}_{\ \Bbb R}(W) \leq 6$.

Prove that $\{E_{12},E_{23},E_{11}+E_{22},E_{13}+E_{21}\}$ is likewise linearly independent (note that $E_{13}+E_{21}$ does lie in $W$, since there $a+e = 1 = c+f$).

Since we can find some 2x3 matrices NOT in $W$ (for example $E_{13}+E_{22}$, for which $a+e = 0$ but $c+f = 2$), this narrows the dimension down to either 4 or 5. Can you find a 5th linearly independent matrix?
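
As a side note, the independence of those four matrices can be confirmed numerically; the sketch below (assuming numpy, and writing E13_E21 for $E_{13}+E_{21}$) flattens them and checks that the rank is 4:

```python
# Numeric check that the four matrices above are linearly independent:
# flattening them and computing the rank of the stack gives 4.
import numpy as np

E12     = np.array([[0, 1, 0], [0, 0, 0]])
E23     = np.array([[0, 0, 0], [0, 0, 1]])
E11_E22 = np.array([[1, 0, 0], [0, 1, 0]])  # E11 + E22
E13_E21 = np.array([[0, 0, 1], [1, 0, 0]])  # E13 + E21

stack = np.vstack([M.flatten() for M in (E12, E23, E11_E22, E13_E21)])
print(np.linalg.matrix_rank(stack))  # prints 4
```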
 
Thank you for your detailed answer, it is very helpful. I want to ask something about my view of things.

For the set W, can't I look at the condition a+e=c+f, and make it a=c+f-e, saying that only 1 out of the 6 parameters can be expressed by the others, and thus dim(W)=5?

while a basis is:

\[\begin{pmatrix} 0 &1 &0 \\ 0 &0 &0 \end{pmatrix},\begin{pmatrix} 1 &0 &1 \\ 0 &0 &0 \end{pmatrix},\begin{pmatrix} -1 &0 &0 \\ 1 &0 &0 \end{pmatrix},\begin{pmatrix} 0 &0 &0 \\ 0 &1 &1 \end{pmatrix},\begin{pmatrix} 1 &0 &0 \\ 0 &0 &1 \end{pmatrix}\]

As for V, if I write V as

\[\begin{pmatrix} a &a &a \\ a+b &3a+b+2c &7a+b+3c \end{pmatrix}\]

Can't I just count 3 parameters and assume dim(V)=3 ?
It is the same answer you got, but is this reasoning valid?
 
Yankel said:
...can't I look at the condition a+e=c+f, and make it a=c+f-e... and thus dim(W)=5?... Can't I just count 3 parameters and assume dim(V)=3?...

Re-examine your proposed basis: the last two elements are not even members of $W$! (For the fourth matrix, $a+e = 0$ while $c+f = 1$; for the fifth, $a+e = 1$ while $c+f = 0$.)

Yes, you can rephrase linear independence in terms of "parameters" but you must verify that these parameters lead to linear independence! To see what I mean let's look at $V$:

We have:

$\begin{bmatrix}a&a&a\\a+b&3a+b+2c&7a+b+3c \end{bmatrix} =$

$a\begin{bmatrix}1&1&1\\1&3&7 \end{bmatrix} + b\begin{bmatrix}0&0&0\\1&1&1 \end{bmatrix} + c\begin{bmatrix}0&0&0\\0&2&3 \end{bmatrix}$

Linear independence MEANS that if these 3 sum to the 0-matrix, we MUST have:

$a = b = c = 0$.

In this particular case, that leads to the four equations:

$a = 0$
$a+b = 0$
$3a+b+2c = 0$
$7a+b+3c = 0$

and it is easy to see from equation (1) that equation (2) becomes $b = 0$, and then equation (3) becomes $2c = 0$, so $c = 0$.
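
(A quick symbolic check of those equations, assuming sympy is available, confirms that the only solution is the trivial one:)

```python
# Symbolic check: the system has only the trivial solution
# a = b = c = 0, confirming linear independence.
from sympy import symbols, linsolve

a, b, c = symbols('a b c')
system = [a, a + b, 3*a + b + 2*c, 7*a + b + 3*c]  # each expression = 0
print(linsolve(system, a, b, c))  # {(0, 0, 0)}
```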

To see why we HAVE to check linear independence consider the following example:

$V = \text{span}\left\{\begin{bmatrix}1&1\\1&0 \end{bmatrix}, \begin{bmatrix}1&1\\0&1 \end{bmatrix}, \begin{bmatrix}1&1\\2&-1 \end{bmatrix}\right\}$

Again, we can write a "typical" element of $V$ as:

$\begin{bmatrix}a+b+c&a+b+c\\a+2c&b-c \end{bmatrix}$

which has "3 parameters", but watch closely what happens when we check for linear independence:

We get the 3 equations:

$a+b+c = 0$
$a+2c = 0$
$b-c = 0$

Subtracting (3) from (1), we get:

$a + 2c = 0$, which leads to $c = \dfrac{-a}{2}$

Then equation (3) gives:

$b = \dfrac{-a}{2}$

So, choosing $a = 2$, we get:

$2\begin{bmatrix}1&1\\1&0 \end{bmatrix} - 1\begin{bmatrix}1&1\\0&1 \end{bmatrix} - 1\begin{bmatrix}1&1\\2&-1 \end{bmatrix} = \begin{bmatrix}0&0\\0&0 \end{bmatrix}$

which means that the set is linearly DEPENDENT (we can express one of the matrices as a linear combination of the other two). So you have to be CAREFUL when expressing things via parameters, because there may be hidden relationships that are not immediately apparent.
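
Here is a quick numeric confirmation of that dependence relation (a sketch assuming numpy):

```python
# Verify the dependence relation 2*M1 - M2 - M3 = 0 found above, and
# compute the rank, showing the span has dimension 2 rather than 3.
import numpy as np

M1 = np.array([[1, 1], [1, 0]])
M2 = np.array([[1, 1], [0, 1]])
M3 = np.array([[1, 1], [2, -1]])

print(2*M1 - M2 - M3)  # the zero matrix
stack = np.vstack([M.flatten() for M in (M1, M2, M3)])
print(np.linalg.matrix_rank(stack))  # prints 2, not 3
```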

Yes, with $W$ it turns out that you may freely choose $a,b,c,e$ and $g$ and then $f$ is determined, but you should be able to express this fact in terms of linear independence. The easiest way to do this is choose ONE of $a,b,c,e,g$ to be 1, and all the others 0, and show that that results in a basis. But it can be done in MANY ways, because bases aren't unique.
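
That construction can be sketched numerically too (assuming numpy; the constraint $a+e=c+f$ forces $f = a+e-c$):

```python
# Sketch of the construction above: set one of the free parameters
# a, b, c, e, g to 1 and the rest to 0, let the constraint determine f,
# and check that the five resulting matrices are linearly independent.
import numpy as np

basis = []
for k in range(5):
    a, b, c, e, g = (1 if i == k else 0 for i in range(5))
    f = a + e - c  # forced by a + e = c + f
    basis.append(np.array([[a, b, c], [e, f, g]]))

stack = np.vstack([M.flatten() for M in basis])
print(np.linalg.matrix_rank(stack))  # prints 5, so dim(W) = 5
```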

The reason I stress this is as a sanity check: "shortcuts" can lead to mistakes. Definitions exist for a reason: not to make things HARD for you, but to keep you from making unwarranted assumptions that can lead you astray.
 
