Testing for linear combinations using matrices instead of vectors

Summary:
To determine if the matrix w = (1,0;0,1) is a linear combination of the matrices v1 = (1,2;-2,1) and v2 = (3,2;-1,1), one can treat the matrices as vectors by reshaping them into a vector format. This allows the use of familiar methods for solving linear combinations, such as setting up an augmented matrix with v1 and v2 as columns alongside w. The goal is to find coefficients that satisfy the equation w = c1*v1 + c2*v2. Additionally, when determining how w can be expressed as a combination of multiple vectors, one can create a system of equations from the augmented matrix and solve for the coefficients. This approach simplifies the process of finding linear combinations in matrix form.
mitch_1211
I want to see if the matrix w = (1,0;0,1) is a linear combination of the matrices
v1 = (1,2;-2,1) and v2 = (3,2;-1,1) where ; denotes a new line in the matrix.

I know, for example, that if w and the v's were 1×n matrices, i.e. vectors, such as w = [1,1,1],
v1 = [2,-1,3], v2 = [1,1,2], then I set up a matrix with v1 and v2 as the columns and augment it with the vector w. Then I row reduce to see if there is a solution. If there is a solution, then w is a linear combination of v1 and v2.
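(For concreteness, here's a quick numerical sketch of that check in Python with numpy; using a least-squares solve and testing the residual is just one way to see whether the augmented system is consistent, not something from the thread itself.)

```python
import numpy as np

# The vectors from the example above.
w  = np.array([1.0, 1.0, 1.0])
v1 = np.array([2.0, -1.0, 3.0])
v2 = np.array([1.0, 1.0, 2.0])

# Build the coefficient matrix with v1 and v2 as columns,
# i.e. the system  c1*v1 + c2*v2 = w.
A = np.column_stack([v1, v2])

# Least-squares always returns some c; w is a linear combination
# exactly when A @ c reproduces w.
c, *_ = np.linalg.lstsq(A, w, rcond=None)
is_combination = np.allclose(A @ c, w)
print(is_combination)  # False: this particular w is not in span{v1, v2}
```

(For these three vectors the first two equations force c1 = 0, c2 = 1, which fails the third equation, so the system is inconsistent.)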

I don't know where to start in terms of putting the matrices v1 and v2 together and augmenting with w in order to get a system to solve...

Maybe I am going about it the wrong way? Is there a different method rather than using
w = c1v1 + c2v2 + c3v3 + ... + cnvn and setting up a matrix?

Any help much appreciated

Mitch
 
Hi mitch_1211! :smile:

The thing is, at this level there is no real difference between matrices and vectors. It's only the fact that matrices have a multiplication that makes matrices different from vectors.
What do I mean by this? Well, there is a 1-1 correspondence

$$\left(\begin{array}{cc} a & b\\ c & d\end{array}\right)\leftrightarrow (a,b,c,d)$$

which preserves addition and scalar multiplication. Thus for what you want to do, you can treat your matrices as vectors. For example, the matrix

$$\left(\begin{array}{cc} 1 & 0\\ 0 & 1\end{array}\right)$$

will correspond to the vector (1,0,0,1), and so on. And once you've translated the matrix problem into a vector problem, you can apply all the methods you're familiar with!

NOTE: maybe you'll want more rigorous language than what I described here. The point is that

$$\phi:M_2(\mathbb{R})\rightarrow \mathbb{R}^4:\left(\begin{array}{cc} a & b\\ c & d\end{array}\right)\rightarrow (a,b,c,d)$$

is an isomorphism of vector spaces. Thus linear dependence in ##M_2(\mathbb{R})## (= the 2×2 matrices) can be completely translated to linear dependence in ##\mathbb{R}^4##.
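(As a quick sanity check of this flattening trick for the matrices in the original question — a sketch using numpy; `reshape(-1)` is just one way to realize the isomorphism ##\phi##, and the residual test is my addition, not part of the post.)

```python
import numpy as np

# The matrices from the original question, flattened via the
# isomorphism above: each 2x2 matrix becomes a vector in R^4.
w  = np.array([[1, 0], [0, 1]], dtype=float).reshape(-1)   # (1,0,0,1)
v1 = np.array([[1, 2], [-2, 1]], dtype=float).reshape(-1)  # (1,2,-2,1)
v2 = np.array([[3, 2], [-1, 1]], dtype=float).reshape(-1)  # (3,2,-1,1)

# Now it's an ordinary vector problem: solve c1*v1 + c2*v2 = w.
A = np.column_stack([v1, v2])
c, *_ = np.linalg.lstsq(A, w, rcond=None)
exact = np.allclose(A @ c, w)
print(exact)
```

For these particular matrices the system turns out to be inconsistent (the top-right entries force c1 = -c2 while the bottom-right entries force c1 + c2 = 1), so this prints False: the identity matrix is not a linear combination of v1 and v2.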
 
micromass said:
The thing is, at this level there is no real difference between matrices and vectors.

Thank you for getting back to me so quickly! I was thinking that I could probably do that, but I wasn't 100% sure the vectors wouldn't get screwed up. That makes it very simple now!

Also, I had one more question: when I create a matrix A whose columns are vectors that span a vector space, I know that when I reduce this matrix, the columns with the leading ones will correspond to the original vectors in A that form a basis for the vector space.

I remember (I think!) that there was a similar way to find out how a vector is a linear combination of other vectors. I.e. if I have found that w is a combination of v1, v2, v3, v4, is there a way, similar to the basis method, that will tell me how w can be written in terms of the vectors v1, v2, v3, v4?
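(A small illustration of the leading-ones idea mentioned above — a hand-rolled row reduction in plain Python with exact Fraction arithmetic; the function name and the example vectors are my own, not from the thread.)

```python
from fractions import Fraction

def pivot_columns(rows):
    """Row-reduce a matrix (given as a list of rows) and return the
    indices of its pivot columns, using exact Fraction arithmetic."""
    m = [[Fraction(x) for x in row] for row in rows]
    nrows, ncols = len(m), len(m[0])
    pivots, r = [], 0
    for c in range(ncols):
        # Find a row at or below r with a nonzero entry in column c.
        pr = next((i for i in range(r, nrows) if m[i][c] != 0), None)
        if pr is None:
            continue  # no leading one in this column
        m[r], m[pr] = m[pr], m[r]
        piv = m[r][c]
        m[r] = [x / piv for x in m[r]]
        # Clear column c in every other row.
        for i in range(nrows):
            if i != r and m[i][c] != 0:
                f = m[i][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
        if r == nrows:
            break
    return pivots

# Columns: v1 = (1,2), v2 = (2,4), v3 = (0,1).  Since v2 = 2*v1, the
# pivot columns pick out v1 and v3 as a basis for the span.
print(pivot_columns([[1, 2, 0],
                     [2, 4, 1]]))  # → [0, 2]
```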
 
mitch_1211 said:
I remember (I think!) that there was a similar way to find out how a vector is a linear combination of other vectors. I.e. if I have found that w is a combination of v1, v2, v3, v4, is there a way, similar to the basis method, that will tell me how w can be written in terms of the vectors v1, v2, v3, v4?

Well, you'll need to find a, b, c, d such that

$$w = av_1 + bv_2 + cv_3 + dv_4$$

This corresponds to a system of equations, which can be written as the augmented matrix

$$(v_1~v_2~v_3~v_4~|~w)$$

If you reduce this matrix and then transform it back into a system and solve it, you'll get the values of a, b, c, and d.

Maybe a small example. Take w = (1,3), v1 = (1,4), v2 = (3,5). We need to find a and b such that

$$(1,3) = a(1,4) + b(3,5)$$

This corresponds to a system

$$\left\{\begin{array}{c}a+3b=1\\ 4a+5b=3\end{array}\right.$$

Writing this system as a matrix, we get

$$\left(\begin{array}{cc|c} 1 & 3 & 1\\ 4 & 5 & 3\\ \end{array}\right)$$

(we see that this matrix has the form ##(v_1~v_2~|~w)##). Reducing this matrix and solving the associated system gives you values for a and b.
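(Carrying the example through numerically — a quick check with numpy, which is my addition; `solve` works here because the coefficient matrix is square and invertible.)

```python
import numpy as np

# The system  a + 3b = 1,  4a + 5b = 3  from the worked example,
# i.e. the augmented matrix (v1 v2 | w) with the bar dropped.
A = np.array([[1.0, 3.0],
              [4.0, 5.0]])
w = np.array([1.0, 3.0])

a, b = np.linalg.solve(A, w)  # unique solution since det(A) = -7 != 0
print(a, b)  # a = 4/7, b = 1/7
```

So w = (4/7)v1 + (1/7)v2; checking: (4/7)(1,4) + (1/7)(3,5) = (4/7 + 3/7, 16/7 + 5/7) = (1, 3).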
 
micromass said:
Reducing this matrix and solving the associated system gives you values for a and b.

Of course. How could I not have realized that! The whole point of setting up the (v1, v2, ..., vn | w) system is to solve for the coefficients. I was so wrapped up in the simple question of "is there a solution or not?" that it totally slipped my mind that the solutions are actually the coefficients!

Thank you so much for taking the time to explain all this to me, much appreciated.

Mitch
 