Verifying if Vector b is in Span of Vectors v1 and v2

charmedbeauty

Homework Statement

Is vector b in the span of vectors v1, v2? Give reasons.



Homework Equations

The Attempt at a Solution

v1=(1,4,0,-1)

v2=(2,7,-2,-3)

b= (-1,-1,7,4)

Set up the augmented matrix (v1, v2 | b):

$$\left(\begin{array}{cc|c} 1 & 2 & -1 \\ 4 & 7 & -1 \\ 0 & -2 & 7 \\ -1 & -3 & 4 \end{array}\right)$$

and after row reduction I have

$$\left(\begin{array}{cc|c} 1 & 2 & -1 \\ 0 & -1 & 3 \\ 0 & 0 & 6 \\ 0 & 0 & 6 \end{array}\right)$$

so the system has no solution,

since 0x + 0y = 6,

and therefore b is not in the span of v1, v2.

My row reduction steps are:

R4= R4+R1
R2=R2-4R1


R4=R4+R2
R3=R3+R1


...


is this right?
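
As a cross-check on the row reduction, you can compare matrix ranks numerically: b lies in span{v1, v2} exactly when appending b to the coefficient matrix does not increase its rank. A minimal numpy sketch of that test, using the vectors above:

```python
import numpy as np

# Vectors from the problem statement.
v1 = np.array([1, 4, 0, -1], dtype=float)
v2 = np.array([2, 7, -2, -3], dtype=float)
b  = np.array([-1, -1, 7, 4], dtype=float)

A  = np.column_stack([v1, v2])      # 4x2 coefficient matrix
Ab = np.column_stack([v1, v2, b])   # 4x3 augmented matrix

# b is in span{v1, v2} iff the rank does not grow when b is appended.
print(np.linalg.matrix_rank(A))     # 2
print(np.linalg.matrix_rank(Ab))    # 3, so the system is inconsistent
```

Since the ranks differ, the system is inconsistent, matching the negative conclusion above (though not the exact numbers in the reduced matrix; see the reply below).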
 
The general result (negative) is correct. However, I get a different result numerically. Specifically, R3 = R3 + R1 seems wrong, because that gives you a 1 in the first column of R3, which you do not want.
 
voko said:
The general result (negative) is correct. However, I get a different result numerically. Specifically, R3 = R3 + R1 seems wrong, because that gives you a 1 in the first column of R3, which you do not want.

Ohh, thanks, I did not see that.
 
A more fundamental approach is that a vector b is in the span of vectors v1 and v2 if and only if there exist scalars, x and y, such that b = xv1 + yv2 (that's the definition of 'span').

That is, there must exist x and y such that x(1, 4, 0, -1) + y(2, 7, -2, -3) = (x + 2y, 4x + 7y, -2y, -x - 3y) = (-1, -1, 7, 4), so that we must have
x + 2y = -1
4x + 7y = -1
-2y = 7
-x - 3y = 4.

Of course, what you are doing is using the 'augmented' matrix to try to solve those equations. But in this simple case, you can recognize that the third equation, -2y = 7, gives y = -7/2, and so the fourth equation becomes -x + 21/2 = 4, so that x = 21/2 - 4 = (21 - 8)/2 = 13/2. Now put those values of x and y into the first and second equations to see if they satisfy them: x + 2y = -1 becomes 13/2 + (-14/2) = -1/2, NOT -1, so we can stop.
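
For completeness, the same four equations can be handed to a computer algebra system; a short sympy sketch (illustrative, with the scalars named x and y as above) confirms that no solution exists:

```python
from sympy import symbols, linsolve

x, y = symbols('x y')

# The four equations from b = x*v1 + y*v2, written as expressions equal to 0.
eqs = [x + 2*y + 1,       # x + 2y  = -1
       4*x + 7*y + 1,     # 4x + 7y = -1
       -2*y - 7,          # -2y     =  7
       -x - 3*y - 4]      # -x - 3y =  4

print(linsolve(eqs, x, y))  # EmptySet -> no (x, y) exists, b is not in the span
```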
 