# Prove <h,k> is in span{a,b} for all h,k

## Homework Statement

If ${\bf a} = \begin{pmatrix} 2 \\ -1 \end{pmatrix}$, ${\bf b} = \begin{pmatrix} 2 \\ 1 \end{pmatrix}$, and ${\bf c} = \begin{pmatrix} h \\ k \end{pmatrix}$, show that ${\bf c} \in \operatorname{span}\left\{ {\bf a},{\bf b} \right\}$ for all $h,k$.

## The Attempt at a Solution

Since this means that c is a linear combination of a and b, I used the definition to create an augmented matrix:
$\left[\begin{array}{cc|c} 2 & 2 & h \\ -1 & 1 & k \end{array}\right]$

I did most of the row reduction on that, then switched to a system of equations:
$$2x_{1} = h - \frac{1}{2}(2k+h), \qquad 2x_{2} = \frac{2k+h}{2}$$

But I have no idea what to do from here. It's clear to me that $\operatorname{span}\{{\bf a},{\bf b}\}$ is a plane, but I don't know how to show that every vector $\langle h,k \rangle$ in $\mathbb{R}^2$ is contained in that plane.

I know it's a stupid question, sorry. This is also the first time I have attempted to latex a matrix...

Mark44
Mentor

Vectors a and b are linearly independent, so they span $\mathbb{R}^2$. Since there are two of them, they form a basis for $\mathbb{R}^2$. That means that any vector in $\mathbb{R}^2$ is some linear combination of a and b.

In your work with the augmented matrix, as long as the left submatrix has no row of zeros, you can solve for the coefficients of a and b that yield c.
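Concretely, carrying the row reduction above to completion gives explicit coefficients. Adding half of row 1 to row 2:

$$\left[\begin{array}{cc|c} 2 & 2 & h \\ -1 & 1 & k \end{array}\right] \sim \left[\begin{array}{cc|c} 2 & 2 & h \\ 0 & 2 & k + \frac{h}{2} \end{array}\right] \quad\Longrightarrow\quad x_{1} = \frac{h - 2k}{4}, \qquad x_{2} = \frac{h + 2k}{4}$$

These satisfy $x_{1}{\bf a} + x_{2}{\bf b} = \begin{pmatrix} h \\ k \end{pmatrix}$ for every $h,k$, which is exactly what was to be shown.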

When you're dealing with two vectors, it's very easy to see whether they're linearly independent. As long as neither is a multiple of the other (which also precludes the possibility that either is the 0 vector), they're linearly independent.
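As a quick numerical sanity check (my own sketch, using Cramer's rule, which the thread itself doesn't invoke): independence of two vectors in the plane is equivalent to a nonzero 2×2 determinant, and the same determinant yields the coefficients for any $\langle h,k \rangle$:

```python
# Sanity check: a and b are independent iff the 2x2 determinant is nonzero,
# and Cramer's rule then gives coefficients x1, x2 with x1*a + x2*b = (h, k).
a = (2, -1)
b = (2, 1)

def det2(u, v):
    """Determinant of the 2x2 matrix with columns u and v."""
    return u[0] * v[1] - u[1] * v[0]

def solve(h, k):
    """Coefficients (x1, x2) such that x1*a + x2*b = (h, k), by Cramer's rule."""
    d = det2(a, b)          # 4, nonzero => a and b are linearly independent
    x1 = det2((h, k), b) / d
    x2 = det2(a, (h, k)) / d
    return x1, x2

# Every (h, k) is reproduced exactly by its coefficients:
for h, k in [(0, 0), (1, 0), (-3, 7), (2.5, -0.5)]:
    x1, x2 = solve(h, k)
    assert abs(x1 * a[0] + x2 * b[0] - h) < 1e-12
    assert abs(x1 * a[1] + x2 * b[1] - k) < 1e-12
```

Since `det2(a, b) = 4` is nonzero, the system has a unique solution for every right-hand side, which is another way of saying the span is all of $\mathbb{R}^2$.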

Mark44
Mentor
Here's some simple LaTeX for a 2 x 2 matrix.
\ begin{bmatrix} 1 & 2 \\ 3 & 4 \ end{bmatrix}

Remove the extra spaces before begin and end to have it render.

Here's how it renders:
$$\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$$

Thank you.

I understand that it is true, because
$${\bf a} \neq \lambda {\bf b}$$
for any real number $\lambda$. How do I "show" that? The book gave a hint, that is, to use the augmented matrix, but I don't see how that's helping.

Mark44
Mentor
I'm not sure it deserves anything very involved - the two vectors obviously point in different directions. You can see by inspection that a is not a multiple of b. As a calculus instructor I had many years ago used to say, "It's obvious to the most casual observer."
He would also say stuff like, "even my own mother could integrate that."

If you feel the need to do some actual "math," show that the cosine of the angle between the vectors is not 1 or -1 (corresponding to angles of 0° or 180°).

$$\cos(\theta) = \frac{{\bf a} \cdot {\bf b}}{|{\bf a}||{\bf b}|}$$
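Plugging in the specific vectors here (a quick check of my own): ${\bf a}\cdot{\bf b} = 2\cdot 2 + (-1)\cdot 1 = 3$ and $|{\bf a}| = |{\bf b}| = \sqrt{5}$, so $\cos(\theta) = 3/5$, which is neither $1$ nor $-1$:

```python
import math

# cos(theta) = (a . b) / (|a| |b|) for a = (2, -1), b = (2, 1)
a = (2, -1)
b = (2, 1)

dot = a[0] * b[0] + a[1] * b[1]                       # 2*2 + (-1)*1 = 3
cos_theta = dot / (math.hypot(*a) * math.hypot(*b))   # 3 / (sqrt(5)*sqrt(5))

# 0.6 is neither 1 nor -1, so the vectors are not parallel,
# hence linearly independent.
assert abs(cos_theta - 0.6) < 1e-9
```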

That's what I'm looking for! Thanks!