# Find a basis for R2 that includes the vector (1,2)

1. Nov 12, 2017

• Member warned that the homework template is not optional
My answer was [ (1,2), (0,-1) ]

I just wanted to make sure with you guys that you could essentially have an infinite number of answers.

For example:

[ (1,2), (0,1) ]

[ (1,2), (0, -5) ]

etc would all suffice, right?
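For anyone who wants to check candidate pairs like these numerically, here is a small sketch (not part of the original exchange, using numpy): each pair is a basis of R2 exactly when the 2×2 determinant is non-zero.

```python
import numpy as np

# Each candidate pairs (1, 2) with a second vector; the pair is a basis
# of R^2 exactly when the determinant of the 2x2 matrix is non-zero.
candidates = [(0, -1), (0, 1), (0, -5)]
for v in candidates:
    M = np.array([[1, 2], list(v)], dtype=float)
    det = np.linalg.det(M)
    print(v, det, abs(det) > 1e-12)
```

All three determinants come out non-zero (-1, 1, and -5), confirming that each pair is a valid basis.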

2. Nov 12, 2017

### Orodruin

Staff Emeritus
As long as you do not put further requirements on the basis (such as orthogonality), yes.

3. Nov 12, 2017

Thank you for the quick reply.

4. Nov 25, 2017

### scottdave

As long as the other vector is nonzero and not a scalar multiple of (1,2) (scalar multiples such as (-1,-2) or (3,6) would not work), you will have two linearly independent vectors, which can be used to describe any point in R2.

5. Nov 25, 2017

### WWGD

Actually, EDIT: any non-zero vector orthogonal to a vector v is linearly independent of it.

Last edited: Nov 25, 2017
6. Nov 25, 2017

### Staff: Mentor

You might want to restrict "any vector" a bit. The zero vector is orthogonal to every other vector in whatever is the space of interest, but the zero vector can't be among a set of linearly independent vectors.

7. Nov 25, 2017

### Orodruin

Staff Emeritus
The OP clearly had the right idea from the beginning and marked the thread resolved two weeks ago ...

While true after the edit, nothing in the OP's question requires orthogonality. It is also slightly misleading, as it can be read to imply that orthogonality is a requirement for linear independence; it is not. The OP had the right idea from the beginning.

8. Nov 25, 2017

### WWGD

But orthogonality guarantees linear independence, and this works in higher dimensions, i.e., it generalizes, which may not happen when you try to guess in a higher number of dimensions: can you extend, e.g., $\{ (1,0,1,0), (0,2,3,1) \}$ into a basis for $\mathbb R^4$ without using orthogonality? Besides, I stated orthogonality as a sufficient condition, and I cannot be held responsible if someone takes it as a necessary one. EDIT: For a more formal treatment of the result: https://en.wikipedia.org/wiki/Fundamental_theorem_of_linear_algebra
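The orthogonal-complement approach described here can be sketched numerically (this sketch is mine, not from the thread): the rows of $V^T$ in an SVD that correspond to zero singular values span the orthogonal complement of the row space, so they supply the two missing basis vectors.

```python
import numpy as np

# Extend {(1,0,1,0), (0,2,3,1)} to a basis of R^4 by picking two vectors
# in the orthogonal complement of their span.
A = np.array([[1, 0, 1, 0],
              [0, 2, 3, 1]], dtype=float)

# The last rows of Vt (beyond the rank of A) are an orthonormal basis of
# the null space of A, i.e. the orthogonal complement of the row space.
_, s, Vt = np.linalg.svd(A)
perp = Vt[2:]

# All four vectors together form a basis: the determinant is non-zero.
B = np.vstack([A, perp])
print(np.linalg.det(B))
```

By construction the two new rows are orthogonal to both given vectors, so linear independence is automatic, which is exactly the guarantee claimed above.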

9. Nov 26, 2017

### Orodruin

Staff Emeritus
Yes, I can do that essentially without thinking or computing anything. The appropriate check is that the determinant of the matrix with the components of vector k in row k is non-zero.

10. Nov 26, 2017

### WWGD

EDIT: And how do you decide whether the determinant is non-zero without thinking or computing?
Is that a significant improvement ( computationally or otherwise) over making sure you choose a pair of vectors in $\{(1,0,1,0), (0,2,3,1) \} ^{\perp}$?
I just don't see the reason for opposing my suggested approach.

11. Nov 26, 2017

### Orodruin

Staff Emeritus
Well, apart from the fact that the OP had already solved this problem two weeks ago and has not been online for a week, it is cumbersome and ineffective if all you are looking for is a basis. You can simply take two linearly independent vectors spanning a subspace such that the projections of your given vectors onto its orthogonal complement form a basis of that complement. In the case of your example, (0,0,1,0) and (0,0,0,1) will do fine: they span the subspace of vectors of the form (0,0,z,w), and the projections of your vectors onto the orthogonal complement of that subspace clearly form a basis of it. Done.

If you would want to check by computing the determinant, it would be 2.
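As a sanity check of that determinant claim (a quick numerical aside, not part of the thread): stacking the two given vectors with (0,0,1,0) and (0,0,0,1) does give determinant 2.

```python
import numpy as np

# The matrix whose rows are the two given vectors plus the two chosen
# completion vectors; its determinant should be 2, as stated above.
B = np.array([[1, 0, 1, 0],
              [0, 2, 3, 1],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
print(round(np.linalg.det(B)))  # → 2
```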

12. Nov 26, 2017

### WWGD

Well, first of all, these posts are also intended to be a repository of knowledge so that anyone reading may learn something, and not just immediate answers for the OP. Still, if I understood correctly, you are ultimately using orthogonal projections, so what is the difference then? And the computation of the determinant may be obvious and clear to you, but not necessarily so for everyone. Ditto for the "obviously orthogonal". Ultimately, my method guarantees existence and gives a specific procedure, not just heuristics. Essentially, the obviousness you believe intrinsic to your approach may not be so for every user. It is fine to state and believe your approach is better, but to claim there is some objectivity to it would require, IMHO, stronger arguments/support.
EDIT: At any rate, you can have the last word if you wish to, I am not interested in pursuing this any further.

13. Nov 26, 2017

### Orodruin

Staff Emeritus
I cannot see that you have given a method, merely a statement that you want the additional vectors to be orthogonal. That, if anything, is heuristic. Besides, there is no guarantee that your method works in a general vector space, where the existence of a metric is not guaranteed (take the space of polynomials of degree three and lower as an example). Simply selecting vectors outside the subspace spanned by the given vectors works regardless of the vector space. I just do not see any reason to bring up looking for an orthogonal basis here. I find it much simpler to just take the vectors (1,0,0,0), (0,1,0,0), (0,0,1,0), and (0,0,0,1) - then you include them from the left (or right) while discarding any vector that is linearly dependent on the ones you have already included and the original ones.
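The greedy procedure just described can be sketched in a few lines (the function name and rank-based independence test are my own illustration, not from the post): sweep through the standard basis vectors and keep each one only if it is independent of everything collected so far.

```python
import numpy as np

# Greedy extension: start from the given vectors, sweep through the
# standard basis e_1, ..., e_n, and keep each e_i only if it increases
# the rank, i.e. is linearly independent of what we already have.
def extend_to_basis(given, n):
    basis = [np.asarray(v, dtype=float) for v in given]
    for i in range(n):
        e = np.zeros(n)
        e[i] = 1.0
        candidate = basis + [e]
        if np.linalg.matrix_rank(np.vstack(candidate)) == len(candidate):
            basis = candidate
    return basis

basis = extend_to_basis([(1, 0, 1, 0), (0, 2, 3, 1)], 4)
print([list(v) for v in basis])
```

For the example in the thread this keeps e_1 and e_2 and discards e_3 and e_4, yielding a basis of four vectors with no orthogonality needed.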

14. Nov 26, 2017

### WWGD

And you want the vectors to not be in the span, and then you make reference to orthogonal projection as a means of verifying the correctness of your choice, while decrying the lack of generality of my use of metrics, while making reference to orthogonality as a means of generating said basis. What gives then? Do you choose to use orthogonality or not? EDIT: Yes, I did assume we would be working within $\mathbb R^n$, and, yes, my method does assume an inner product, which does restrict the generality.
The value of the determinant being 2 is not obvious to me. Maybe I am missing something obvious, not sure. And good luck figuring out when the determinant of a collection of abstract objects, polynomials, etc., is non-zero.

15. Nov 26, 2017

### Orodruin

Staff Emeritus
The orthogonal complement in my approach is not a necessity. It is sufficient to pick a basis for any subspace such that no non-zero vector in it can be spanned by the given vectors.

16. Nov 26, 2017

### WWGD

But how can you tell the vector is not in the span? Of course, if a vector is not in the span, it is linearly independent from the basis vectors, but how do you choose such a vector in a more-or-less algorithmic way, e.g., with polynomials or other non-numerical objects, EDIT: or even within $\mathbb R^n$ for n large enough, other than by finding the determinant, which again may be difficult to use with non-numerical "vectors", or may become unwieldy for a large value of $n$? EDIT 2: I don't see how to extend your idea of using the $e_i$ and removing linearly dependent vectors to general abstract spaces.

17. Nov 26, 2017

### Orodruin

Staff Emeritus
You check whether or not it can be written as a linear combination. This is generally not a difficult task, which can typically be accomplished by showing that no non-trivial linear combination of all the basis vectors is zero. This will generally boil down to the determinant condition given some other basis. Consider the polynomials $p_1 = x^2 + 2x - 1$ and $p_2 = x^2 + x + 3$. Now take the subspace spanned by the monomial $x^2$. To write $x^2$ as a linear combination $a p_1 + b p_2$ would require $a + b = 1$ for the $x^2$ coefficient, together with
$$2a + b = 0, \quad -a + 3b = 0$$
for the $x$ and constant coefficients. The last two equations force $a = b = 0$, contradicting $a + b = 1$, so the system is impossible to satisfy and you are done. Had you not been done, you could have continued with the monomial $x$ and ultimately with the monomial $x^0$.
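The same polynomial check can be done in coordinates (a sketch of mine, working in the monomial basis $(x^2, x, 1)$): the system $a p_1 + b p_2 = x^2$ is solvable iff appending the target column does not raise the rank of the coefficient matrix.

```python
import numpy as np

# Coefficients in the monomial basis (x^2, x, 1).
p1 = np.array([1, 2, -1], dtype=float)     # x^2 + 2x - 1
p2 = np.array([1, 1, 3], dtype=float)      # x^2 + x + 3
target = np.array([1, 0, 0], dtype=float)  # x^2

# 3x2 linear system A @ (a, b) = target; solvable iff appending the
# target column does not increase the rank.
A = np.column_stack([p1, p2])
solvable = bool(
    np.linalg.matrix_rank(np.column_stack([A, target]))
    == np.linalg.matrix_rank(A)
)
print(solvable)  # → False: x^2 is not in the span of p1 and p2
```

Since $x^2$ is not in the span, $\{p_1, p_2, x^2\}$ is linearly independent, matching the hand computation above.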