# Homework Help: Linearly Independent Equations => 1 solution?

1. Oct 3, 2012

1. The problem statement, all variables and given/known data

Prove that if you have 3 linearly independent equations in 3 variables, then there exists only 1 solution to the system.

2. Relevant equations

linear independence implies none of the equations can be expressed as a linear combination of the other equations.

3. The attempt at a solution

Having 3 linearly independent equations in 3 variables means that if we viewed the equations as vectors, none of the vectors would lie in the plane spanned by the other two. So any one of the vectors crosses the plane spanned by the other two at only one point.

But how to prove this?

Thanks

2. Oct 3, 2012

### Staff: Mentor

Why not say each solution represents a plane, and since no two planes are parallel, they intersect in only one point?

But do a proof by contradiction: assume the system has more than one solution, and then show that this leads to a contradiction.

3. Oct 3, 2012

### Staff: Mentor

Do you mean "each equation represents a plane..."?
It's possible to have three planes with none being parallel to any of the others, yet they intersect in a line.

Here's a very simple example (each equation represents a plane in R^3):

```
x + y = 0
2x - y = 0
x - y = 0
```
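As a quick sanity check on this example, numpy's `matrix_rank` shows that the three coefficient rows are linearly dependent as vectors in R^3, even though no two of the planes are parallel:

```python
import numpy as np

# Coefficient rows of the three planes above (z-coefficient is 0 in each).
A = np.array([[1,  1, 0],   # x + y = 0
              [2, -1, 0],   # 2x - y = 0
              [1, -1, 0]])  # x - y = 0

# Rank 2, not 3: the rows are linearly dependent, so the planes
# intersect in a line (here, the z-axis) rather than a single point.
print(np.linalg.matrix_rank(A))  # → 2
```

This is why "no two planes are parallel" is weaker than linear independence of the rows.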

4. Oct 4, 2012

So how would you suggest to approach the problem, Mark44?

... Or anyone else?

Last edited: Oct 4, 2012
5. Oct 4, 2012

### Ray Vickson

If it were me I would just do Gaussian elimination on the system of equations.
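For instance, sympy will happily row-reduce a fully symbolic system. A small sketch of the idea on a generic 2×2 system (sympy treats the symbolic pivots as nonzero, which is exactly the independence hypothesis):

```python
from sympy import symbols, Matrix, Rational

a, b, c, d, e, f = symbols('a b c d e f')

# Generic system  a x + b y = e,  c x + d y = f,  as an augmented matrix.
aug = Matrix([[a, b, e],
              [c, d, f]])

# Symbolic row reduction: sympy assumes the pivots (a and a*d - b*c)
# are nonzero, i.e. that the coefficient rows are independent.
R, pivots = aug.rref()

# Left block is the 2x2 identity; the last column is the unique
# solution (Cramer's rule falls out of the elimination).
print(R)
```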

RGV

Last edited: Oct 4, 2012
6. Oct 4, 2012

### vela

Staff Emeritus
What precisely do you mean when you say the equations are linearly independent and when you say "if we viewed the equations as vectors"?

7. Oct 4, 2012

But would that work if I wanted to prove it for a general augmented coefficient matrix representing a system of 3 linear equations in 3 variables, where no equation is a linear combination of the others?

When I say the equations are linearly independent, I mean the rows of the augmented coefficient matrix cannot be expressed as linear combinations of each other. And when I said "if we view the equations as vectors" I was just trying to find a way to visualize the rows of the coefficient matrix and why their being linearly independent would imply a single solution.

Thanks

8. Oct 4, 2012

### Staff: Mentor

What I'm getting now is that this is more of a theoretical problem. You are not actually given the three equations - is that correct?

If so, then obviously you can't do Gaussian elimination or row reduction on the rows of your augmented matrix.

What would have to be true, though, is that if you write your system of three equations as a 3 X 4 augmented matrix and reduce it to reduced row-echelon form, the left part of your matrix must be the 3 X 3 identity matrix.
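A concrete illustration of this (my own example system, using sympy's `rref`): for three independent equations, row reduction of the augmented matrix leaves the identity on the left and the unique solution in the last column.

```python
from sympy import Matrix

# Example system with independent rows:
#   x +  y + z =  6
#       2y + 5z = -4
#  2x + 5y - z = 27
aug = Matrix([[1, 1, 1,  6],
              [0, 2, 5, -4],
              [2, 5, -1, 27]])

R, pivots = aug.rref()

# Left 3x3 block is the identity; last column is the unique solution
# (x, y, z) = (5, 3, -2).
print(R)  # → Matrix([[1, 0, 0, 5], [0, 1, 0, 3], [0, 0, 1, -2]])
```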

9. Oct 4, 2012

### Ray Vickson

Of course you can "do" Gaussian elimination on a purely symbolic system. That is one of the ways in which some general linear algebra theorems are proved, or some algorithms are implemented.

RGV

10. Oct 4, 2012

### vela

Staff Emeritus
You have to be a little careful here. For example, the augmented matrix
$$\left(\begin{array}{cc|c} 1 & 0 & 1 \\ 1 & 0 & 2 \end{array}\right)$$ has linearly independent rows, but there's no solution because the system is inconsistent.

What you really want is for the rows of the matrix representing the coefficients to be linearly independent.
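This distinction can be checked with numpy on the example above: the rows of the augmented matrix are independent (rank 2), but the coefficient rows alone have rank 1, and a rank mismatch between the coefficient and augmented matrices means the system is inconsistent.

```python
import numpy as np

# vela's example: x = 1 and x = 2 (second variable unused).
aug = np.array([[1, 0, 1],
                [1, 0, 2]])
A = aug[:, :2]  # coefficient part only

# Augmented rows are independent, coefficient rows are not:
print(np.linalg.matrix_rank(aug))  # → 2
print(np.linalg.matrix_rank(A))    # → 1, so no solution exists
```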