Linearly Independent Equations => 1 solution?

SUMMARY

This discussion centers on proving that three linearly independent equations in three variables yield a unique solution. Participants emphasize that linear independence means no equation can be expressed as a linear combination of the others, leading to the conclusion that the corresponding planes intersect at a single point. Gaussian elimination is suggested as a method for demonstrating this property, particularly when analyzing the augmented coefficient matrix. The consensus is that the left part of the reduced row-echelon form of this matrix must be the 3x3 identity matrix to confirm the existence of a unique solution.

PREREQUISITES
  • Understanding of linear independence in vector spaces
  • Familiarity with augmented matrices and row-echelon form
  • Knowledge of Gaussian elimination techniques
  • Basic concepts of linear equations and their geometric interpretations
NEXT STEPS
  • Study the process of Gaussian elimination on augmented matrices
  • Explore the geometric interpretation of linear equations in R3
  • Learn about the implications of linear independence in higher dimensions
  • Investigate the conditions for consistency in systems of linear equations
USEFUL FOR

Students and educators in linear algebra, mathematicians interested in theoretical proofs, and anyone seeking to understand the geometric implications of linear independence in systems of equations.

nickadams

Homework Statement



Prove that if you have 3 linearly independent equations in 3 variables, then there exists only 1 solution to the system.

Homework Equations



Linear independence means that no equation can be expressed as a linear combination of the other equations.

The Attempt at a Solution



Having 3 linearly independent equations in 3 variables means that if we view the equations as vectors, none of the vectors lies in the plane spanned by the other two. So any one of the vectors will cross the plane spanned by the other two at only one point.
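In symbols (a restatement of the setup, not part of the original post), the system can be written as
$$A\mathbf{x} = \mathbf{b}, \qquad A = \begin{pmatrix} \mathbf{r}_1 \\ \mathbf{r}_2 \\ \mathbf{r}_3 \end{pmatrix},$$
where the rows r1, r2, r3 of the coefficient matrix are the ones assumed to be linearly independent.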

But how to prove this?



Thanks
 
why not say each solution represents a plane and no two planes are parallel hence they intersect in only one point

but do a contradiction proof where you assume that they have more than one solution and then show that leads to a contradiction.
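For reference, here is one way that contradiction argument can be filled in (a sketch; the rank step is not spelled out in the thread). Write the system as
$$A\mathbf{x} = \mathbf{b}.$$
If two distinct solutions existed, say
$$A\mathbf{x}_1 = \mathbf{b}, \quad A\mathbf{x}_2 = \mathbf{b}, \quad \mathbf{x}_1 \neq \mathbf{x}_2,$$
then subtracting gives
$$A(\mathbf{x}_1 - \mathbf{x}_2) = \mathbf{0}, \qquad \mathbf{x}_1 - \mathbf{x}_2 \neq \mathbf{0},$$
so A would have a nontrivial null space and rank less than 3, contradicting the linear independence of its three rows, which forces rank 3. (Existence of a solution follows separately, since a rank-3 square coefficient matrix is invertible.)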
 
jedishrfu said:
why not say each solution represents a plane and no two planes are parallel hence they intersect in only one point
Do you mean "each equation represents a plane..."?
It's possible to have three planes with none being parallel to any of the others, yet they intersect in a line.

Here's a very simple example (each equation represents a plane in R3):
x + y = 0
2x - y = 0
x - y = 0
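A quick machine check of this example (an illustrative SymPy script, not from the thread):

# Rank check for the planes x + y = 0, 2x - y = 0, x - y = 0 in R^3.
# The z-column is all zeros, so each equation really is a plane in R^3.
from sympy import Matrix

A = Matrix([
    [1,  1, 0],
    [2, -1, 0],
    [1, -1, 0],
])

print(A.rank())       # 2, not 3: the rows do not span R^3
print(A.nullspace())  # [Matrix([[0], [0], [1]])] -> the planes meet in the z-axis

No two rows are scalar multiples of one another (no two planes are parallel), yet the rank is only 2, so the intersection is a line rather than a point.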
jedishrfu said:
but do a contradiction proof where you assume that they have more than one solution and then show that leads to a contradiction.
 
So how would you suggest to approach the problem, Mark44?

... Or anyone else?
 
nickadams said:
So how would you suggest to approach the problem, Mark44?

... Or anyone else?

If it were me I would just do Gaussian elimination on the system of equations.

RGV
 
What precisely do you mean when you say the equations are linearly independent and when you say "if we viewed the equations as vectors"?
 
Ray Vickson said:
If it were me I would just do Gaussian elimination on the system of equations.

RGV

But would that work if I wanted to prove it for a general augmented coefficient matrix representing a system of 3 linear equations in 3 variables, where the equations are not linear combinations of one another?

vela said:
What precisely do you mean when you say the equations are linearly independent and when you say "if we viewed the equations as vectors"?

When I say the equations are linearly independent, I mean the rows of the augmented coefficient matrix cannot be expressed as linear combinations of each other. And when I said "if we view the equations as vectors" I was just trying to get a way to visualize the rows of the coefficient matrix and why their being linearly independent would imply a single solution.


Thanks
 
Ray Vickson said:
If it were me I would just do Gaussian elimination on the system of equations.

nickadams said:
But would that work if I wanted to prove it for a general augmented coefficient matrix representing a system of 3 linear equations in 3 variables, where the equations are not linear combinations of one another?
What I'm getting now is that this is more of a theoretical problem. You are not actually given the three equations - is that correct?

If so, then obviously you can't do Gaussian elimination or row reduction on the rows of your augmented matrix.

What would have to be true, though, is that if you write your system of three equations as a 3×4 augmented matrix and reduce it to reduced row-echelon form, the left part of your matrix must be the 3×3 identity matrix.
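Spelling out why that suffices (not stated explicitly above): if the reduced row-echelon form of the augmented matrix is
$$\left(\begin{array}{ccc|c}
1 & 0 & 0 & c_1 \\
0 & 1 & 0 & c_2 \\
0 & 0 & 1 & c_3
\end{array}\right),$$
then the system it represents is x = c1, y = c2, z = c3, which is exactly one solution.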
 
Mark44 said:
What I'm getting now is that this is more of a theoretical problem. You are not actually given the three equations - is that correct?

If so, then obviously you can't do Gaussian elimination or row reduction on the rows of your augmented matrix.

What would have to be true, though, is that if you write your system of three equations as a 3×4 augmented matrix and reduce it to reduced row-echelon form, the left part of your matrix must be the 3×3 identity matrix.

Of course you can "do" Gaussian elimination on a purely symbolic system. That is one of the ways in which some general linear algebra theorems are proved, or some algorithms are implemented.

RGV
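A sketch of that idea in SymPy (an illustrative script; the symbol names are made up, and rref's division by symbolic pivots silently assumes they are nonzero, which is exactly what linear independence of the coefficient rows guarantees):

# Row-reduce a fully symbolic 3x4 augmented matrix.
from sympy import Matrix, symbols

a11, a12, a13, b1 = symbols('a11 a12 a13 b1')
a21, a22, a23, b2 = symbols('a21 a22 a23 b2')
a31, a32, a33, b3 = symbols('a31 a32 a33 b3')

M = Matrix([
    [a11, a12, a13, b1],
    [a21, a22, a23, b2],
    [a31, a32, a33, b3],
])

R, pivots = M.rref()
print(pivots)    # (0, 1, 2): a pivot in each coefficient column
print(R[:, :3])  # the 3x3 identity, so the last column is the unique solution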
 
nickadams said:
When I say the equations are linearly independent, I mean the rows of the augmented coefficient matrix cannot be expressed as linear combinations of each other. And when I said "if we view the equations as vectors" I was just trying to get a way to visualize the rows of the coefficient matrix and why their being linearly independent would imply a single solution.
You have to be a little careful here. For example, the augmented matrix
$$\left(\begin{array}{cc|c}
1 & 0 & 1 \\
1 & 0 & 2
\end{array}\right)$$ has linearly independent rows, but there's no solution because the system is inconsistent (the two equations say x = 1 and x = 2).

What you really want is for the rows of the matrix representing the coefficients to be linearly independent.
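This distinction can be checked directly (an illustrative SymPy script, not from the thread):

from sympy import Matrix, linsolve, symbols

x, y = symbols('x y')

# Augmented rows (1 0 | 1) and (1 0 | 2) are independent, yet the
# system x = 1, x = 2 has no solution:
print(linsolve((Matrix([[1, 0], [1, 0]]), Matrix([1, 2])), x, y))  # EmptySet

# With linearly independent *coefficient* rows, the solution is unique:
print(linsolve((Matrix([[1, 0], [0, 1]]), Matrix([1, 2])), x, y))  # {(1, 2)}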
 
