Linearly Independent Equations => 1 solution?

Homework Help Overview

The discussion revolves around proving that three linearly independent equations in three variables yield a unique solution. Participants explore the implications of linear independence and the geometric interpretation of equations as planes in three-dimensional space.

Discussion Character

  • Conceptual clarification, Assumption checking, Mixed

Approaches and Questions Raised

  • Some participants suggest using contradiction to prove the uniqueness of the solution, while others question the geometric interpretation of planes and their intersections. There is also discussion about the applicability of Gaussian elimination in a theoretical context.

Discussion Status

The conversation is ongoing, with various interpretations and approaches being explored. Participants are questioning definitions and the implications of linear independence, as well as the limitations of methods like Gaussian elimination in a theoretical framework.

Contextual Notes

There is a recognition that the problem is theoretical, as specific equations are not provided. This leads to discussions about the nature of linear independence and the requirements for proving the existence of a unique solution.

nickadams

Homework Statement



Prove that if you have 3 linearly independent equations in 3 variables, then there exists only 1 solution to the system.

Homework Equations



Linear independence means that none of the equations can be expressed as a linear combination of the other equations.

The Attempt at a Solution



Having 3 linearly independent equations in 3 variables means that if we viewed the equations as vectors, none of the vectors would lie in the plane spanned by the other two. So any one of the vectors crosses the plane spanned by the other two at only one point.

But how to prove this?



Thanks
 
why not say each solution represents a plane and no two planes are parallel hence they intersect in only one point

but do a contradiction proof where you assume that they have more than one solution and then show that leads to a contradiction.
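The suggested contradiction argument can be sketched in a few lines (a sketch, assuming the system is written in matrix form $A\mathbf{x} = \mathbf{b}$):

Suppose $\mathbf{x}_1 \neq \mathbf{x}_2$ are both solutions of $A\mathbf{x} = \mathbf{b}$. Then
$$A(\mathbf{x}_1 - \mathbf{x}_2) = A\mathbf{x}_1 - A\mathbf{x}_2 = \mathbf{b} - \mathbf{b} = \mathbf{0},$$
so the nonzero vector $\mathbf{d} = \mathbf{x}_1 - \mathbf{x}_2$ satisfies $\mathbf{r}_i \cdot \mathbf{d} = 0$ for each row $\mathbf{r}_i$ of $A$. But three linearly independent rows span $\mathbb{R}^3$, so $\mathbf{d}$ would be orthogonal to every vector in $\mathbb{R}^3$, including itself; then $\mathbf{d} \cdot \mathbf{d} = 0$ forces $\mathbf{d} = \mathbf{0}$, a contradiction. (Existence of at least one solution still has to be argued separately, e.g. from the rank of $A$.)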
 
jedishrfu said:
why not say each solution represents a plane and no two planes are parallel hence they intersect in only one point
Do you mean "each equation represents a plane..."?
It's possible to have three planes with none being parallel to any of the others, yet they intersect in a line.

Here's a very simple example (each equation represents a plane in R3):
x + y = 0
2x - y = 0
x - y = 0
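Mark44's three planes can be checked numerically; this sketch (using NumPy, not part of the original discussion) shows that the coefficient rows are dependent and that a whole line of solutions exists:

```python
import numpy as np

# Mark44's example: three planes in R^3, no two of them parallel.
A = np.array([[1.0,  1.0, 0.0],   # x + y = 0
              [2.0, -1.0, 0.0],   # 2x - y = 0
              [1.0, -1.0, 0.0]])  # x - y = 0

# The rows are NOT linearly independent: the rank is 2, not 3.
rank = np.linalg.matrix_rank(A)
print(rank)  # 2

# Every point (0, 0, t) on the z-axis satisfies all three equations,
# so the planes intersect in a line rather than a single point.
for t in (-1.0, 0.0, 2.5):
    print(np.allclose(A @ np.array([0.0, 0.0, t]), 0.0))  # True
```

So "no two planes parallel" is strictly weaker than linear independence of the rows, which is exactly the point of the counterexample.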
 
So how would you suggest to approach the problem, Mark44?

... Or anyone else?
 
nickadams said:
So how would you suggest to approach the problem, Mark44?

... Or anyone else?

If it were me I would just do Gaussian elimination on the system of equations.

RGV
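As a concrete (and entirely made-up) instance of this suggestion, here is a numeric system with three linearly independent equations solved by elimination; NumPy's `solve` uses an LU factorization, i.e. Gaussian elimination, under the hood:

```python
import numpy as np

# A made-up system whose rows are linearly independent:
#   x +  y + z =  6
#       2y + 5z = -4
#   2x + 5y - z = 27
A = np.array([[1.0, 1.0,  1.0],
              [0.0, 2.0,  5.0],
              [2.0, 5.0, -1.0]])
b = np.array([6.0, -4.0, 27.0])

# A nonzero determinant confirms the rows are linearly independent,
# so exactly one solution exists.
print(np.linalg.det(A))  # nonzero

x = np.linalg.solve(A, b)  # Gaussian elimination via LU factorization
print(x)  # [ 5.  3. -2.]  -- the unique solution
```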
 
What precisely do you mean when you say the equations are linearly independent and when you say "if we viewed the equations as vectors"?
 
Ray Vickson said:
If it were me I would just do Gaussian elimination on the system of equations.

RGV

But would that work if I wanted to prove it for a general augmented coefficient matrix representing a system of 3 linear equations in 3 variables, where no equation is a linear combination of the others?

vela said:
What precisely do you mean when you say the equations are linearly independent and when you say "if we viewed the equations as vectors"?

When I say the equations are linearly independent, I mean the rows of the augmented coefficient matrix cannot be expressed as linear combinations of each other. And when I said "if we view the equations as vectors" I was just trying to get a way to visualize the rows of the coefficient matrix and why their being linearly independent would imply a single solution.


Thanks
 
Ray Vickson said:
If it were me I would just do Gaussian elimination on the system of equations.

nickadams said:
But would that work if I wanted to prove it for a general augmented coefficient matrix representing a system of 3 linear equations in 3 variables, where no equation is a linear combination of the others?
What I'm getting now is that this is more of a theoretical problem. You are not actually given the three equations - is that correct?

If so, then obviously you can't do Gaussian elimination or row reduction on the rows of your augmented matrix.

What would have to be true, though, is that if you write your system of three equations as a 3 X 4 augmented matrix, and reduce it to reduced, row-echelon form, the left part of your matrix must be the 3 X 3 identity matrix.
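Mark44's criterion can be illustrated on a made-up system (a sketch using SymPy's `rref`; the numbers are invented for illustration):

```python
from sympy import Matrix, eye

# Augmented 3x4 matrix of a made-up system whose coefficient rows
# are linearly independent:
#   x + y + z = 6,   2y + 5z = -4,   2x + 5y - z = 27
aug = Matrix([[1, 1,  1,  6],
              [0, 2,  5, -4],
              [2, 5, -1, 27]])

rref, pivot_cols = aug.rref()

# The left 3x3 block reduces to the identity, and the last column
# is then the unique solution (x, y, z) = (5, 3, -2).
print(rref[:, :3] == eye(3))  # True
print(list(rref[:, 3]))       # [5, 3, -2]
```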
 
Mark44 said:
What I'm getting now is that this is more of a theoretical problem. You are not actually given the three equations - is that correct?

If so, then obviously you can't do Gaussian elimination or row reduction on the rows of your augmented matrix.

What would have to be true, though, is that if you write your system of three equations as a 3 X 4 augmented matrix, and reduce it to reduced, row-echelon form, the left part of your matrix must be the 3 X 3 identity matrix.

Of course you can "do" Gaussian elimination on a purely symbolic system. That is one of the ways in which some general linear algebra theorems are proved, or some algorithms are implemented.

RGV
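Ray's point can be demonstrated directly: SymPy will run the elimination on a fully symbolic 3×3 system (a sketch; the symbol names are arbitrary):

```python
from sympy import Matrix, symbols, simplify, zeros

# Fully symbolic coefficient matrix and right-hand side.
a11, a12, a13, a21, a22, a23, a31, a32, a33 = symbols(
    'a11 a12 a13 a21 a22 a23 a31 a32 a33')
b1, b2, b3 = symbols('b1 b2 b3')

A = Matrix([[a11, a12, a13],
            [a21, a22, a23],
            [a31, a32, a33]])
b = Matrix([b1, b2, b3])

# LUsolve performs Gaussian elimination and back-substitution
# symbolically; the result is valid whenever the pivots (and hence
# det(A)) are nonzero, i.e. whenever the rows are linearly independent.
x = A.LUsolve(b)

# Substituting the symbolic solution back recovers b exactly.
residual = simplify(A * x - b)
print(residual == zeros(3, 1))  # True
```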
 
nickadams said:
When I say the equations are linearly independent, I mean the rows of the augmented coefficient matrix cannot be expressed as linear combinations of each other. And when I said "if we view the equations as vectors" I was just trying to get a way to visualize the rows of the coefficient matrix and why their being linearly independent would imply a single solution.
You have to be a little careful here. For example, the augmented matrix
$$\left(\begin{array}{cc|c}
1 & 0 & 1 \\
1 & 0 & 2
\end{array}\right)$$ has linearly independent rows, but there's no solution because the system is inconsistent.

What you really want is for the rows of the matrix representing the coefficients to be linearly independent.
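vela's warning can be verified numerically (a quick NumPy sketch of the 2-variable example above):

```python
import numpy as np

# vela's example: the augmented rows (1 0 | 1) and (1 0 | 2) are
# linearly independent, yet the system x = 1, x = 2 is inconsistent.
aug = np.array([[1.0, 0.0, 1.0],
                [1.0, 0.0, 2.0]])
coeff = aug[:, :2]  # coefficient matrix without the last column

rank_aug = np.linalg.matrix_rank(aug)      # 2: augmented rows independent
rank_coeff = np.linalg.matrix_rank(coeff)  # 1: coefficient rows dependent
print(rank_aug, rank_coeff)

# rank(coeff) < rank(aug) is exactly the condition for inconsistency
# (Rouche-Capelli), so independence of the *coefficient* rows is the
# hypothesis that matters.
```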
 
