Pivot Variables in Reduced Row Echelon Form

In summary, when solving a system of equations using reduced row echelon form, we focus on the pivot variables because their columns are linearly independent of the other columns in the matrix. Counting the pivots tells us how many variables are basic and how many are free, and hence the true dimension of the space. Systems with more equations than unknowns may turn out to be inconsistent, but when a solution exists, the pivot variables are the natural ones to solve for.
  • #1
vanmaiden
Why do we solve for only the pivot variables when we are trying to solve a system of equations using reduced row echelon form?

Thank you
 
  • #2
vanmaiden said:
Why do we solve for only the pivot variables when we are trying to solve a system of equations using reduced row echelon form?

Thank you

Hey vanmaiden.

The simple explanation is that the pivot variables end up telling us what is independent and what is not.

When we get a pivot variable, we are essentially finding a column that has independence from the other columns in the matrix. If this happens in a square matrix and there is a pivot in every column, then we know that the matrix is invertible and for a system Ax = b, this means that x has a unique solution.

If this is not the case (pivot in every column) it means that there are dependent variables somewhere. When this is the case for a square matrix, it means that there are infinitely many solutions for a particular Ax=b, or possibly even no solutions at all for an inconsistent system depending on what the b vector is.

If we however take a general matrix and reduce it and find that it has a pivot count less than the number of columns, then it means that the column vectors are linearly dependent and a basis for their span can be written in terms of a smaller number of vectors.

This is important when we want to figure out the dimension of a system, and it's used in many ways including in Rank-Nullity, and describing how to actually construct a proper basis from a set of vectors (and find the real dimension of the space).
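The relationship between pivots, rank, and independent columns described above can be checked directly. As a sketch using sympy (assuming it's available), `Matrix.rref` returns both the reduced form and the indices of the pivot columns; the matrix here is a made-up example whose third column is the sum of the first two:

```python
from sympy import Matrix

# Made-up example: the third column is the sum of the first two,
# so only two columns are independent.
A = Matrix([[1, 0, 1],
            [0, 1, 1],
            [1, 1, 2]])

R, pivot_cols = A.rref()  # reduced row echelon form and pivot column indices
print(pivot_cols)         # (0, 1): pivots in the first two columns only
print(A.rank())           # 2: number of pivots = dimension of the column space
```

Since the pivot count (2) is less than the number of columns (3), the columns are linearly dependent, and a basis for the column space needs only two vectors.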
 
  • #3
chiro said:
Hey vanmaiden.

The simple explanation is that the pivot variables end up telling us what is independent and what is not.

If we however take a general matrix and reduce it and find that it has a pivot count less than the number of columns, then it means that the column vectors are linearly dependent and a basis for their span can be written in terms of a smaller number of vectors.

I'm still somewhat confused, but not as much as I was before. I can now see the presence of standard basis vectors when we put a matrix into reduced row echelon form, but I don't see how their independence from the rest of the vectors make their variables the ones that we solve for. I guess what I'm trying to say is I see the word "independence," but I don't know how to connect the independence of a vector to the independence of a variable.
 
  • #4
vanmaiden said:
I'm still somewhat confused, but not as much as I was before. I can now see the presence of standard basis vectors when we put a matrix into reduced row echelon form, but I don't see how their independence from the rest of the vectors make their variables the ones that we solve for. I guess what I'm trying to say is I see the word "independence," but I don't know how to connect the independence of a vector to the independence of a variable.

Well in terms of independence, I refer to linear independence. A set of vectors is linearly independent if none of them can be expressed as a linear combination of the others. When they are linearly dependent, at least one of them can be written as such a combination.

Now in terms of the pivot variables: when there are fewer pivots than columns, the non-pivot (free) variables can be chosen arbitrarily, and the pivot variables are then determined by them, i.e. linearly dependent on the free variables.

As an example, let's look at a few simple systems and their reduced forms:

[1 2 3 4] [0]
[2 4 6 8] [0] which reduces to

[1 2 3 4] [0]
[0 0 0 0] [0]

This translates into x + 2y + 3z + 4w = 0. Here there is one pivot variable, x, which depends on the three free variables y, z, and w (x = -2y - 3z - 4w). For this system, every row is a multiple of the vector (1,2,3,4). Now let's look at the next example:

[1 2 3 4] [0]
[2 4 6 7] [0] which reduces to

[1 2 3 0] [0]
[0 0 0 1] [0]

This means we have two linearly independent rows, (1,2,3,0) and (0,0,0,1), with two pivots. The second row forces w = 0, and the first gives x + 2y + 3z = 0, so y and z are free and the pivot variable x depends on them.

So in the first one we had one pivot with only one independent vector; the second had two pivots with two independent vectors, and the pattern continues from there.

If there is a pivot in every column (even when there are more rows than columns), then the columns form a full set of independent vectors equal in number to the number of columns; in a square matrix, this is a basis.

The thing is that when there are more rows than columns, you need to watch for inconsistent systems: with more equations than unknowns, a solution exists only if the extra equations are compatible with the rest.
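The two examples above can be verified with sympy's `rref`, which returns the reduced matrix together with the pivot column indices (a sketch, assuming sympy is available):

```python
from sympy import Matrix

# First example: the second row is twice the first, leaving one pivot.
A1 = Matrix([[1, 2, 3, 4],
             [2, 4, 6, 8]])
R1, piv1 = A1.rref()
print(piv1)  # (0,): a single pivot, in column 0

# Second example: the rows are not proportional, leaving two pivots.
A2 = Matrix([[1, 2, 3, 4],
             [2, 4, 6, 7]])
R2, piv2 = A2.rref()
print(piv2)  # (0, 3): pivots in columns 0 and 3, matching the reduced form above
```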
 
  • #5
chiro said:
This translates into x + 2y + 3z + 4w = 0. Here there is one pivot variable, x, which depends on the three free variables y, z, and w.

y and z are free and the pivot variable x depends on them.

Do we pick the pivot variable (in that case, "x") because it seems the easiest to use when finding the solution due to its independence from the rest of the column vectors?
 
  • #6
vanmaiden said:
Do we pick the pivot variable (in that case, "x") because it seems the easiest to use when finding the solution due to its independence from the rest of the column vectors?

You can pick the pivot variable if you want to; it doesn't really matter, but it would make sense at least for consistency.

The main point is to understand how the number of pivots, relative to the number of columns in the matrix, determines the number of independent and dependent variables, and subsequently the number of independent vectors, which is probably the more important aspect.

Knowing these gives you the reduction of the entire system, and it indirectly tells you (if you wish to know) how to represent any row that was completely zeroed out as a linear combination of the rows that weren't.

The ones that get zeroed out (all zeroes) are vectors that are a linear combination of the ones that haven't been zeroed out.

If you only get one non-zero vector it means that everything is just a multiple of that vector.

So for solving systems: if your system contains linearly dependent vectors, extract the independent ones, and then, if the system is still consistent, represent the infinite number of solutions by a parameter. Geometrically, instead of everything meeting at a single point (a unique solution), something is parallel somewhere, which is what produces the infinite family of solutions.
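As a sketch of that parameterization, sympy's `linsolve` returns the solution set of a dependent system with the free variables left as parameters; the system below is a made-up example of two proportional equations (the same plane written twice):

```python
from sympy import Matrix, linsolve, symbols

x, y, z = symbols('x y z')

# Made-up example: two proportional equations (the same plane twice),
# so the solutions form a two-parameter family rather than a single point.
aug = Matrix([[1, 2, 3, 6],
              [2, 4, 6, 12]])

sol = linsolve(aug, x, y, z)
print(sol)  # the pivot variable x is expressed in terms of the free y and z
```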
 
  • #7
chiro said:
You can pick the pivot variable if you want to, it doesn't really matter but it would make sense at least for consistency.

I just find it somewhat strange that you can choose which variable is the independent variable and which one is the dependent variable after putting a matrix in RREF. I mean it seems like such a choice would be more defined and rigid rather than at one's whim. Am I viewing this correctly?
 
  • #8
vanmaiden said:
I just find it somewhat strange that you can choose which variable is the independent variable and which one is the dependent variable after putting a matrix in RREF. I mean it seems like such a choice would be more defined and rigid rather than at one's whim. Am I viewing this correctly?

Sorry, when I said dependent variables I meant like in the situation where we had x + y = 0 where one variable is independent and the other is dependent.

I didn't mean about the vectors themselves in the RREF which are either independent or dependent. Again sorry for the confusion.
 

What are pivot variables in reduced row echelon form?

Pivot variables in reduced row echelon form correspond to the columns of a matrix that contain the leading entry (the first non-zero entry) of some row. These variables are the basic variables of a system of linear equations: when every column has a pivot, they determine a unique solution, and otherwise they are expressed in terms of the free variables.

How do you identify pivot variables in reduced row echelon form?

To identify pivot variables, look for the columns that contain the leading entry of some row. These columns contain the pivot variables. In reduced row echelon form, each leading entry equals 1, while all other entries in its column are 0.

Why are pivot variables important in reduced row echelon form?

Pivot variables are important because they are the basic variables of a system of linear equations. They can be read off directly, or expressed in terms of the free variables, which simplifies the solution process.

How do you use pivot variables to solve a system of linear equations?

In row echelon form, use back substitution: start with the bottom row, solve for its pivot variable, then move up row by row, substituting the values already found until you reach the top. In reduced row echelon form the elimination is already complete, so each pivot variable can be read off directly in terms of the constants and any free variables.
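As a sketch of this in practice (assuming sympy is available), reducing the augmented matrix of a made-up invertible system leaves the solution sitting in the last column, with no back substitution needed:

```python
from sympy import Matrix

# Augmented matrix for a made-up system with a unique solution:
#   x + y + z = 6,  2y + 5z = -4,  2x + 5y - z = 27
aug = Matrix([[1, 1, 1, 6],
              [0, 2, 5, -4],
              [2, 5, -1, 27]])

R, pivots = aug.rref()
print(pivots)    # (0, 1, 2): a pivot in every coefficient column
print(R[:, -1])  # the last column of the RREF is the solution (x, y, z)
```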

Can a system of linear equations have more than one pivot variable?

Yes. A system of linear equations can have many pivot variables, but at most one per row and at most one per column, because each leading entry must be the first non-zero entry of its row and must stand alone in its column in reduced form. Inconsistency is a separate issue: if a pivot appears in the augmented (constants) column, the system has no solution.
