Can elementary matrix operations change the solutions to a system of equations?

SUMMARY

The discussion centers on the impact of elementary matrix operations on the solutions of a system of equations. Specifically, it explains that operations such as replacing a row with the sum of itself and a constant multiple of another row do not alter the solution set of the system. This is because these operations maintain the equivalence of the equations represented by the matrix. The conversation highlights the importance of understanding row operations in Gaussian elimination and their role in achieving row echelon form without changing the underlying solutions.

PREREQUISITES
  • Understanding of matrix representation of systems of equations
  • Familiarity with Gaussian elimination and row echelon form
  • Knowledge of elementary row operations in matrix algebra
  • Basic algebraic manipulation of equations
NEXT STEPS
  • Study the properties of elementary row operations in linear algebra
  • Learn about the implications of matrix multiplication on solution sets
  • Explore the concept of invertible matrices and their role in transformations
  • Practice solving systems of equations using Gaussian elimination with various examples
USEFUL FOR

Students of linear algebra, educators teaching matrix operations, and anyone looking to deepen their understanding of systems of equations and their solutions through matrix methods.

opus
To my understanding, a matrix is just a way of representing a system of equations in an organized format.
So for example, if we have some system of equations, we can get them into standard form, and translate them into what's known as an augmented matrix. This is similar to using synthetic division for dividing polynomials.

Now one rule for elementary matrix operations is that:
Some row, ##R_j##, can be replaced with the sum of itself and a constant multiple of another row ##R_i##, denoted ##(cR_i + R_j)##.

Now my question is, why doesn't this change the solutions to the system of equations?

Take for example the matrix in my attached picture. We are multiplying ##R_1## by -5, and then adding that result to ##R_2##. Rows 1 and 3 remain the same as when we started, but Row 2 has changed in a way that is not a multiple of itself in the given matrix. I understand that this process is to get the matrix into row echelon form so that we can perform Gaussian Elimination, but I don't understand why our solutions haven't changed now that we have a different multiple of Row 2.
 

Attachments

  • Screen Shot 2018-07-24 at 4.15.16 PM.png
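A minimal numerical sketch of the question (a sketch only: the matrix in the attachment isn't reproduced here, so the entries below are invented, chosen with ##a_{11}=1## and ##a_{21}=5## so that ##-5R_1+R_2## knocks out the leading entry of row 2). Solving the system before and after the row operation gives the same solution:

```python
import numpy as np

# Invented 3x3 system A x = b (a stand-in for the matrix in the attachment).
A = np.array([[1., 2., 1.],
              [5., 3., 2.],
              [2., 1., 3.]])
b = np.array([4., 10., 6.])

x_before = np.linalg.solve(A, b)          # solution of the original system

# Row operation R2 <- -5*R1 + R2 applied to the augmented matrix [A | b].
aug = np.hstack([A, b.reshape(-1, 1)])
aug[1] = -5 * aug[0] + aug[1]

x_after = np.linalg.solve(aug[:, :3], aug[:, 3])

print(x_before)                           # [1. 1. 1.]
print(x_after)                            # [1. 1. 1.] -- unchanged
print(np.allclose(x_before, x_after))     # True
```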
opus said:
To my understanding, a matrix is just a way of representing a system of equations in an organized format.
Yes, in the sense that it can be viewed that way; no, in the sense that it is not the only way to view what's going on.
So for example, if we have some system of equations, we can get them into standard form, and translate them into what's known as an augmented matrix. This is similar to using synthetic division for dividing polynomials.

Now one rule for elementary matrix operations is that:
Some row, ##R_j##, can be replaced with the sum of itself and a constant multiple of another row ##R_i##, denoted ##(cR_i + R_j)##.

Now my question is, why doesn't this change the solutions to the system of equations?
If the ##i-##th row is the equation ##f_i(x_1,\ldots , x_n)=b_i## and likewise ##f_j##, can you show, that a certain set of numbers ##a_k## for the ##x_k## which satisfies ##f_i(a_k)=b_i## and ##f_j(a_k)=b_j## also satisfies ##(\alpha f_i + \beta f_j)(a_k) = \alpha f_i(a_k) + \beta f_j(a_k)= \alpha b_i + \beta b_j\,?##
 
fresh_42 said:
If the ##i##-th row is the equation ##f_i(x_1,\ldots , x_n)=b_i## and likewise ##f_j##, can you show, that a certain set of numbers ##a_k## for the ##x_k## which satisfies ##f_i(a_k)=b_i## and ##f_j(a_k)=b_j##

This would just be the intersection of ##f_i## and ##f_j##. So ##a_k## could be said to be something like ##(x,y,z)##

fresh_42 said:
also satisfies ##(\alpha f_i + \beta f_j)(a_k) = \alpha f_i(a_k) + \beta f_j(a_k) = \alpha b_i + \beta b_j\,?##

A little lost on this part.
It looks like you're adding the solutions of each equation together?
 
opus said:
Now my question is, why doesn't this change the solutions to the system of equations?
Because you're adding equal quantities to both sides of the equation you modify.

Here's a simple example, using a system of equations:
##2x + y = 5##
##x + y = 3##

If I replace the first equation by itself plus -2 times the second equation, I get a new system:
##0x - y = -1##
##x + y = 3##

Note that writing the zero coefficient in the first equation isn't strictly necessary; it's included only to keep the variables lined up.

From the first equation, it is easy to see that y = 1. By back-substituting this value into the second equation, it's also easy to see that x = 2, so the solution to the system is the point (2, 1). Notice that this pair of numbers satisfies the first system and the altered second system.

As an alternative, I could replace the second equation by itself plus the first equation, obtaining the system
##0x - y = -1##
##x + 0y = 2##
So again, the solution is the point (2, 1).

Working with matrices, you're doing the same sorts of operations, but don't have to keep track of the variables. That's really the only difference.

Keep in mind there are three row operations that result in an equivalent system (i.e., one with the same solution(s)).
  1. Swapping two rows: ##R_i## <--> ##R_j##
  2. Replacing a row by a nonzero multiple of itself: ##R_i## <-- ##kR_i##
  3. Replacing a row by itself plus a nonzero multiple of another row: ##R_i## <-- ##R_i + k R_j##
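As a quick numerical check of the example above (nothing new here, just the system ##2x + y = 5##, ##x + y = 3## with solution ##(2, 1)##), each of the three operations can be applied to the augmented matrix and the solution recomputed:

```python
import numpy as np

aug = np.array([[2., 1., 5.],
                [1., 1., 3.]])           # augmented matrix [A | b]

def solution(m):
    """Solve the 2x2 system encoded in an augmented matrix."""
    return np.linalg.solve(m[:, :2], m[:, 2])

print(solution(aug))                     # [2. 1.]

swapped = aug[[1, 0]]                    # 1. R1 <--> R2
scaled = aug.copy()
scaled[0] = 3 * scaled[0]                # 2. R1 <-- 3*R1
replaced = aug.copy()
replaced[0] = replaced[0] - 2 * aug[1]   # 3. R1 <-- R1 - 2*R2 (the step above)

for m in (swapped, scaled, replaced):
    print(solution(m))                   # [2. 1.] every time
```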
 
Let's make an example. Say we have the equations ##x+y=1## and ##x-y=2\,.## These two represent the solution ##x=\frac{3}{2}## and ##y=-\frac{1}{2}\,.## Now if we make e.g. ##4\cdot (i) + 3\cdot (ii)## out of it, we get ##7x+y=10\,.## You have asked why this still carries the same solution, but
$$
4\cdot \left( \frac{3}{2} - \frac{1}{2} \right) + 3\cdot \left( \frac{3}{2} + \frac{1}{2} \right) = 4\cdot 1 + 3 \cdot 2 = 10
$$
is still correct. So if we replaced the variables by the numbers they finally will have, nothing changes: we only add and multiply true equations, so there is no chance of creating something wrong. Another question is: do we still have the same information? And here's where @symbolipoint's "good equations" came in. E.g. we can replace the first equation by our new one, ##4(i)+3(ii)##, but we must not simultaneously replace the second equation by, say, ##-4(i)-3(ii)##. In that case we would have created a "bad equation", because we would basically have replaced both original equations by the same new one. This loses information and would not be allowed, which is why one has to be careful with such replacements.
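To see the "bad equation" warning numerically, here is a small sketch using the same two equations ##x+y=1## and ##x-y=2##: replacing only the first row keeps full rank (the information is preserved), while replacing both rows at once by ##4(i)+3(ii)## and ##-4(i)-3(ii)## drops the rank, i.e. loses information:

```python
import numpy as np

aug = np.array([[1.,  1., 1.],     # x + y = 1
                [1., -1., 2.]])    # x - y = 2

good = aug.copy()
good[0] = 4 * aug[0] + 3 * aug[1]  # only row 1 replaced: 7x + y = 10

bad = np.vstack([ 4 * aug[0] + 3 * aug[1],
                 -4 * aug[0] - 3 * aug[1]])  # both rows replaced

print(np.linalg.matrix_rank(good[:, :2]))        # 2 -> still a unique solution
print(np.linalg.matrix_rank(bad[:, :2]))         # 1 -> information lost
print(np.linalg.solve(good[:, :2], good[:, 2]))  # [ 1.5 -0.5], as above
```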
 
Let's try the (somewhat abbreviated) text version:
  1. 'equation' means that the expression at the left of the '=' sign is the same (has the same value) as the expression at the right of the '='
  2. an operation, if performed on both sides of the equation, does not invalidate the equation

Therefore, multiplying the first equation in your example by -5 does not change its validity.
It also follows that adding something to both sides of the second equation does not alter its validity either.
The multiplier for the first equation is astutely chosen so that the result, upon addition to the second equation, cancels a coefficient to zero.

Cheers,
Tom
 
Mark44 said:
Because you're adding equal quantities to both sides of the equation you modify.

So let me ask this. Is the reason that we multiply by a multiple of another row, and add that to the row we want to change, because the numbers in the matrices represent variable coefficients? So say I wanted to change ##a_{21}## from 5 to 1. I cannot simply subtract 4 from everything in the row, because those numbers are variable coefficients. It would be like saying ##5x-4=1x##, which is not true.
To change ##a_{21}## from 5 to 1, I would need to multiply another row by a factor that, when added to row 2, changes the 5 to a 1.
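A tiny numerical illustration of that idea (again with invented rows, since the actual matrix is in the attachment): the multiplier ##c## is chosen so that ##c \cdot a_{11} + a_{21}## lands on the value you want.

```python
import numpy as np

row1 = np.array([1., 2., 1., 4.])    # hypothetical R1 of the augmented matrix
row2 = np.array([5., 3., 2., 10.])   # hypothetical R2, with a_21 = 5

print(-4 * row1 + row2)   # first entry becomes 1  (c = -4)
print(-5 * row1 + row2)   # first entry becomes 0  (c = -5, the echelon-form choice)
```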
 
So let me just isolate one equation of the matrix that we'll say is of row 2.
This row consists of numbers, and these numbers are the coefficients of the variables in its respective equation in the system of equations.
In a normal equation, I can add or multiply whatever I want, as long as it's done on both sides.

In the case of the matrix, it is the same except for one aspect. Every number on the LHS of the matrix is being multiplied by some variable. Since this is the case, I can't just add some number to it, even if I do it on both sides of the equation. To add to the LHS, what really needs to be done is to find a multiple of another equation in the matrix and add that to the row we want to change. In doing this, we are adding a multiple of ##x## to ##x##, a multiple of ##y## to ##y##, and a multiple of ##z## to ##z##.
 
Yes to both of your above posts.
 
Ok then! Thanks!
 
If you are given the equation ##3x = 7##, why do the solutions not change if you multiply through by ##\frac{1}{3}##? The exact same reason explains the answer to your matrix question. That is, performing a row operation is the same as left-multiplying the matrix by an invertible matrix, so it does not change the solutions.
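A small sketch of that last point (reusing the invented matrix from earlier in the thread): the row operation ##R_2 \leftarrow R_2 - 5R_1## is the same as left-multiplying by the elementary matrix ##E## below, and ##E## is invertible, so ##Ax=b## and ##(EA)x = Eb## have exactly the same solutions.

```python
import numpy as np

E = np.array([[ 1., 0., 0.],     # elementary matrix encoding R2 <- R2 - 5*R1
              [-5., 1., 0.],
              [ 0., 0., 1.]])

A = np.array([[1., 2., 1.],
              [5., 3., 2.],
              [2., 1., 3.]])
b = np.array([4., 10., 6.])

print(np.linalg.det(E))               # 1.0 (nonzero, so E is invertible)
print(np.linalg.solve(A, b))          # [1. 1. 1.]
print(np.linalg.solve(E @ A, E @ b))  # [1. 1. 1.] -- same solutions
```

Any finite sequence of row operations corresponds to a product of such elementary matrices, which is again invertible, so the solution set never changes.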
 
