Best method to solve simultaneous equations?

Discussion Overview

The discussion revolves around the methods for solving simultaneous equations, particularly focusing on the use of matrices versus traditional elimination techniques. Participants explore the efficiency, applicability, and complexity of various approaches, including matrix inversion and Gaussian elimination.

Discussion Character

  • Debate/contested
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • Some participants express frustration with the matrix method for solving simultaneous equations, finding it less intuitive and more time-consuming than elimination techniques.
  • Others argue that matrix methods, particularly matrix algebra, are more efficient for larger systems of equations compared to manual elimination methods.
  • One participant highlights the existence of multiple matrix methods, including finding the inverse, Gaussian elimination, LU decomposition, and QR decomposition, each with its own advantages and complexities.
  • Another participant mentions that while finding the inverse can be tedious, it is beneficial when solving multiple equations with the same matrix but different constants.
  • Some participants suggest that understanding matrix inversion is foundational for grasping more advanced techniques in solving systems of equations.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the superiority of one method over another. There are competing views on the efficiency and applicability of matrix methods versus traditional elimination techniques, with some advocating for the former and others for the latter.

Contextual Notes

Participants note that the choice of method may depend on the specific context of the problem, such as whether the equations are encountered once or repeatedly. There are also mentions of the potential for mistakes when using manual methods, which may influence preferences.

Who May Find This Useful

This discussion may be useful for students and practitioners in mathematics or engineering who are exploring different methods for solving simultaneous equations and considering the trade-offs between various techniques.

rollcast
I am learning to solve simultaneous equations with matrices, but they make less sense to me and take more time than rearranging the formulae to eliminate one variable and then repeating for the others.

I'm thinking of giving up on the matrix method, but maybe I've overlooked some aspect of it?
 
No, you probably haven't overlooked anything. Solving a system of equations using a matrix is exactly the same as solving it by elimination and so on; it's just a more compact representation.
 
rollcast said:
I am learning to solve simultaneous equations with matrices, but they make less sense to me and take more time than rearranging the formulae to eliminate one variable and then repeating for the others.

I'm thinking of giving up on the matrix method, but maybe I've overlooked some aspect of it?

I can guarantee you that solving systems of linear equations with matrix algebra is FAR more efficient than with simple elimination by hand.

Try this: Solve the following system of equations:


3x-y+z-w=0
2x-w+z=2
4x-5y-z=-1
x+y-w=3


a) With substitution or elimination (Your preference)
b) With matrices

Time yourself. It's ok if you give up 20 mins in part a. You're also more likely to make mistakes.

Moral of the post: Matrices exist for a very good reason; they're easy to compute with and faster too. Especially when you have 3+ equations in 3+ unknowns.
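For anyone who wants to check part (b), here is a minimal sketch of the matrix approach to the system above using NumPy's library solver (the thread itself does the work by hand; the code and its structure are my own illustration):

```python
import numpy as np

# Coefficient matrix for the unknowns (x, y, z, w), one row per equation:
#   3x -  y + z - w =  0
#   2x      + z - w =  2
#   4x - 5y - z     = -1
#    x +  y     - w =  3
A = np.array([
    [3, -1,  1, -1],
    [2,  0,  1, -1],
    [4, -5, -1,  0],
    [1,  1,  0, -1],
], dtype=float)
b = np.array([0, 2, -1, 3], dtype=float)

# Exact solution: x = -10, y = -8, z = 1, w = -21
x, y, z, w = np.linalg.solve(A, b)
print(x, y, z, w)
```

Once the equations are written as `Ax = b`, the size of the system barely matters to the solver; that is the compactness the matrix notation buys.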
 
What matrix method? There are several: Finding the inverse of the matrix, some form of elimination technique, some form of matrix decomposition, etc.

Your technique of elimination is essentially Gaussian elimination, but without the benefits of Gaussian elimination. LU decomposition is a step above Gaussian elimination, both in terms of time consumption and stability. QR decomposition is a bit more expensive computationally but has advantages in terms of stability and reusability. There are many other techniques, several of them quite sophisticated because there are lots of ways to get into trouble in solving simultaneous equations.
 
DivisionByZro said:
I can guarantee you that solving systems of linear equations with matrix algebra is FAR more efficient than with simple elimination by hand.

Try this: Solve the following system of equations:


3x-y+z-w=0
2x-w+z=2
4x-5y-z=-1
x+y-w=3


a) With substitution or elimination (Your preference)
b) With matrices

Time yourself. It's ok if you give up 20 mins in part a. You're also more likely to make mistakes.

Moral of the post: Matrices exist for a very good reason; they're easy to compute with and faster too. Especially when you have 3+ equations in 3+ unknowns.

I clocked 15 minutes on that and probably made a mistake. Touché! Haha.
To be honest, when I wrote my response I had assumed that the OP wasn't working with systems of 4 equations.
 
I'm using the inverse method.

I think I can see the benefit of it now. Is there another matrix method other than taking the inverse? I think finding the inverse is what's confusing me.
 
If it's a one-shot deal (i.e., you are given a set of equations that you need to solve, never to see anything like them again) then finding the inverse is a bit much. Elimination is about 3x faster. On the other hand, if you are going to see several different sets of equations, all with the same left-hand side (the same coefficient matrix) but different right-hand sides, you can compute the inverse once and then reuse it. Now knowing the inverse is a big plus.

Knowing how matrix inversion works is also a first step toward understanding those more advanced techniques. Those more advanced techniques will only make sense if you know the basics of matrix manipulations.

It's a bit like learning to do derivatives using the epsilon-delta formulation. You need that formulation to truly understand differentiation. Once you understand it you can pretty much forget it -- until you need to do numerical differentiation of some unknown function. Then it comes in pretty handy.
 
By the way, in applications that give rise to equations like "Ax = b", the matrix A tends to be "structural", that is, it depends on basic properties of the problem, while b tends to depend on the particular case. Typically, then, one has to solve large numbers of equations having the same "A" but different "b"s. While finding [itex]A^{-1}[/itex] may be tedious, you only have to do it once and then multiply that same inverse by the various b vectors.
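That reuse pattern can be sketched in a few lines of NumPy (the matrix and right-hand sides below are made up purely for illustration): invert A once, then each new b costs only a cheap matrix-vector multiply.

```python
import numpy as np

# A fixed "structural" matrix A, inverted once up front.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
A_inv = np.linalg.inv(A)

# Many different right-hand sides b, each solved by one multiply.
for b in (np.array([1.0, 0.0]),
          np.array([0.0, 1.0]),
          np.array([5.0, 5.0])):
    x = A_inv @ b
    print(x)

# In numerical practice an LU factorization (e.g. scipy.linalg.lu_factor
# and lu_solve) is usually preferred over an explicit inverse: it offers
# the same reuse across right-hand sides with better accuracy.
```

For A above, b = (5, 5) gives x = (1, 1), since A times (1, 1) is exactly (5, 5).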
 