Can I use back substitution to invert a matrix?

  • Thread starter: NewStudent200
  • Tags: Matrix
Summary
Back substitution can be used to invert a triangular matrix. The inverse of a triangular matrix is itself triangular with the same structure, so each entry of the inverse can be found by solving one simple equation at a time, working down from the diagonal. This makes back substitution an efficient, practical way to invert such matrices.
NewStudent200
Hello all,

Say I had n equations in n variables, such that
C1 = C1(f1), C1 is a function of f1 only
C2 = C2(f1, f2), C2 is a function of f1 and f2
...
Cn = Cn(f1, f2, ..., fn), Cn is a function of all n variables

I can calculate the matrix dC/df, where each row is the derivative of Ci with respect to the n variables. The matrix will look something like

a 0 0 0... 0
b c 0 0... 0
...
u v w x... z

Now if I was after the matrix df/dC then am I basically just trying to invert this matrix?

Thanks very much for any help
 
NewStudent200 said:
Now if I was after the matrix df/dC then am I basically just trying to invert this matrix?
Yes, and for a triangular matrix that is fairly simple.
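As a quick numerical sanity check of that relationship, here is a small sketch in Python. The two functions C1, C2 below are made up for illustration (they are not from the thread); the point is only that the Jacobian df/dC is the matrix inverse of dC/df, as the inverse function theorem states.

```python
import numpy as np

# Hypothetical 2-variable example: C1 = f1**2, C2 = f1 + 3*f2.
# C1 depends on f1 only, so dC/df is lower triangular.
def jacobian_C(f):
    f1, f2 = f
    return np.array([[2.0 * f1, 0.0],
                     [1.0,      3.0]])

f = np.array([2.0, 1.0])
J = jacobian_C(f)          # dC/df at this point
J_inv = np.linalg.inv(J)   # df/dC at the corresponding point

# The two Jacobians are inverses of each other:
print(np.allclose(J @ J_inv, np.eye(2)))  # True
```

Note that this only works pointwise where dC/df is nonsingular, i.e. where none of the diagonal entries of the triangular Jacobian vanish.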
 
Great. Thanks very much. Is there a particular algorithm that is recommended for a triangular matrix?

Regards,
 
Just "back substitution". If your matrix is, say,
\begin{bmatrix}a & 0 & 0 \\ b & c & 0\\ d & e & f\end{bmatrix}
then its inverse will be a matrix of the form
\begin{bmatrix}u & 0 & 0 \\ v & w & 0\\ x & y & z\end{bmatrix}
such that
\begin{bmatrix}a & 0 & 0 \\ b & c & 0\\ d & e & f\end{bmatrix}\begin{bmatrix}u & 0 & 0 \\ v & w & 0\\ x & y & z\end{bmatrix}= \begin{bmatrix}1 & 0 & 0 \\0 & 1 & 0\\ 0 & 0 & 1\end{bmatrix}
You must have ##au = 1##, so ##u = 1/a##. Then ##bu + cv = b/a + cv = 0##, so ##v = -b/(ac)##. Similarly ##cw = 1## gives ##w = 1/c##, and so on for the remaining entries.
 
