Spring Static Equilibrium Problem

The discussion centers on solving a spring static equilibrium problem involving two masses and springs. Participants clarify the equations of motion for the system, correcting initial misconceptions about the role of gravitational forces and the relationship between the displacements of the masses and spring extensions. The correct second-order differential equations are established as m2*ddot(y2) = -k2(y2 - y1) and m1*ddot(y1) = k2(y2 - y1) - k1*y1. The conversation then shifts to solving these equations, with suggestions to use matrix methods and eigenvalues to simplify the problem. The thread concludes with participants exploring the implications of their findings and the methods for solving the resulting equations.
  • #31
I have an idea. Let's say $$m_1\ddot y_1 = k_2(y_2-y_1)-k_1y_1$$ and $$m_2\ddot y_2 = -k_2(y_2-y_1).$$ Let's subtract the second equation from the first:

$$m_1\ddot y_1-m_2\ddot y_2=2k_2(y_2-y_1)-k_1y_1.$$ We know that ##m_1=m_2##; that's given, actually... and also the ##k## values.

So, writing $$Y=y_2-y_1,$$

$$\frac{d^2Y}{dt^2}=-\frac {1} {m} (2k_2Y- k_1y_1)\,?$$
 
  • #32
Arman777 said:
I have an idea. Let's say $$m_1\ddot y_1 = k_2(y_2-y_1)-k_1y_1$$ and $$m_2\ddot y_2 = -k_2(y_2-y_1).$$ Let's subtract the second equation from the first:

$$m_1\ddot y_1-m_2\ddot y_2=2k_2(y_2-y_1)-k_1y_1.$$ We know that ##m_1=m_2##; that's given, actually... and also the ##k## values.

So, writing $$Y=y_2-y_1,$$

$$\frac{d^2Y}{dt^2}=-\frac {1} {m} (2k_2Y- k_1y_1)\,?$$
You are no better off since you have both ##Y## and ##y_1##.
Try my suggestion in post #27.
(I think it is effectively the same as Orodruin's method.)
 
  • #33
I think @Orodruin's method might be the easiest. You write the system as
##
\begin{pmatrix}
\ddot {Y_1}\\
\ddot {Y_2}
\end{pmatrix}
= K \begin{pmatrix}
Y_1\\
Y_2
\end{pmatrix}
##
where ##K## is
##
\begin{pmatrix}
-(k_1 + k_2)/m_1 & k_2/m_1 \\
k_2/m_2 & -k_2/m_2
\end{pmatrix}
##
If you expand this out, you will get back the original system.
So to solve this, you can diagonalize it and get two uncoupled second-order ODEs, which you can solve.
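If it helps to see this concretely, here is a minimal NumPy sketch (the masses and spring constants are placeholder values chosen only for illustration) that builds ##K## and checks that ##\ddot Y = KY## reproduces the right-hand sides of the two original equations:

```python
import numpy as np

# Placeholder values for illustration only (not taken from the problem statement)
m1 = m2 = 1.0
k1, k2 = 6.0, 4.0

# Coefficient matrix K from writing the system as  Y'' = K Y
K = np.array([[-(k1 + k2) / m1,  k2 / m1],
              [        k2 / m2, -k2 / m2]])

# Arbitrary test displacements (y1, y2)
y1, y2 = 0.3, -0.5
Y = np.array([y1, y2])

# Right-hand sides of the original equations, divided by the masses
a1 = (k2 * (y2 - y1) - k1 * y1) / m1   # from m1*y1'' = k2(y2 - y1) - k1*y1
a2 = -k2 * (y2 - y1) / m2              # from m2*y2'' = -k2(y2 - y1)

print(K @ Y)       # [-5.   3.2]
print(a1, a2)      # -5.0 3.2  -- same accelerations, so K is consistent
```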
 
  • #34
timetraveller123 said:
I think @Orodruin's method might be the easiest. You write the system as
##
\begin{pmatrix}
\ddot {Y_1}\\
\ddot {Y_2}
\end{pmatrix}
= K \begin{pmatrix}
Y_1\\
Y_2
\end{pmatrix}
##
where ##K## is
##
\begin{pmatrix}
-(k_1 + k_2)/m_1 & k_2/m_1 \\
k_2/m_2 & -k_2/m_2
\end{pmatrix}
##
If you expand this out, you will get back the original system.
So to solve this, you can diagonalize it and get two uncoupled second-order ODEs, which you can solve.
I'll try it, but I am not hopeful that I can do it :)
Are these types of problems common in CM classes?
 
  • #35
I am not sure; I haven't taken those classes.
Do you know about eigenvectors and such? If you do, then I think this might be the fastest way.
 
  • #36
Today in lecture our teacher presented a solution to this problem using the matrix approach, as Orodruin pointed out. He wrote the matrix form, then found the eigenvalues and eigenvectors. And then, well, he kind of stopped there.
 
  • #37
Then maybe you should try to take it from there. Do you know how to diagonalize a matrix?
 
  • #38
timetraveller123 said:
Then maybe you should try to take it from there. Do you know how to diagonalize a matrix?
Once he has the eigenvalues and eigenvectors, there is no need to diagonalize anything. All he needs to do is use these to determine the coefficients required to satisfy the initial conditions.
 
  • #39
Chestermiller said:
Once he has the eigenvalues and eigenvectors, there is no need to diagonalize anything. All he needs to do is use these to determine the coefficients required to satisfy the initial conditions.
Well, the process of finding the eigenvalues and eigenvectors essentially gives you the diagonalisation as well ...
 
  • #40
Wait, even after obtaining the eigenvectors and eigenvalues,
you still have to change basis via
##
y' = P^{-1} y
##
and construct a diagonal matrix filled with the eigenvalues, right? At least this is what I know.
Then, after solving that, revert back to the original basis.
 
  • #41
timetraveller123 said:
Wait, even after obtaining the eigenvectors and eigenvalues,
you still have to change basis via
##
y' = P^{-1} y
##
and construct a diagonal matrix filled with the eigenvalues, right? At least this is what I know.
You don't really have to do it. You just note that with a complete set of eigenvectors ##v_i##, you can expand the solution in terms of them, i.e.,
$$
y(t) = \sum_i \alpha_i(t) v_i.
$$
Now, inserting into the differential equation would give
$$
\ddot y = \sum_i \ddot{\alpha}_i(t) v_i = K \sum_i \alpha_i(t) v_i = \sum_i \lambda_i \alpha_i(t) v_i,
$$
where ##\lambda_i## are the eigenvalues. Since the ##v_i## are linearly independent, the coefficients in front of ##v_i## on either side of the equation must be the same and therefore
$$
\ddot \alpha_i = \lambda_i \alpha_i.
$$
Of course, this is the same thing as you will get if you do the diagonalisation explicitly.
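Here is a small numerical illustration of this argument (NumPy; the matrix entries are just assumed values for the example): expand an arbitrary ##Y## in the eigenvectors and check that applying ##K## multiplies each expansion coefficient by its eigenvalue.

```python
import numpy as np

# An example matrix of the form discussed above (values assumed for illustration)
K = np.array([[-10.0,  4.0],
              [  4.0, -4.0]])

lam, V = np.linalg.eig(K)        # columns of V are the eigenvectors v_i

Y = np.array([0.7, -1.3])        # an arbitrary vector
alpha = np.linalg.solve(V, Y)    # expansion coefficients: Y = sum_i alpha_i v_i

# Applying K is the same as multiplying each coefficient by its eigenvalue
lhs = K @ Y
rhs = V @ (lam * alpha)          # sum_i lambda_i alpha_i v_i
print(np.allclose(lhs, rhs))     # True
```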
 
  • #42
Oh wow, that's actually rather neat. I never learned it that way.
 
  • #43
I found something like

##\begin{pmatrix}
\ddot y_1 \\
\ddot y_2 \\
\end{pmatrix} =
\begin{pmatrix}
-10 & 4 \\
4 & -4 \\
\end{pmatrix}
\begin{pmatrix}
y_1 \\
y_2 \\
\end{pmatrix}##

I found the values ##λ_1=-12## and ##λ_2=-2##.

Correct?
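A quick sanity check of these numbers with NumPy (assuming that matrix is the one intended):

```python
import numpy as np

K = np.array([[-10.0,  4.0],
              [  4.0, -4.0]])
print(np.linalg.eigvals(K))   # -12 and -2 (the ordering may differ)
```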
 
  • #44
Were you given the values?
The eigenvalues seem correct.
 
  • #45
timetraveller123 said:
Were you given the values?
The eigenvalues seem correct.
Thanks :)
 
  • #46
Orodruin said:
You don't really have to do it. You just note that with a complete set of eigenvectors ##v_i##, you can expand the solution in terms of them, i.e.,
$$
y(t) = \sum_i \alpha_i(t) v_i.
$$
Now, inserting into the differential equation would give
$$
\ddot y = \sum_i \ddot{\alpha}_i(t) v_i = K \sum_i \alpha_i(t) v_i = \sum_i \lambda_i \alpha_i(t) v_i,
$$
where ##\lambda_i## are the eigenvalues. Since the ##v_i## are linearly independent, the coefficients in front of ##v_i## on either side of the equation must be the same and therefore
$$
\ddot \alpha_i = \lambda_i \alpha_i.
$$
Of course, this is the same thing as you will get if you do the diagonalisation explicitly.

So ##\ddot y_1 = -12y_1## and ##\ddot y_2 = -2y_2##, or which eigenvalue corresponds to which?
 
  • #47
Arman777 said:
So ##\ddot y_1 = -12y_1## and ##\ddot y_2 = -2y_2##, or which eigenvalue corresponds to which?
No, you need to use the eigenvectors. The equations where the differential equations are not coupled are the ones for the ##\alpha##s, not for the ##y##s.
 
  • #48
For ##λ_1=-12## I find eigenvector
##\begin{pmatrix}
2 \\
-1 \\
\end{pmatrix}##

and for ##λ_2=-2## I find
## \begin{pmatrix}
1 \\
2 \\
\end{pmatrix}##

so ##\ddot y_1=
-12
\begin{pmatrix}
2 \\
-1 \\
\end{pmatrix}##

##\ddot y_2=-2
\begin{pmatrix}
1 \\
2 \\
\end{pmatrix}## ?
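As a quick check that these really are eigenvectors of the matrix in post #43, multiply each one by the matrix:
$$
\begin{pmatrix} -10 & 4 \\ 4 & -4 \end{pmatrix}
\begin{pmatrix} 2 \\ -1 \end{pmatrix}
= \begin{pmatrix} -24 \\ 12 \end{pmatrix}
= -12 \begin{pmatrix} 2 \\ -1 \end{pmatrix},
\qquad
\begin{pmatrix} -10 & 4 \\ 4 & -4 \end{pmatrix}
\begin{pmatrix} 1 \\ 2 \end{pmatrix}
= \begin{pmatrix} -2 \\ -4 \end{pmatrix}
= -2 \begin{pmatrix} 1 \\ 2 \end{pmatrix}.
$$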
 
  • #49
No. You need to write your differential equations in the form
$$
\ddot Y = \begin{pmatrix}
\ddot y_1 \\ \ddot y_2
\end{pmatrix}
=
\ddot \alpha_1 v_1 + \ddot \alpha_2 v_2
= \lambda_1 \alpha_1 v_1 + \lambda_2 \alpha_2 v_2.
$$
This will give you differential equations for the ##\alpha##s, not for the ##y##s.
 
  • #50
What is ##α##??

I am so confused right now. The ##v## are the eigenvectors, okay, and ##λ## is the eigenvalue.

It's so sad that our teacher never solved a problem like this before, even once. And I guess my algebra sucks.
 
  • #51
The ##\alpha## are the expansion coefficients that tell you how much of each eigenvector there is in the solution. Generally those coefficients will be time dependent. The idea is that any vector
$$
Y =
\begin{pmatrix}
y_1 \\ y_2
\end{pmatrix}
$$
can be written as a linear combination of the eigenvectors
$$
Y = \alpha_1(t) v_1 + \alpha_2(t) v_2
$$
where the expansion coefficients generally depend on time. Inserting this into the differential equation gives you separated differential equations for the ##\alpha##, i.e., the differential equation for ##\alpha_1## does not depend on ##\alpha_2## and vice versa.
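Concretely, with the eigenvectors found in post #48 (any nonzero multiples of them would work just as well), the expansion reads
$$
\begin{pmatrix} y_1 \\ y_2 \end{pmatrix}
= \alpha_1 \begin{pmatrix} 2 \\ -1 \end{pmatrix}
+ \alpha_2 \begin{pmatrix} 1 \\ 2 \end{pmatrix}
\quad\Longrightarrow\quad
\alpha_1 = \frac{2y_1 - y_2}{5}, \qquad
\alpha_2 = \frac{y_1 + 2y_2}{5},
$$
so each ##\alpha_i## is just a particular combination of ##y_1## and ##y_2##; the specific coefficients depend on how the eigenvectors are scaled.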
 
  • #52
So we have

##\begin{pmatrix}
\ddot y_1 \\
\ddot y_2 \\
\end{pmatrix}=-12α_1
\begin{pmatrix}
2 \\
-1 \\
\end{pmatrix}
-2α_2 \begin{pmatrix}
1 \\
2 \\
\end{pmatrix}##
 
  • #53
You need to insert the expression for Y in terms of the alphas on the left side as well.
 
  • #54
Orodruin said:
You need to insert the expression for Y in terms of the alphas on the left side as well.
Could you write it, please, so I can learn it? I don't get it this way. I need to proceed. This is painful.
 
  • #55
##\begin{pmatrix}
y_1 \\
y_2 \\
\end{pmatrix}=C_1e^{-12t}
\begin{pmatrix}
2 \\
-1 \\
\end{pmatrix}+
C_2e^{-2t} \begin{pmatrix}
1 \\
2 \\
\end{pmatrix}##
 
  • #56
That would be the result if you had a first order derivative and not a second order one in your differential equation.
 
  • #57
Orodruin said:
You need to insert the expression for Y in terms of the alphas on the left side as well.
You mean it will be ##\ddot α_1 v_1## etc.?
 
  • #58
I think what @Orodruin means is that the alphas you got in post #55 would be the solution of
##
\dot \alpha_i = \lambda_i \alpha_i
##
instead of
##
\ddot \alpha_i = \lambda_i \alpha_i.
##
You wouldn't get an exponential solution if you used the second one.
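For completeness: since both eigenvalues here are negative, each decoupled equation ##\ddot\alpha_i = \lambda_i\alpha_i## is a simple-harmonic-oscillator equation, so a sketch of the general solution (with the four constants fixed by the initial conditions) is
$$
\alpha_1(t) = A_1\cos\!\left(\sqrt{12}\,t\right) + B_1\sin\!\left(\sqrt{12}\,t\right),
\qquad
\alpha_2(t) = A_2\cos\!\left(\sqrt{2}\,t\right) + B_2\sin\!\left(\sqrt{2}\,t\right),
$$
$$
\begin{pmatrix} y_1 \\ y_2 \end{pmatrix}
= \alpha_1(t)\begin{pmatrix} 2 \\ -1 \end{pmatrix}
+ \alpha_2(t)\begin{pmatrix} 1 \\ 2 \end{pmatrix}.
$$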
 
