Redoctober
I was wondering about a scenario where we have a point mass M1 and a thin rod of mass M2 and length L, with a distance r between the mass and the near end of the rod, attracting each other. Is my approach to the problem below correct?
Let s be the distance from M1 to a mass element dM2 of the rod, so s runs from r (the near end) to r + L (the far end).
For two point masses, F = -(G*M1*M2)/s^2, where G is the gravitational constant.
Applying this to each element of the rod: dF = -(G*M1/s^2) dM2
The rod has linear mass density λ = M2/L, and if x measures position along the rod then s = r + x, so ds = dx and dM2 = λ dx = λ ds.
Therefore F = -G*M1*λ*∫(1/s^2) ds, integrated from s = r to s = r + L.
Evaluating the integral: F = -G*M1*λ*(1/r - 1/(r+L)) = -G*M1*λ*L/(r*(r+L))
Since λ*L = M2, I get F = -(G*M1*M2)/(r*(r+L))
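As a numerical sanity check, here is a short Python sketch comparing the closed form against direct numerical integration (the values for M1, M2, r and L are just made-up examples, not part of the problem):

```python
# Numerical check of F = G*M1*M2 / (r*(r+L)) against direct integration.
# The specific masses and distances below are hypothetical illustrative values.
from scipy.integrate import quad

G = 6.674e-11          # gravitational constant, N m^2 / kg^2
M1, M2 = 2.0, 3.0      # hypothetical masses in kg
r, L = 1.5, 0.5        # hypothetical distances in m
lam = M2 / L           # linear mass density of the rod

# Force magnitude by numerically integrating G*M1*lam/s^2 from r to r+L
numeric, _ = quad(lambda s: G * M1 * lam / s**2, r, r + L)

# Closed-form result derived above
closed = G * M1 * M2 / (r * (r + L))

print(numeric, closed)  # the two values should agree to high precision
```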
Equation analysis - if I take the limit L --> 0, the rod collapses to a point mass, and the expression reduces to -G*M1*M2/r^2, which is the usual gravitational attraction between two point masses :D
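The limit can also be checked symbolically, assuming sympy is available:

```python
# Symbolic check that the L -> 0 limit recovers Newton's point-mass law.
import sympy as sp

G, M1, M2, r, L = sp.symbols('G M1 M2 r L', positive=True)
F = -G * M1 * M2 / (r * (r + L))

print(sp.limit(F, L, 0))  # prints -G*M1*M2/r**2
```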
Thanks in advance :)