Calculating distance between two Accelerometers

  • Thread starter: snocavotia
AI Thread Summary
Using one accelerometer at ground level and another in hand to calculate height based on gravity differences is impractical due to insufficient sensitivity and local mass variations. For tracking an object between 2 to 10 feet away, sonar calibrated to pressure and temperature is suggested for high accuracy. Relying solely on accelerometers for motion tracking is challenging, as small errors in acceleration can lead to significant inaccuracies in position over time. Experiments with smartphone accelerometers confirmed that position accuracy deteriorates quickly, often exceeding several meters after short durations. Combining multiple tracking methods is typically recommended to enhance accuracy.
snocavotia
Is it possible to place one accelerometer at ground level and one in your hand and, without either moving, use the difference in gravity's pull to calculate the distance, or rather the height, of the second accelerometer?
 
Not really. Accelerometers aren't sensitive enough to begin with, and even if you used a sufficiently sensitive gravity probe, the variation in g due to local masses would be greater than the variation due to the height change.
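For a sense of scale, here is a quick back-of-the-envelope in Python, assuming a simple inverse-square model and ignoring the local mass anomalies mentioned above; the numbers are illustrative, not survey-grade:

```python
G0 = 9.80665        # standard gravity at the surface, m/s^2
R_EARTH = 6.371e6   # mean Earth radius, m (assumed spherical Earth)

def delta_g(h_m: float) -> float:
    """Change in g between the surface and height h_m (metres),
    using g(h) = G0 * (R / (R + h))^2."""
    return G0 * (1.0 - (R_EARTH / (R_EARTH + h_m)) ** 2)

h = 10 * 0.3048  # 10 feet in metres
print(f"delta g over {h:.2f} m: {delta_g(h):.2e} m/s^2")
# -> about 9.4e-6 m/s^2, i.e. roughly one part per million of g.
# Consumer MEMS accelerometer noise and bias are orders of magnitude larger.
```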
 
Well, then I have another question. I've been trying to find a way to track the location of something that is between 2 ft and 10 ft away. Is there any way to do this? I have looked at radio triangulation, light, etc., but I cannot find anything.
 
If you calibrate it for pressure and temperature, sonar should give you very high accuracy over that range.
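As a sketch of what that calibration buys you, here is a minimal time-of-flight example in Python. The linear approximation c = 331.3 + 0.606·T (T in °C) is a common dry-air formula; the pressure and humidity corrections are omitted, and the 12 ms echo time is just an assumed example value:

```python
def speed_of_sound(temp_c: float) -> float:
    """Approximate speed of sound in dry air, m/s."""
    return 331.3 + 0.606 * temp_c

def range_from_echo(round_trip_s: float, temp_c: float) -> float:
    """Distance in metres from a round-trip echo time."""
    return speed_of_sound(temp_c) * round_trip_s / 2.0

# Example: a 12 ms round trip at 20 degrees C
print(f"{range_from_echo(0.012, 20.0):.3f} m")  # ~2.06 m, about 6.8 ft
# Skipping the temperature correction matters: between 0 and 20 degrees C
# the speed of sound changes by ~12 m/s, a ~3.5% range error.
```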
 
I'm not sure exactly what you're doing, but using accelerometers alone to track the motion of something is a lot more difficult than it would seem, even if the thing is moving. In theory, if you know the acceleration and the initial conditions, you can find the position by integrating twice. The problem is that when you do this numerically, tiny errors in the measured acceleration lead to enormous errors in position: a constant bias ##b## in the acceleration, for example, grows into a position error of ##\frac{1}{2} b t^2## after double integration. Some friends and I played around with tracking a smartphone using its accelerometer and found that after even 10 seconds or so, the position wasn't accurate to within a few metres, and it gets much worse as time goes on. We didn't do much more in the way of actual testing, but in our reading sonar came up a lot, and we also found that usually a variety of methods are coupled together, which can improve accuracy.
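To see why the error blows up, here is a toy Python simulation of that double integration for a stationary sensor with a small constant bias. The 0.01 m/s² bias and 100 Hz sample rate are assumed, fairly optimistic figures for a phone's MEMS part:

```python
bias = 0.01   # m/s^2, assumed constant accelerometer bias
dt = 0.01     # s, 100 Hz sample rate

v = 0.0  # velocity estimate, m/s
x = 0.0  # position estimate, m
for step in range(1, 1001):  # 10 seconds of samples
    a = bias                 # true acceleration is zero; we measure only bias
    v += a * dt              # first integration: velocity
    x += v * dt              # second integration: position
    if step % 200 == 0:
        print(f"t={step * dt:4.1f} s  position error = {x:6.3f} m")
# The error grows like 0.5*bias*t^2: about 0.5 m after just 10 s.
# Real sensors also have noise and drifting bias, so in practice it's far worse.
```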
 