Get Azimuth & Elevation from iPhone Accelerometer Data

AI Thread Summary
Extracting azimuth and elevation from iPhone accelerometer data during a throwing motion is challenging due to the limitations of the data, which only provides linear acceleration in the device's local space. Without knowledge of the device's orientation in global space, reconstructing a 3D path is not feasible. Assumptions can be made to simplify the modeling, such as starting with zero acceleration and recording data during significant changes in acceleration. However, achieving an accurate 3D trajectory requires additional sensors like gyroscopes, which the iPhone lacks for this purpose. The discussion suggests exploring alternative examples for teaching physics concepts effectively.
rd42
I'm almost too timid to ask; I haven't had a math class in over 10 years, and it's been even longer for physics. My apologies if I dumb down the forum a little bit.

I'm grabbing accelerometer data out of the iPhone of someone doing a throwing motion. I'm not really sure if it is possible to get the azimuth and elevation from the accelerometer data, and if it is, whether I have the computational skills to work the formulas in a reasonable amount of time.

Are there any generic excel spreadsheets or formulas for plotting information about what the iPhone might be doing in 3D space from just the accelerometer measurements?

Thank you for any help or hints.

Robert
 
rd42 said:
I'm grabbing accelerometer data out of the iPhone of someone doing a throwing motion. I'm not really sure if it is possible to get the azimuth and elevation from the accelerometer data,
You only get one vector (the net linear inertial force) from the accelerometer, no angular acceleration, right? In that case you cannot reconstruct the 3D path if the iPhone is rotating in space.
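Here's a minimal Python sketch of that ambiguity, with a made-up reading and illustrative rotation matrices (not Apple's exact sign conventions): the same device-frame sample is consistent with very different world-frame motions, depending on how the phone happens to be oriented.

Code:
import numpy as np

# One hypothetical accelerometer sample in the phone's own (device) axes, m/s^2.
a_device = np.array([0.0, 0.0, 9.81])
g_world = np.array([0.0, 0.0, -9.81])   # gravity in the world frame

def world_accel(R, a_dev):
    # World-frame linear acceleration for an assumed device-to-world rotation R
    # (sign conventions here are illustrative, not necessarily Apple's).
    return R @ a_dev + g_world

R_flat = np.eye(3)                         # phone lying flat and level
R_tipped = np.array([[1.0, 0.0,  0.0],     # phone tipped 90 degrees about its x-axis
                     [0.0, 0.0, -1.0],
                     [0.0, 1.0,  0.0]])

print(world_accel(R_flat, a_device))    # ~[0, 0, 0]: the phone is just resting
print(world_accel(R_tipped, a_device))  # ~[0, -9.81, -9.81]: a genuinely accelerating phone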
 
I'm not sure. You get acceleration in x,y and z over time.
 
rd42 said:
I'm not sure. You get acceleration in x,y and z over time.

This is the acceleration in the device's local space. If you don't know how the device is oriented in global space, you cannot compute the path in global space in a general way. But depending on the expected movement and what you want to get out of this, you can make some assumptions.
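As one concrete set of assumptions: if the phone starts at rest, is held level, and does not rotate during the throw, you can subtract gravity, integrate the readings twice for a rough path, and take azimuth and elevation from the last velocity sample. A minimal Python sketch with fabricated data and an assumed 100 Hz sample rate (the double integration compounds sensor error quickly):

Code:
import numpy as np

dt = 1.0 / 100.0                        # assumed 100 Hz sample rate
g_world = np.array([0.0, 0.0, -9.81])   # gravity in the world frame

# Fabricated "throw": 0.3 s of constant push, forward (+y) and upward (+z).
# Real data would be the recorded readings -- here we pretend the phone
# never rotates and starts level, so device axes match world axes.
n = 30
a_measured = np.tile([0.0, 15.0, 9.81 + 10.0], (n, 1))

a_lin = a_measured + g_world            # remove gravity from the specific-force reading
v = np.cumsum(a_lin, axis=0) * dt       # velocity, starting from rest
p = np.cumsum(v, axis=0) * dt           # position, starting at the origin

vx, vy, vz = v[-1]                      # hand velocity at the end of the recording
azimuth = np.degrees(np.arctan2(vx, vy))                  # 0 deg = straight ahead (+y)
elevation = np.degrees(np.arctan2(vz, np.hypot(vx, vy)))
print(azimuth, elevation, p[-1])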
 
Excellent, I like assumptions :) and compounding error is cool too.

I will definitely be making assumptions. I'm trying to model a baseball toss for some young kids studying physics. I'm not sure what other assumptions I will need, but I will assume a starting acceleration of zero, start recording when the accelerometer senses a substantial increase in acceleration (the start of the throw), and stop recording when there is a substantial decrease. The students will be instructed not to follow through on the throw and not to let go of the iPhone.
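A minimal Python sketch of that start/stop rule (the threshold numbers are guesses to be tuned against real recordings, not tested values):

Code:
import numpy as np

def throw_window(samples, start_thresh=15.0, stop_thresh=11.0):
    # samples: N x 3 accelerometer readings in m/s^2.
    # Start when the magnitude first rises well above the ~9.81 m/s^2
    # resting reading; stop when it drops back down again.
    mag = np.linalg.norm(samples, axis=1)
    above = np.where(mag > start_thresh)[0]
    if len(above) == 0:
        return None                      # no throw detected
    start = above[0]
    below = np.where(mag[start:] < stop_thresh)[0]
    stop = start + below[0] if len(below) else len(mag) - 1
    return start, stop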
 
Like AT said, you can't get an accurate 3D trajectory unless the phone also has 3 gyros in addition to the 3 accelerometers. Even if you have all of these, making them work together is a big challenge; the iPhone can't be turned into an inertial navigation system yet.
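For what it's worth, this is roughly what gyros would add: an angular-rate signal you integrate into an orientation, so each accelerometer sample can be rotated into the world frame before integrating. A bare-bones (and very drift-prone) Python sketch, purely to illustrate the idea:

Code:
import numpy as np

def skew(w):
    # Cross-product (skew-symmetric) matrix of an angular-rate vector.
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def integrate_imu(accel, gyro, dt, R0=np.eye(3)):
    # Naive strapdown integration: propagate orientation from gyro rates,
    # rotate each accelerometer sample into the world frame, remove gravity,
    # then integrate to velocity and position. Drift grows quickly; this only
    # shows why orientation information is needed at all.
    g_world = np.array([0.0, 0.0, -9.81])
    R, v, p = R0.copy(), np.zeros(3), np.zeros(3)
    for f, w in zip(accel, gyro):
        R = R @ (np.eye(3) + skew(w) * dt)   # small-angle orientation update
        a_world = R @ f + g_world            # world-frame linear acceleration
        v = v + a_world * dt
        p = p + v * dt
    return R, v, p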
 
Oh well, thanks for your help.


I wonder what a better example might be for the students.
 