Accelerometer's angle from ground to sky

  • Thread starter: craftingcode
  • Tags: Angle, Ground, Sky
AI Thread Summary
The discussion focuses on the challenges of calibrating an accelerometer attached to a user's arm for accurate angle measurement relative to the horizon. The current calibration method, which involves pointing the arm straight out and parallel to the ground, fails due to the arm's natural twisting and the device's potential misalignment. Suggestions include storing the full 3D vector of the device's orientation when the arm is pointed down, rather than just the y-component, to improve accuracy. This approach allows for the calculation of the angle between the stored down-direction and the current accelerometer reading, regardless of the device's orientation on the arm. Proper calibration is essential for achieving reliable angle measurements in this setup.
craftingcode
I'm so over my head that I don't even know how to ask this, but here goes...

The accelerometer in my device (an iPhone) gives an x, y, z vector that points toward the Earth. I'm using these values to determine the angle of the device relative to the horizon. Visualize a user aiming the device at a point in the sky; the device attempts to calculate the angle from the horizon to that point. It currently computes this using the ratio of the y component to the vector's magnitude. Specifically, angle = asin(y / sqrt(x^2 + y^2 + z^2)).
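For concreteness, a minimal sketch of that computation, assuming the raw x, y, z readings are already in hand (the function name is illustrative, not part of any real API):

Code:
import Foundation

// Elevation of the device's y axis above the horizon, from a raw
// accelerometer reading. Sketch only; names are illustrative.
func elevationAngleDegrees(x: Double, y: Double, z: Double) -> Double {
    let magnitude = sqrt(x * x + y * y + z * z)
    guard magnitude > 0 else { return 0 }      // avoid dividing by zero
    let radians = asin(y / magnitude)          // y component vs. total gravity magnitude
    return radians * 180.0 / .pi               // convert to degrees
}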

Now comes the part that's killing me. The user aims their arm, rather than the device, at a point in the sky. The device is attached, via an armband, to the user's pointing arm. Unless the device is attached EXACTLY parallel to the arm, the angle doesn't truly reflect where the arm is pointing. I'm trying to implement a calibration step to compensate for the difference between the device vector and the arm vector, but it's not going so well.

Right now my calibration step has the user extending their arm straight out in front and parallel to the ground. I hoped this would tell me how closely the arm's and device's y components align. It seemed obvious enough: if the user's arm is parallel to the ground, their arm is at y = 0; therefore, the offset from arm y to device y must be the current device y reading. By negating the current device y reading we have the transform from device y to arm y. Of course my calibration doesn't work. There are at least two problems with it: the user's arm twists (i.e. the elbow angle relative to the ground isn't fixed) and the device is almost guaranteed to slope toward or away from the arm.
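As a reference point, here is a rough sketch of that offset-only calibration as described (all names are hypothetical); the rest of the thread explains why a single scalar offset turns out not to be enough:

Code:
// Offset-only calibration: the arm is assumed level, so whatever y the
// device reports at that moment is treated as the misalignment and
// subtracted from later readings. Sketch only; names are hypothetical.
struct YOffsetCalibration {
    private var yOffset: Double = 0

    mutating func calibrate(deviceY: Double) {
        yOffset = -deviceY                     // arm taken to be at y = 0 here
    }

    func correctedY(deviceY: Double) -> Double {
        deviceY + yOffset                      // shift the device's y toward the arm's y
    }
}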

Please let me know if you understand what I'm asking (I know I wouldn't) and would like me to elaborate on anything.

Thanks.
 
craftingcode said:
There are at least two problems with it: the user's arm twists (i.e. the elbow angle relative to the ground isn't fixed)
Calibration: Point the arm vertically down at the ground and remember that accelerometer vector as your pointing-direction vector.

Later, to get the angle above the horizon: compute the angle between the current accelerometer vector and the saved pointing-direction vector, then subtract 90°.
craftingcode said:
and the device is almost guaranteed to slope toward or away from the arm.
That cannot be solved by the program. The device must be fixed rigidly to the arm.
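To make the first suggestion concrete, here is a minimal sketch of the vector-angle computation described above, assuming both readings are available as plain x, y, z triples (names are illustrative, not from any real API):

Code:
import Foundation

// Angle of the arm above the horizon: the angle between the current
// accelerometer vector and the vector saved while the arm pointed
// straight down, minus 90°. Returns 0° with the arm level, +90°
// straight up, -90° straight down. Sketch only; names are illustrative.
func angleAboveHorizonDegrees(current: (x: Double, y: Double, z: Double),
                              calibratedDown: (x: Double, y: Double, z: Double)) -> Double {
    let dot = current.x * calibratedDown.x + current.y * calibratedDown.y + current.z * calibratedDown.z
    let currentMag = sqrt(current.x * current.x + current.y * current.y + current.z * current.z)
    let downMag = sqrt(calibratedDown.x * calibratedDown.x + calibratedDown.y * calibratedDown.y + calibratedDown.z * calibratedDown.z)
    guard currentMag > 0, downMag > 0 else { return 0 }
    let cosine = max(-1.0, min(1.0, dot / (currentMag * downMag)))   // clamp against rounding error
    return acos(cosine) * 180.0 / .pi - 90.0
}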
 
A.T. said:
Calibration: Point the arm vertically down at the ground and remember that accelerometer vector as your pointing-direction vector.

Later, to get the angle above the horizon: compute the angle between the current accelerometer vector and the saved pointing-direction vector, then subtract 90°.

That cannot be solved by the program. The device must be fixed rigidly to the arm.

A.T., thanks for taking the time to read and reply to my post.

I tried calibrating with the arm pointing down and had the same problems. The extreme case makes it easiest to visualize. Extreme case: the user mounts the device rotated 90 degrees (i.e., the x reading, rather than the y reading, runs parallel to the arm). Now imagine that the user calibrates with their hand flat (like a karate chop) at their side. As they raise their hand to point at the horizon, consider the case where their hand rotates 90 degrees. Now there is a discrepancy between the device's calibration orientation and its runtime orientation. When calibrated, the y acceleration ran in the yz plane; at computation time it runs in the xz plane. In this extreme case the computed accelerometer y difference is zero even though the two positions are very different.

Sorry about not being clear with my use of the term slope. I didn't mean that the device vacillates or wobbles; the device is assumed to be rigidly attached to the arm. I meant how flatly the device sits against the arm. Unless the user's arm is perfectly flat, one side of the device, top or bottom, is going to be closer than the other to the arm's center. Said differently, the radius of the arm isn't constant; this causes the device to slope toward or away from the elbow when attached via an armband.

Thanks again.
 
craftingcode said:
A.T., thanks for taking the time to read and reply to my post.

I tried calibrating with the arm pointing down and had the same problems.

If I understand your calibration correctly, you stored just an offset number (y). That is not enough information. You have to store the entire 3D vector of the down direction during calibration (hand pointing down). Then, during operation, compute the 3D angle between two 3D vectors: the calibration down direction and the current down direction.

With the above method it is completely irrelevant how the device is oriented relative to the arm, as long as it keeps that orientation.
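To illustrate the flow end to end, reusing the angleAboveHorizonDegrees sketch from earlier (the numbers below are made up; in practice both vectors come straight from the accelerometer):

Code:
// 1. Calibration: the arm hangs straight down; store the full 3D reading.
let calibratedDown = (x: 0.12, y: -0.97, z: 0.20)

// 2. Operation: the arm points somewhere toward the sky; read the current vector.
let current = (x: 0.10, y: 0.45, z: -0.88)

// The elevation above the horizon follows from the angle between the two
// vectors, no matter how the device is strapped to the arm, as long as it
// has not shifted since calibration.
let elevation = angleAboveHorizonDegrees(current: current, calibratedDown: calibratedDown)
print(elevation)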
 