
Accelerometer's angle from ground to sky

  1. Mar 7, 2010 #1
    I'm so over my head that I don't even know how to ask this, but here goes...

    The accelerometer in my device (an iPhone) gives an x,y,z vector that points toward Earth. I'm using these values to determine the angle of the device relative to the horizon. Visualize a user aiming the device at a point in the sky; the device attempts to calculate the angle from the horizon to the point in the sky. It currently computes this using the "y component to vector magnitude" ratio. Specifically, angle = asin(y / sqrt(x^2 + y^2 + z^2)).
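    That ratio can be written out directly. A minimal Python sketch of the same computation (the example vector is illustrative only; real accelerometer axis sign conventions vary by device):

    ```python
    import math

    def elevation_angle(x, y, z):
        """Angle, in degrees, given by the 'y component to vector
        magnitude' ratio described above."""
        magnitude = math.sqrt(x * x + y * y + z * z)
        return math.degrees(math.asin(y / magnitude))

    # With y at half the vector magnitude, asin(0.5) gives 30 degrees.
    print(elevation_angle(0.0, 0.5, math.sqrt(0.75)))  # approximately 30.0
    ```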

    Now comes the part that's killing me. The user aims their arm, rather than the device, at a point in the sky. The device is attached, via an armband, to the user's pointing arm. Unless the device is attached EXACTLY parallel to the arm, the angle doesn't truly reflect where the arm is pointing. I'm trying to implement a calibration step to compensate for the difference between the device vector and the arm vector, but it's not going so well.

    Right now my calibration step has the user extending their arm straight out in front and parallel to the ground. I hoped this would tell me how closely the arm's and device's y components align. It seemed obvious enough: if the user's arm is parallel to the ground, their arm is at y = 0; therefore, the offset from arm y to device y must be the current device y reading. By negating the current device y reading we have the transform from device y to arm y. Of course my calibration doesn't work. There are at least two problems with it: the user's arm twists (i.e. the elbow angle relative to the ground isn't fixed) and the device is almost guaranteed to slope toward or away from the arm.

    Please let me know if you understand what I'm asking (I know I wouldn't) and would like me to elaborate on anything.

  3. Mar 8, 2010 #2



    Calibration: Point the arm straight down at the ground, and save that accelerometer vector as your pointing-direction vector.

    Later, to get the angle over the horizon: compute the angle between the current accelerometer vector and the saved pointing-direction vector, then subtract 90°.
    Any movement of the device relative to the arm, however, cannot be compensated by the program. The device must be fixed rigidly to the arm.
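    A minimal Python sketch of this procedure (the vectors are hypothetical readings, and the function names are illustrative, not from any iPhone API):

    ```python
    import math

    def angle_between(u, v):
        """Angle in degrees between two 3D vectors."""
        dot = sum(a * b for a, b in zip(u, v))
        norm_u = math.sqrt(sum(a * a for a in u))
        norm_v = math.sqrt(sum(a * a for a in v))
        # Clamp to guard against floating-point drift outside acos's domain.
        return math.degrees(math.acos(max(-1.0, min(1.0, dot / (norm_u * norm_v)))))

    # Calibration: accelerometer reading taken while the arm points straight down.
    calibration_down = (0.0, 1.0, 0.0)

    def elevation_over_horizon(current):
        """Angle between the current reading and the saved pointing
        direction, minus 90 degrees."""
        return angle_between(current, calibration_down) - 90.0

    print(elevation_over_horizon((0.0, 0.0, 1.0)))   # arm level with horizon: 0.0
    print(elevation_over_horizon((0.0, -1.0, 0.0)))  # arm straight up: 90.0
    ```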
    Last edited: Mar 8, 2010
  4. Mar 8, 2010 #3
    A.T., thanks for taking the time to read and reply to my post.

    I tried calibrating with the arm pointing down and had the same problems. The extreme case makes it easiest to visualize. Extreme case: the user mounts the device at 90 degrees (i.e., the x reading, rather than the y reading, runs parallel to the arm). Now imagine that the user calibrates with their hand flat (like a karate chop) at their side. As they raise their hand to point to the horizon, consider the case where their hand rotates 90 degrees. Now there is a discrepancy between the device's calibration orientation and its runtime orientation. When calibrated, the y acceleration ran along the yz plane; at computation time the y acceleration runs along the xz plane. In this extreme case the computed accelerometer y difference is zero even though the two positions are very different.

    Sorry about not being clear with my use of the term slope. I didn't mean that the device vacillates or wobbles; the device is assumed to be rigidly attached to the arm. I meant how flatly the device is attached to the arm. Unless the user's arm is perfectly flat, one side of the device, top or bottom, is going to be closer than the other to the arm's center. Said differently, the radius of the arm isn't constant; this causes the device to slope toward or away from the elbow when attached via an armband.

    Thanks again.
  5. Mar 8, 2010 #4



    If I understand your calibration correctly, you stored just a single offset number (the y reading). That is not enough information. You have to store the entire 3D down-direction vector during calibration (arm pointing down), and during operation compute the angle between two 3D vectors: the calibration down-direction and the current down-direction.

    With the above method it is completely irrelevant how the device is oriented relative to the arm, as long as it keeps that orientation.
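    To illustrate that claim, here is a sketch (Python; the 90°-rotated mounting and the readings are hypothetical) replaying the extreme case from post #3 with the vector method:

    ```python
    import math

    def angle_between(u, v):
        """Angle in degrees between two 3D vectors."""
        dot = sum(a * b for a, b in zip(u, v))
        norm_u = math.sqrt(sum(a * a for a in u))
        norm_v = math.sqrt(sum(a * a for a in v))
        return math.degrees(math.acos(max(-1.0, min(1.0, dot / (norm_u * norm_v)))))

    # Extreme mounting from post #3: the arm's pointing direction lies
    # along the device's x axis, not its y axis.
    calibration_down = (1.0, 0.0, 0.0)  # reading with the arm pointing straight down

    def elevation(current):
        return angle_between(current, calibration_down) - 90.0

    # Arm level with the horizon, hand rolled arbitrarily: still 0 degrees.
    print(elevation((0.0, 0.6, 0.8)))
    # Arm raised 45 degrees: the reading tilts 135 degrees from the saved vector.
    print(elevation((-math.sqrt(0.5), math.sqrt(0.5), 0.0)))  # approximately 45
    ```

    Because only the angle between the two full 3D vectors is used, the result does not depend on which device axis happens to run along the arm.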
    Last edited: Mar 8, 2010