I'm in so far over my head that I don't even know how to ask this, but here goes... The accelerometer in my device (an iPhone) gives an x, y, z vector that points toward the Earth, i.e. along gravity. I'm using these values to determine the angle of the device relative to the horizon. Visualize a user aiming the device at a point in the sky; the device attempts to calculate the angle from the horizon up to that point. It currently computes this from the "y component to vector magnitude" ratio. Specifically, angle = asin(y / sqrt(x^2 + y^2 + z^2)).

Now comes the part that's killing me. The user aims their arm, rather than the device, at the point in the sky. The device is attached to the user's pointing arm via an armband. Unless the device happens to be attached exactly parallel to the arm, the computed angle doesn't truly reflect where the arm is pointing. I'm trying to implement a calibration step to compensate for the difference between the device vector and the arm vector, but it's not going well.

Right now my calibration step has the user extend their arm straight out in front of them, parallel to the ground. I hoped this would tell me how closely the arm's and device's y components align. It seemed obvious enough: if the user's arm is parallel to the ground, the arm's y is 0; therefore the offset from arm y to device y must be whatever the device currently reads for y. Negating that reading gives the transform from device y to arm y.

Of course my calibration doesn't work. There are at least two problems with it: the user's arm twists (i.e. the angle of the elbow relative to the ground isn't fixed), and the device is almost guaranteed to slope toward or away from the arm.

Please let me know if you understand what I'm asking (I know I wouldn't) and whether you'd like me to elaborate on anything. Thanks.
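
In case a concrete sketch helps, here's roughly what I'm doing now, in Swift. The Core Motion calls (CMMotionManager and friends) are the standard API, but the class and property names (ArmPitchEstimator, calibrationPitch, etc.) are placeholders I made up for this post:

```swift
import CoreMotion
import Foundation

// Sketch only: Core Motion calls are real, names are invented for this post.
final class ArmPitchEstimator {
    private let motion = CMMotionManager()

    // Device pitch captured while the user's arm is held level (radians).
    private var calibrationPitch: Double = 0

    func start() {
        guard motion.isAccelerometerAvailable else { return }
        motion.accelerometerUpdateInterval = 1.0 / 30.0
        motion.startAccelerometerUpdates()
    }

    // Angle of the device above the horizon, from the
    // "y component to vector magnitude" ratio described above.
    func devicePitch() -> Double? {
        guard let a = motion.accelerometerData?.acceleration else { return nil }
        let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
        guard magnitude > 0 else { return nil }
        return asin(a.y / magnitude)
    }

    // Calibration step: the user extends their arm straight out, parallel
    // to the ground, and I record whatever the device reads at that moment.
    func calibrate() {
        calibrationPitch = devicePitch() ?? 0
    }

    // "Corrected" arm angle: subtract the calibration offset. This is the
    // part that breaks when the device slopes toward/away from the arm
    // or the arm twists.
    func armPitch() -> Double? {
        guard let pitch = devicePitch() else { return nil }
        return pitch - calibrationPitch
    }
}
```

The subtraction in armPitch() is the negation transform I described above. As far as I can tell, it only corrects a single pitch offset, which is presumably why it falls apart when the device's misalignment isn't purely in pitch.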