Hi there. I have a G-force sensor (an accelerometer, to be precise). Held in its initial orientation it reads x = 0 G, y = 0 G, z = 1 G (x = left/right, y = forwards/backwards, z = up/down).

I'd like to calibrate it (perhaps the wrong term, but you get the idea). For example, if I tilt it up 45 degrees so the y axis points partly upwards, I'd like that to become the "zero" position. In that orientation it reads roughly (0, 0.71, 0.71), since sin 45° = cos 45° ≈ 0.71 — notice how z decreases, because once y tilts upwards, z no longer faces directly up. Essentially, whenever I receive a reading of (0, 0.71, 0.71) from the device, I want to apply some offset (rotation?) so the net result comes out as (0, 0, 1).

Real-world example: the user sticks the accelerometer on their car windscreen at whatever angle lets them see the output screen, and the software should correct the values so it reports G-forces from the car's perspective, not the accelerometer's. So under braking, even though the accelerometer experiences acceleration on both y and z, the car is really only experiencing y (well, in simplified reality, ignoring the movement of the shock absorbers).

The problem is that my understanding of vectors and geometry is very limited, and I don't know where to start on an approach that works for any orientation. I was exposed to vectors at university, but that was first-year maths a while back and I'm very rusty. I can't seem to find a solution on the net, and the calculations I've been trying in Excel (simple arithmetic, no trig) only work for a two-axis scenario. Any assistance would be greatly appreciated.
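
Edit: to make the desired behaviour concrete, here's a rough Python sketch of what I'm after. While searching I stumbled onto something called Rodrigues' rotation formula and tried to transcribe it; the function names (`rotation_to_up`, `apply`) are my own invention, and I have no idea whether I've applied the formula correctly:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def rotation_to_up(g):
    """Build the rotation matrix that maps the reading taken while the
    device is at rest (pure gravity, vector g) onto (0, 0, 1)."""
    a = normalize(g)
    b = (0.0, 0.0, 1.0)
    v = cross(a, b)   # rotation axis (not normalised)
    c = dot(a, b)     # cosine of the angle between a and b
    if c < -0.999999:
        # Device is upside down: axis is ambiguous, pick a 180° flip about x.
        return ((1.0, 0.0, 0.0), (0.0, -1.0, 0.0), (0.0, 0.0, -1.0))
    k = 1.0 / (1.0 + c)
    # Rodrigues' formula: R = I + [v]x + [v]x^2 / (1 + c),
    # where [v]x is the skew-symmetric cross-product matrix of v.
    return (
        (1 - k * (v[1]**2 + v[2]**2), k * v[0] * v[1] - v[2],       k * v[0] * v[2] + v[1]),
        (k * v[0] * v[1] + v[2],      1 - k * (v[0]**2 + v[2]**2),  k * v[1] * v[2] - v[0]),
        (k * v[0] * v[2] - v[1],      k * v[1] * v[2] + v[0],       1 - k * (v[0]**2 + v[1]**2)),
    )

def apply(R, v):
    """Rotate a raw reading v into the calibrated ('car') frame."""
    return tuple(dot(row, v) for row in R)
```

The idea being: the user presses a "calibrate" button while the car is stationary, I capture the resting reading into `rotation_to_up`, and from then on I push every live reading through `apply` before displaying it. Does that sound like the right approach?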