Hi all,

I am having some trouble understanding the design of the Bourdon gauge, which is used to measure primary vacuum. Here is how we were told it works: a tube is closed at one end and connected to the vacuum enclosure at the other. When the pressure drops, every element of the surface receives a perpendicular force proportional to the difference between the ambient pressure and the enclosure pressure. Because of the tube's curved shape, the outer part of the curve has more surface area, so it receives more force than the inner part. The tube therefore gets bent further, and we measure this deflection.

However, I do not agree with this explanation. If it were true, it would be very easy to make a curved, closed, evacuated tube which, because its outer surface is bigger, would experience a non-zero resultant force, and we would have perpetual motion! What I think is wrong is that this argument does not consider the force applied to the end of the tube, which, as far as I can see, exactly compensates the resultant force my teacher was talking about. Then there would be no motion at all, because the resultant force on the inner curve, the outer curve AND the end of the tube would be zero.

Am I getting something wrong? Or is this surface-difference theory totally wrong?

Thank you for your help!

Nauhaie

PS: I do know that in fact a Bourdon tube works by the pressure changing the geometry of the tube's cross-section, but I would like to know whether this other theory really is wrong!
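To convince myself of the cancellation, I tried a quick numerical check (my own sketch, not from any textbook): I model the closed tube as a quarter of a torus with two flat end caps, under a uniform external pressure, and sum the pressure force over the whole closed surface. All the numbers (bend radius R, tube radius a, pressure p) are made up for illustration.

```python
import math

# Net force of a uniform external pressure p on a CLOSED quarter-torus tube.
# Illustrative (hypothetical) dimensions: bend radius R, tube radius a.
R, a, p = 10.0, 1.0, 1.0

# Lateral (curved) surface, parametrized by theta along the bend (0..pi/2)
# and phi around the tube cross-section (0..2*pi).
n_t, n_p = 400, 400
dth = (math.pi / 2) / n_t
dph = (2 * math.pi) / n_p
Fx = Fy = Fz = 0.0
for i in range(n_t):
    th = (i + 0.5) * dth
    for j in range(n_p):
        ph = (j + 0.5) * dph
        dA = a * (R + a * math.cos(ph)) * dth * dph  # torus area element
        # Outward unit normal of the torus surface:
        nx = math.cos(ph) * math.cos(th)
        ny = math.cos(ph) * math.sin(th)
        nz = math.sin(ph)
        # External pressure pushes inward along the normal: dF = -p * n * dA
        Fx -= p * nx * dA
        Fy -= p * ny * dA
        Fz -= p * nz * dA

lateral = (Fx, Fy, Fz)  # net force on the inner + outer curved walls

# End caps: flat disks of area pi*a^2, with outward normals (0,-1,0) at
# theta = 0 and (-1,0,0) at theta = pi/2; pressure pushes each one inward.
cap_area = math.pi * a * a
caps = (p * cap_area, p * cap_area, 0.0)

total = tuple(l + c for l, c in zip(lateral, caps))
print("lateral:", lateral)  # approx (-pi, -pi, 0): the 'surface difference' force
print("caps:   ", caps)     # exactly (+pi, +pi, 0)
print("total:  ", total)    # approx (0, 0, 0)
```

The curved walls really do feel a net force (the outer surface wins, as my teacher said), but the end caps contribute an equal and opposite force, so the total on the closed rigid tube is zero, which is what the divergence theorem predicts for a constant pressure on any closed surface.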