Does Leaving Torque Wrenches at High Settings Affect Calibration Accuracy?

SUMMARY

Leaving torque wrenches set at high settings, such as 50-75% of their maximum, does not immediately affect calibration accuracy, as shown by testing on an accratorque rig that found minimal discrepancies. However, the underlying theory suggests that prolonged compression of the spring can lead to wear and creep, eventually compromising the tool's precision. Aerospace engineers and maintenance personnel should continue returning torque wrenches to their lowest settings to ensure long-term accuracy and reliability.

PREREQUISITES
  • Understanding of torque wrench mechanics and calibration
  • Familiarity with spring dynamics and material creep
  • Experience with accratorque testing rigs
  • Knowledge of aerospace engineering maintenance protocols
NEXT STEPS
  • Research the effects of spring creep on torque wrench accuracy
  • Learn about different types of torque wrenches and their calibration methods
  • Explore best practices for maintaining precision tools in aerospace engineering
  • Investigate the specifications and maintenance schedules for accratorque rigs
USEFUL FOR

Aerospace engineers, maintenance technicians, and anyone involved in the calibration and upkeep of precision torque wrenches will benefit from this discussion.

shaun_598
Hi,

I hope this is in the correct section; please feel free to move it, or advise me to move it, if it's in the wrong place.

I work as an aerospace engineer, and a sideline of what I do is maintaining the torque wrenches we use on our equipment. We have a set maintenance schedule: every 3 months we test them to a 10% tolerance using an accratorque rig.

We get told, as I'm sure most others do, that you should return a torque wrench to its lowest setting to prevent it becoming inaccurate. However, I've recently left some of the wrenches at their in-use settings, some at 50-75% of their maximum. I've found that upon testing them at the 3-month interval they showed very little discrepancy, just like the ones returned to their low settings.

I'm curious: what is the theory behind winding them back down? And can anyone explain why there was no difference when I left them set?

Thanks
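The 3-monthly check described above is just a percentage-error comparison against the rig. As a minimal sketch (the function name and values are illustrative, not from any real rig's software):

```python
# Hypothetical pass/fail check for a torque wrench against a calibration
# rig, using the 10% tolerance mentioned above. Values are illustrative.

def within_tolerance(set_torque_nm: float, measured_torque_nm: float,
                     tolerance: float = 0.10) -> bool:
    """Return True if the rig's measured torque is within +/- tolerance
    (as a fraction) of the wrench's set value."""
    error = abs(measured_torque_nm - set_torque_nm) / set_torque_nm
    return error <= tolerance

# Wrench set to 50 N·m, rig reads 53.5 N·m -> 7% error, passes.
print(within_tolerance(50.0, 53.5))  # True
# Rig reads 56.0 N·m -> 12% error, fails.
print(within_tolerance(50.0, 56.0))  # False
```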
 
It depends on the type of wrench, but in general they use springs, chosen for their relatively consistent spring constant, to set the torque. By leaving the spring compressed to, say, 50-60% of its maximum rated setting, you are leaving that spring under compression for an extended period of time.

Springs wear out, and are subject to creep just like any other material. Leaving the spring compressed for extended periods of time (or many separate periods) will change the spring's characteristics. This will eventually cause the wrench to become inaccurate, and likely very difficult even to calibrate properly.

It's not going to happen instantly, as you've seen after leaving yours set to torque, but these are "precision" tools and you want to take care of them.
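To see why creep matters, note that in a simple click-type mechanism the dial fixes the spring deflection, and the release torque scales roughly with the spring constant. A rough sketch of that reasoning (an illustrative model, not a validated materials calculation):

```python
# Illustrative model: if the dial fixes the spring deflection x and the
# release torque scales as T ~ k*x, then a fractional change in the
# spring constant k shows up as the same fractional error in torque.

def applied_torque(dial_setting_nm: float, k_ratio: float) -> float:
    """Torque delivered at a given dial setting when the spring constant
    has drifted to k_ratio of its as-calibrated value (1.0 = no drift)."""
    return dial_setting_nm * k_ratio

# A 3% loss of stiffness from creep at a 50 N·m setting:
t = applied_torque(50.0, 0.97)
print(t)                     # ~48.5 N·m delivered
print(abs(t - 50.0) / 50.0)  # 3% error: still inside a 10% tolerance band
```

This is consistent with what the original poster saw: a few months of creep produces drift far smaller than the 10% tolerance, so the wrenches still pass, even though the spring is slowly degrading.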
 
