How to select the required accuracy of an instrument?

SUMMARY

To determine the required accuracy of a measuring instrument for component dimensions, especially when tolerances are unspecified, a common rule of thumb is to aim for a Test Accuracy Ratio (TAR) of 4:1 between the tolerance and the instrument's accuracy. For example, if a valve stem diameter is specified as 5.973 mm and a tolerance of ±0.0005 mm is assumed, the instrument needs an accuracy of at least ±0.000125 mm. The GUM (Guide to the Expression of Uncertainty in Measurement) is a critical resource for understanding measurement accuracy and uncertainty. Additionally, standard tolerances in the machining industry provide benchmarks when no specific tolerance is given.

PREREQUISITES
  • Understanding of Test Accuracy Ratio (TAR)
  • Familiarity with GUM: Guide to the Expression of Uncertainty in Measurement
  • Knowledge of standard tolerances in the machining industry
  • Basic principles of measurement traceability
NEXT STEPS
  • Research the GUM: Guide to the Expression of Uncertainty in Measurement for detailed methodologies.
  • Explore standard tolerances in the machining industry to understand common practices.
  • Learn about measurement traceability and its importance in precision engineering.
  • Investigate the impact of environmental factors, such as temperature, on measurement accuracy.
USEFUL FOR

Engineers, precision measurement technicians, quality control professionals, and anyone involved in component manufacturing and compliance testing will benefit from this discussion.

fonz
TL;DR: Instrument accuracy and precision
If I need to make a measurement to check compliance, e.g. measuring component dimensions, how do I know what accuracy is required for the measuring instrument if the tolerance is not specified?

For example, during an engine rebuild, the manufacturer specifies the valve stem diameter to be 5.973 mm. Clearly the instrument needs a resolution of at least 0.001 mm, but is that enough?

I am aware that in most cases, where a tolerance is specified, it is typical to aim for a Test Accuracy Ratio (TAR) of 4:1. In this case the tolerance is not specified, so is it correct to assume the tolerance is ±0.0005 mm, and therefore that the instrument must have an accuracy of at least ±0.000125 mm?
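
As a quick sanity check on that arithmetic, here is a minimal sketch (Python; the helper name and the ±0.0005 mm tolerance are this post's assumptions, not a manufacturer spec):

```python
# Minimal sketch: instrument accuracy needed for a given tolerance
# at a chosen Test Accuracy Ratio (TAR). The tolerance below is the
# assumed +/-0.0005 mm from this post, not a specified value.

def required_accuracy(tolerance_mm: float, tar: float = 4.0) -> float:
    """Accuracy needed so that tolerance / accuracy >= TAR."""
    return tolerance_mm / tar

assumed_tolerance = 0.0005                    # +/- mm, assumed
print(required_accuracy(assumed_tolerance))   # 0.000125 -> +/-0.000125 mm
```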

Thanks
 
Paging @Ranger Mike, he probably has more direct knowledge than the below.

This will probably answer more questions than you thought existed!

GUM: Guide to the Expression of Uncertainty in Measurement


https://www.bipm.org/documents/2012...f-3f85-4dcd86f77bd6?version=1.7&download=true

(above found with:
https://www.google.com/search?&q=G.U.M.+guide+to+measurement)

Have fun, it's 'only' 134 pages.

Cheers,
Tom

p.s. It has been years since I read that document, but your accuracy assumption seems reasonable for a 'failsafe' (can't tolerate ANY failure) situation.
 
There may be more tolerance in one direction than in the other, too. E.g. a machine screw that's a tiny bit too small could be tolerably loose for its intended purpose, but one that's too big by the same amount might just plain not fit without damaging the threads. I think that for engine rebuild purposes, it would be helpful if the spec sheet were to state the absolute and recommended tolerances along with the target value. @Mark44 has a great deal of actual experience in rebuilding engines (e.g. see this thread; I think that for him it's more of a personal pursuit than a primary profession, as he's also a programming expert and professor), and he may have some insights to offer in this matter, as he's a very insightful and helpful person . . . :wink:
 
If it is a part that is supposed to adhere to a standard (or at least a technical specification), the answer would be to look up what the standards document says.
Yes, you can make an assumption based just on the precision they are quoting, but the full standard is also likely to specify exactly HOW the measurement should be done and in what environment (in this case temperature would probably play a role).
Also, don't forget about the traceability of the instruments.
 
I realize your valve stem is more of an example, but... The important feature of the valve stem diameter is the clearance between the stem and the valve guide. Typical required clearance is on the order of a thousandth of an inch; say 0.001 inch or 0.025 mm, nowhere near the 0.001 mm discussed. As @f95toli said, a diameter spec to 0.001 mm (0.00004 inch) would need a corresponding temperature value (holding the valve in your hand would warm it and change the diameter by more than 0.001 mm).
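
To put a rough number on that hand-warming point, here is a minimal sketch (Python) of linear thermal expansion; the expansion coefficient and temperature rise are assumed illustrative values, not figures from the thread:

```python
# Rough estimate of how much handling warms a steel valve stem and
# changes its diameter: delta_d = alpha * d * delta_T.

alpha_steel = 12e-6    # 1/K, typical linear expansion coefficient for steel (assumed)
diameter_mm = 5.973    # mm, nominal stem diameter from the example
delta_temp = 15.0      # K, rough warming from being held in the hand (assumed)

delta_d = alpha_steel * diameter_mm * delta_temp
print(f"diameter change: {delta_d:.5f} mm")   # ~0.00108 mm, i.e. more than 0.001 mm
```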
 
When no other tolerances are provided, the machining industry uses the following standard tolerances:[3][4]

1 decimal place (.x): ±0.2"
2 decimal places (.0x): ±0.01"
3 decimal places (.00x): ±0.005"
4 decimal places (.000x): ±0.0005"
See https://en.wikipedia.org/wiki/Engineering_tolerance

So it would seem that for your case the tolerance is ±0.005 mm.
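
As a quick check of that conclusion, here is a minimal sketch (Python) that maps the decimal places of the spec to the standard tolerance above and applies the 4:1 TAR from the opening post, reading the table values in millimetres as this post does:

```python
# Sketch: infer the implied tolerance from the number of decimal places
# in the spec, then derive the instrument accuracy for a 4:1 TAR.

standard_tolerance = {1: 0.2, 2: 0.01, 3: 0.005, 4: 0.0005}  # +/- per decimal place count

spec = "5.973"                            # the valve stem example, read here in mm
places = len(spec.split(".")[1])          # 3 decimal places
tolerance = standard_tolerance[places]    # +/-0.005 mm (assumed metric reading)
accuracy = tolerance / 4.0                # 4:1 TAR -> +/-0.00125 mm
print(f"tolerance: +/-{tolerance} mm, required accuracy: +/-{accuracy} mm")
```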
 
