How to select the required accuracy of an instrument?

  • Thread starter fonz
  • #1
Summary:
Instrument accuracy and precision
If I need to make a measurement to check compliance, e.g. measuring component dimensions, how do I know what accuracy is required of the measuring instrument if the tolerance is not specified?

For example, during an engine rebuild, the manufacturer specifies the valve stem diameter to be 5.973mm. Clearly the instrument needs a resolution of at least 0.001mm, but is that enough?

I am aware that, in most cases where a tolerance is specified, it is typical to aim for a Test Accuracy Ratio (TAR) of 4:1. In this case the tolerance is not specified, so is it correct to assume the tolerance is +/-0.0005mm, and therefore that the instrument must have an accuracy of at least +/-0.000125mm?
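The reasoning above can be sketched numerically. This is a hypothetical helper, assuming the common drafting convention that an unstated tolerance is +/- half the last stated digit, and a 4:1 TAR; neither assumption is universal.

```python
# Sketch of the implied-tolerance / TAR reasoning above.
# Assumptions: an unstated tolerance is +/- half of the last
# significant digit of the specified value (a common drafting
# convention, not a universal rule), and a 4:1 Test Accuracy Ratio.

def required_accuracy(spec_mm: str, tar: float = 4.0) -> tuple:
    """Return (implied_tolerance, required_instrument_accuracy) in mm."""
    decimals = len(spec_mm.split(".")[1]) if "." in spec_mm else 0
    implied_tol = 0.5 * 10 ** (-decimals)  # +/- half the last digit
    accuracy = implied_tol / tar           # TAR = tolerance / accuracy
    return implied_tol, accuracy

tol, acc = required_accuracy("5.973")
print(f"implied tolerance: +/-{tol} mm")   # +/-0.0005 mm
print(f"required accuracy: +/-{acc} mm")   # +/-0.000125 mm
```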

Thanks
 

Answers and Replies

  • #2
Tom.G
Science Advisor
Paging @Ranger Mike, he probably has more direct knowledge than the below.

This will probably answer more questions than you thought existed!


GUM: Guide to the Expression of Uncertainty in Measurement


https://www.bipm.org/documents/2012...f-3f85-4dcd86f77bd6?version=1.7&download=true

(above found with:
https://www.google.com/search?&q=G.U.M.+guide+to+measurement)

Have fun, it's 'only' 134 pages.

Cheers,
Tom

p.s. It has been years since I read that document, but your accuracy assumption seems reasonable for a 'failsafe', can't tolerate ANY failure, situation.
 
  • #3
There may be more tolerance in one direction than in the other. For example, a machine screw that's a tiny bit too small could be tolerably loose for its intended purpose, but one that's too big by the same amount might just plain not fit without damaging the threads. I think that for engine rebuild purposes, it would be helpful if the spec sheet were to state the absolute and recommended tolerances along with the target value. @Mark44 has a great deal of actual experience rebuilding engines (e.g. see this thread; I think for him it's more a personal pursuit than a primary profession, as he's also a programming expert and professor), and he may have some insights to offer in this matter, as he's a very insightful and helpful person. :wink:
 
  • #4
f95toli
Science Advisor
Gold Member
If it is a part that is supposed to adhere to a standard (or at least a technical specification), the answer would be to look up what the standards document says.
Yes, you can make assumptions based just on the accuracy they are requesting, but the full standard is also likely to specify exactly HOW the measurement should be done and in what environment (in this case temperature would probably play a role).
Also, don't forget about the traceability of the instruments.
 
  • #5
gmax137
Science Advisor
I realize your valve stem is more of an example, but... The important feature of the valve stem diameter is the clearance between the stem and the valve guide. Typical required clearance is on the order of a thousandth of an inch; say 0.001 inch or 0.025 mm. Nowhere near the 0.001 mm discussed. As @f95toli said, a diameter spec to 0.001 mm (0.00004 inch) would need a corresponding temperature value (holding the valve in your hand would warm the valve and change the diameter by more than 0.001 mm).
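The hand-warming point above can be checked with a quick linear-expansion estimate. This is a sketch under assumed values: a typical steel expansion coefficient of about 12e-6 per kelvin (it varies by alloy) and roughly 14 K of warming from handling.

```python
# Rough check of the hand-warming effect described above.
# Assumptions: linear thermal expansion, alpha ~ 12e-6 / K for a
# typical steel (alloy-dependent), warming from ~20 C to ~34 C.

alpha = 12e-6         # 1/K, assumed typical steel value
d0 = 5.973            # mm, specified stem diameter
dT = 14.0             # K, assumed warming from handling

dd = alpha * d0 * dT  # linear expansion: delta_d = alpha * d0 * dT
print(f"diameter change: {dd * 1000:.2f} um")  # ~1 um, i.e. ~0.001 mm
```

So handling alone shifts the diameter by about the same amount as the finest digit of the spec, which is why a spec at that level needs a stated reference temperature.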
 
  • #6
gleem
Science Advisor
Education Advisor
