Instrument Calibration Intervals: Self-Calibration Accepted?

AI Thread Summary
Calibration intervals for laboratory instruments, such as DMMs and oscilloscopes, are typically set at one year to balance the risks of over-calibration and under-calibration, as components can degrade significantly within that time. Self-calibration using a higher accuracy bench DMM is generally not accepted in most industries, as businesses must provide proof of calibration from certified labs. The need for traceability to international standards is crucial, particularly in regulated fields like pharmaceuticals and food safety. While some applications may allow for less stringent checks, most industries require documented calibration certificates for compliance. Overall, maintaining accurate calibration is essential for ensuring instrument reliability and adherence to industry standards.
likephysics
Just out of curiosity, why is the calibration interval 1 year for lab instruments like DMMs, scopes, etc.? Why not 2 years?

Also, you use a higher accuracy bench DMM to calibrate a handheld DMM.

Instead of getting calibration done by an outside agency, why not calibrate the handheld DMM yourself using an already calibrated bench DMM?
I am trying to ask if "self-calibration" is accepted in the industry.
 
Calibration is normally done using very high precision signal, voltage and current sources, etc. (depending on the instrument being calibrated).
This is why it generally costs a lot of money to get signal generators, spectrum analysers, oscilloscopes, etc. calibrated by a certified calibration lab.
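A rough way to sanity-check the "higher accuracy reference" idea is the test uncertainty ratio (TUR): the worst-case error of the reference meter should be several times smaller than that of the unit under test (a 4:1 ratio is a common rule of thumb). Here is a minimal Python sketch; the "% of reading + counts" accuracy figures are hypothetical placeholders, not taken from any real datasheet.

Code:
def accuracy_limit(reading, pct_of_reading, counts, resolution):
    """Worst-case error for a '% of reading + counts' style DMM spec."""
    return reading * pct_of_reading / 100.0 + counts * resolution

reading = 10.000  # volts being checked

# Unit under test: hypothetical handheld DMM spec of 0.5% + 2 counts at 1 mV resolution
uut_limit = accuracy_limit(reading, 0.5, 2, 0.001)

# Reference: hypothetical bench DMM spec of 0.01% + 5 counts at 10 uV resolution
ref_limit = accuracy_limit(reading, 0.01, 5, 0.00001)

tur = uut_limit / ref_limit
print(f"UUT limit: +/-{uut_limit * 1000:.2f} mV")
print(f"Ref limit: +/-{ref_limit * 1000:.2f} mV")
print(f"TUR: {tur:.1f}:1 ({'OK' if tur >= 4 else 'too low'} against a 4:1 rule of thumb)")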

I guess a 1 yr period was chosen as a tradeoff between calibrating too often and not often enough.
Components can age a lot in 12 months and cause the calibration to become very poor,
and as the instrument gets older the components will likely age even faster.
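To put a rough number on that aging argument: if you assume a steady drift rate, you can estimate how long an instrument stays inside its accuracy spec between calibrations. The drift rate and tolerance below are made-up illustrative values, not any manufacturer's figures.

Code:
# Back-of-envelope estimate of how long an instrument stays within tolerance,
# assuming a constant drift rate. Both numbers are illustrative, not real specs.
drift_ppm_per_year = 35.0   # assumed long-term drift of the reference element
tolerance_ppm = 50.0        # assumed accuracy spec we need to stay inside

years_to_limit = tolerance_ppm / drift_ppm_per_year
print(f"Reaches the tolerance limit in roughly {years_to_limit:.1f} years "
      f"at {drift_ppm_per_year} ppm/year of drift")
# With numbers like these, a 2-year interval risks drifting out of spec between
# calibrations, while a 1-year interval leaves a comfortable margin.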

likephysics said:
Instead of getting calibration done by an outside agency, why not calibrate the handheld DMM yourself using an already calibrated bench DMM?
I am trying to ask if "self-calibration" is accepted in the industry.

Generally ... no, it wouldn't be. Very few labs would have the precision test equipment I mentioned above. Any business that owns instruments that need regular calibration will be required to show proof of an independently issued calibration certificate.

I'm a certified Trimble service technician, and at the company I work for I do calibration certificates on optical surveying equipment for our customers. Those customers can be required to produce a cal cert to a project manager that they may be contracting to.

Dave
 
likephysics said:
Just out of curiosity, why is the calibration interval 1 year for lab instruments like DMMs, scopes, etc.? Why not 2 years?

Also, you use a higher accuracy bench DMM to calibrate a handheld DMM.

Instead of getting calibration done by an outside agency, why not calibrate the handheld DMM yourself using an already calibrated bench DMM?
I am trying to ask if "self-calibration" is accepted in the industry.

It depends upon what 'the industry' needs for a particular application and on the particular circumstances. Sometimes it is quite sufficient to check that results from two locally available instruments agree to within the required limits. You wouldn't often need to check the mains voltage to 0.1% accuracy, for instance.
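For an informal cross-check like that, the acceptance test is simply whether the two readings agree within whatever allowance the job requires. A small sketch in Python; the readings and the allowed difference are hypothetical example values.

Code:
def agrees(reading_a, reading_b, allowed_difference):
    """True if the two readings differ by no more than the allowed amount."""
    return abs(reading_a - reading_b) <= allowed_difference

mains_a = 229.8        # volts, handheld DMM
mains_b = 230.6        # volts, bench DMM
limit = 230.0 * 0.01   # say the job only needs agreement within 1% of nominal

print("Agree within limit" if agrees(mains_a, mains_b, limit) else "Disagree - investigate")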
 
Equipment calibration is a huge issue when it comes to the Food & Drug Administration.

http://www.fda.gov/ScienceResearch/FieldScience/LaboratoryManual/ucm171880.htm

This procedure specifies the schedule and requirements for maintenance, performance, calibration, and verification of laboratory testing equipment. Meeting the criteria in this procedure demonstrates control of the maintenance and calibration parameters needed to achieve the accuracy of instruments used for analytical testing.

http://www.fda.gov/ScienceResearch/FieldScience/ucm171821.htm

The program for calibration of equipment demands that calibrations and measurements made by the laboratory are traceable to the International System of Units.
 
sophiecentaur said:
It depends upon what 'the industry' needs for a particular application and on the particular circumstances. Sometimes it is quite sufficient to check that results from two locally available instruments agree to within the required limits. You wouldn't often need to check the mains voltage to 0.1% accuracy, for instance.

Yes.

I was the calibration technician for a manufacturing facility a few decades ago. I did calibrations for electrical test instruments including voltmeters, wattmeters, power analyzers and many more. We had what we called the "in-house standards", which were used only for calibrating the other instruments. The in-house standards were generally of higher stability and accuracy than the normally used instruments, and they were calibrated every six months by an outside agency (or it may have been once a year for some instruments; my memory fails me). A calibration certificate was also issued by the outside agency as proof of traceability for each in-house standard.

The frequency of calibration for the instruments used in manufacturing varied, but most of them were checked every month. A log was maintained for each manufacturing instrument and a sticker was placed on it after it was checked.

However, as sophiecentaur said, all of this may vary depending on the industry. The most important thing, and the one that almost all industries will have in common, is the traceability requirement.
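The record-keeping side of this (due dates, certificate numbers, and what each instrument traces back to) is simple to sketch. A minimal illustration in Python; all the instrument names, intervals, dates, and certificate IDs are made-up examples.

Code:
from datetime import date, timedelta

# Minimal calibration log: each instrument's interval, last cal date,
# and the certificate/standard it traces to. All entries are made-up examples.
instruments = [
    # name,             interval (days), last calibrated,   traceability note
    ("Handheld DMM #3",  365, date(2024, 5, 10), "in-house standard, cert IH-0042"),
    ("Bench DMM (std)",  180, date(2024, 9, 1),  "external lab, cert XL-1187"),
    ("Wattmeter #1",      30, date(2025, 1, 15), "in-house standard, cert IH-0042"),
]

today = date(2025, 2, 1)
for name, interval_days, last_cal, trace in instruments:
    due = last_cal + timedelta(days=interval_days)
    status = "OVERDUE" if due < today else f"due {due.isoformat()}"
    print(f"{name:18s} {status:16s} traceable via {trace}")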
 