RTD sensors calibration equipment

AI Thread Summary
The discussion centers on the calibration of RTD sensors, specifically the Pt100 type, in a mass spectrometry lab. The user plans to calibrate one sensor at a well-equipped lab to create a reference for further calibrations. They discuss using fixed points like ice baths and boiling water for calibration, noting that PT100 sensors are linear and reproducible within 0 to 100 degrees Celsius. The importance of allowing sufficient time for temperature stabilization during calibration is emphasized, along with the need for a stable setup to ensure accurate measurements. Overall, the user is seeking practical solutions for effective RTD sensor calibration within their lab constraints.
Tomcat
Hello !

This is my first post on Physics Forums. I am an electronics technician employed in a mass spectrometry laboratory at the RBI Institute in Zagreb, Croatia.

My question is related to the calibration of RTD sensors, which I currently must do in another lab or division. I have developed a prototype of an RTD-based measuring device (built around a 0.1%, class B Pt100 sensor), and there is a possibility that I will need to make more of these. Is it feasible to construct RTD calibration equipment myself? Is there a reliable design for a calibration device or cell that I could build, keeping in mind that although I work in a chemistry lab, for this project I am confined, in terms of expenses and equipment, to my electronics workroom?

Thank you !

Tom
 
For which temperature range?

I recently discussed something similar with people who work in this area. Pt100 sensors are -from what I understand- extremely linear and reproducible, so you can get away with using a couple of fixed points and then interpolating. If you are working around room temperature, this means an ice bath for 0 degrees C and boiling water for 100 degrees C. The only intermediate fixed point is the melting point of gallium (29.76 degrees C), which is probably not practical in a normal lab setting.

That said, from a practical point of view you can just as well get ONE properly calibrated sensor and then -using a water bath- use it to calibrate the rest of your sensors.
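The "one calibrated reference, two fixed points" idea boils down to a simple linear mapping. A minimal sketch (all readings below are invented for illustration):

```python
# Two-point calibration sketch: map a sensor's raw readings onto the
# true scale using measurements at two fixed points (ice bath ~0 °C,
# boiling water ~100 °C). The numeric readings here are made up.

def two_point_cal(raw_low, raw_high, ref_low=0.0, ref_high=100.0):
    """Return a function that linearly corrects raw readings.

    raw_low / raw_high: what the uncalibrated sensor reported at the
    two fixed points; ref_low / ref_high: the true temperatures there.
    """
    scale = (ref_high - ref_low) / (raw_high - raw_low)
    return lambda raw: ref_low + (raw - raw_low) * scale

# Hypothetical sensor that reads 0.3 °C in the ice bath and 99.6 °C
# in boiling water:
correct = two_point_cal(raw_low=0.3, raw_high=99.6)
print(round(correct(0.3), 3))    # → 0.0   (ice point maps back)
print(round(correct(99.6), 3))   # → 100.0 (boiling point maps back)
print(round(correct(50.0), 2))   # intermediate reading, interpolated
```

Between the fixed points the correction is pure interpolation, which is exactly why the approach relies on the Pt100 being close to linear over 0–100 °C.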
 
f95toli said:
For which temperature range?

I recently discussed something similar with people who work in this area. Pt100 sensors are -from what I understand- extremely linear and reproducible, so you can get away with using a couple of fixed points and then interpolating. If you are working around room temperature, this means an ice bath for 0 degrees C and boiling water for 100 degrees C. The only intermediate fixed point is the melting point of gallium (29.76 degrees C), which is probably not practical in a normal lab setting.

That said, from a practical point of view you can just as well get ONE properly calibrated sensor and then -using a water bath- use it to calibrate the rest of your sensors.


Thank you for your answer. This is actually what I discussed with my colleagues, and I am glad you share the same opinion - we will have ONE sensor calibrated at our Institute of Physics (they have the equipment and standards). By calibrating it against their standard RTDs, we can make our own reference sensor.

Now, since we won't go above 100 degrees C (the limit below which a Pt100 is considered practically linear), we currently use only this formula in our microcontroller to "fit" the ADC data to the RTD characteristic curve:

http://www.mosaic-industries.com/em...inum-rtd-sensors/resistance-calibration-table

T = (R/R0 - 1)/α - 0.19

Of course, this works only if all the RTDs fit this curve. So if I obtain a class A Pt100 (tolerance ±(0.15 + 0.002·|t|) °C per IEC 60751, i.e. ±0.15 °C at 0 °C), and fit the ADC data according to the above expression, I should get very good accuracy. If you don't agree, please correct me. :)
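The linear formula can be sanity-checked against the exact Callendar-Van Dusen inverse. A quick check, assuming the standard IEC 60751 coefficients for a 0.00385 Pt100 (the -0.19 offset roughly centres the linearisation error over 0–100 °C):

```python
# Compare the firmware's linear approximation,
#   T = (R/R0 - 1)/alpha - 0.19,
# against the exact Callendar-Van Dusen equation for t >= 0 °C,
# using the standard IEC 60751 coefficients for a 0.00385 Pt100.

R0 = 100.0        # ohms at 0 °C
ALPHA = 0.00385   # mean slope over 0..100 °C
A = 3.9083e-3     # IEC 60751 coefficient
B = -5.775e-7     # IEC 60751 coefficient

def r_cvd(t):
    """Resistance of an ideal Pt100 at t °C (t >= 0), per IEC 60751."""
    return R0 * (1.0 + A * t + B * t * t)

def t_linear(r):
    """Linear fit used in the firmware (offset centres the error)."""
    return (r / R0 - 1.0) / ALPHA - 0.19

# Worst-case deviation of the linear fit over 0..100 °C:
max_err = max(abs(t_linear(r_cvd(t)) - t) for t in range(0, 101))
print(f"max linearisation error: {max_err:.2f} °C")  # → 0.19 °C
```

So even for an ideal sensor, the linear fit itself contributes roughly ±0.19 °C, on top of the sensor's tolerance class; whether that is acceptable depends on the target accuracy.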

But I will follow your advice and use an ice bath to check each RTD before sending it for calibration in the equipped lab.

Obviously, building our own calibration equipment is too much.

Thanks !

Tom
 
When performing temperature calibrations while working for a pharmaceutical company, where accuracy and traceability to NIST was an FDA requirement, I used this http://www.kayeinstruments.com/validationproducts/irtd.htm as a standard. It is quite expensive, however.

http://www.kayeinstruments.com/img/cd_calibration_irtd2.jpg

For a temperature reference, I used one of these http://www.kayeinstruments.com/validationproducts/drywell.htm.

http://www.kayeinstruments.com/img/cd_calibration_irtd_2.jpg
 
dlgoff said:
When performing temperature calibrations while working for a pharmaceutical company, where accuracy and traceability to NIST was an FDA requirement, I used this http://www.kayeinstruments.com/validationproducts/irtd.htm as a standard. It is quite expensive, however.

http://www.kayeinstruments.com/img/cd_calibration_irtd2.jpg

For a temperature reference, I used one of these http://www.kayeinstruments.com/validationproducts/drywell.htm.

http://www.kayeinstruments.com/img/cd_calibration_irtd_2.jpg

Thank you very much for this info. These are obviously excellent instruments. The standard RTD probe from GE would be the most suitable for our purpose. Even though it is expensive, I now have a good reference for further searching.

Tom
 
f95toli said:
For which temperature range?

I recently discussed something similar with people who work in this area. Pt100 sensors are -from what I understand- extremely linear and reproducible, so you can get away with using a couple of fixed points and then interpolating. If you are working around room temperature, this means an ice bath for 0 degrees C and boiling water for 100 degrees C. The only intermediate fixed point is the melting point of gallium (29.76 degrees C), which is probably not practical in a normal lab setting.

That said, from a practical point of view you can just as well get ONE properly calibrated sensor and then -using a water bath- use it to calibrate the rest of your sensors.

Hello.

I just did an ice bath measurement using a small Dewar flask, but due to the nature of the sensor, i.e. its proximity to the PCB, I was not able to stir the ice/water mixture continuously. Although the results show some instability, there was a period of nicely clustered values from 0.12 to 0.15 degrees C, which was most certainly a period of ideal mixture behaviour (the other readings were very erratic, and these good values were measured immediately after stirring the mixture). So, I guess the sensor checks out; now we need to do some fine tuning in a better-equipped lab.

Thank you for your help.



Tom
 
Tomcat said:
Although results show some instability, there was period of very nice distributed values from 0.12 to 0.15 degrees C ...
You need to allow plenty of time for the temperature source to equilibrate after you reach the desired value (using a dry block or similar controlled source). I would allow at least 30 minutes to guarantee stability at each calibration point. Here again, I'm talking about using the source and standard that I referenced.
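One way to make "wait until it is stable" objective is to accept a calibration point only once the recent readings stay within a small band. A minimal sketch, with invented data and an arbitrary 0.05 °C band:

```python
# Stability check for deciding when the temperature source has
# equilibrated: accept a calibration point only when the last N
# readings span less than a chosen band. The data below are invented.

def is_stable(readings, window=10, band=0.05):
    """True if the last `window` readings span less than `band` °C."""
    if len(readings) < window:
        return False
    tail = readings[-window:]
    return max(tail) - min(tail) < band

settling = [0.95, 0.60, 0.38, 0.25, 0.19, 0.16, 0.15, 0.14, 0.14, 0.13,
            0.13, 0.13, 0.14, 0.13, 0.13, 0.13, 0.14, 0.13, 0.13, 0.13]
print(is_stable(settling[:8]))   # → False (still settling)
print(is_stable(settling))       # → True  (last 10 span 0.01 °C)
```

The window length and band would be tuned to the source's time constant; for a dry block, a window covering several minutes of samples is more realistic than 10 points.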
 
dlgoff said:
You need to allow plenty of time for the temperature source to equilibrate after you reach the desired value (using a dry block or similar controlled source). I would allow at least 30 minutes to guarantee stability at each calibration point. Here again, I'm talking about using the source and standard that I referenced.

You are right, I'll repeat the procedure. Of course, I'll need to make some adjustments (fix the RTD and electronics on a laboratory stand, and rig a system to slowly stir the ice bath mixture while measuring). The problem is that the RTD is intended to sit very close to the electronics circuit board (ca. 10 mm), and the RTD itself is very small, so immersion should be done very carefully. So that is the next step - make a fixed setup, let it stabilize, and log data the whole time.
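The "log data the whole time" step can be as simple as timestamped readings appended to a CSV file. A sketch, where read_rtd() is only a stand-in for the real ADC/RTD readout:

```python
# Minimal logging sketch for the fixed-setup ice bath run:
# timestamped readings written to CSV. read_rtd() is a placeholder
# for the actual ADC conversion on the prototype.

import csv
import random
import time

def read_rtd():
    # Placeholder: replace with the real ADC readout + conversion.
    return 0.13 + random.uniform(-0.01, 0.01)

def log_readings(path, n_samples=5, period_s=0.0):
    """Append n_samples timestamped readings to a CSV file."""
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["unix_time", "temp_C"])
        for _ in range(n_samples):
            w.writerow([f"{time.time():.1f}", f"{read_rtd():.3f}"])
            time.sleep(period_s)

log_readings("icebath_log.csv")
with open("icebath_log.csv") as f:
    rows = list(csv.reader(f))
print(len(rows))  # → 6 (header + 5 samples)
```

With the full log on disk, the stable windows (and the erratic periods between stirrings) can be picked out after the fact instead of judged by eye during the run.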

Tom
 