Troubleshooting Turbine Flowmeter Calibration

SUMMARY

The discussion focuses on calibrating turbine flowmeters for fuel rigs, using a calibrated "master" flowmeter in series with an uncalibrated turbine flowmeter. The current calibration process takes 20 data points (10 ascending and 10 descending) and applies the least-squares method to determine gain and offset. The poster wants to improve the process by incorporating the Grubbs outlier test and by calculating uncertainty from the standard deviation (SD) of the readings. The calibration must remain straightforward for shop floor personnel, utilizing LabVIEW for data capture and analysis.

PREREQUISITES
  • Understanding of turbine flowmeter calibration processes
  • Familiarity with least squares regression analysis
  • Knowledge of the Grubbs outlier detection method
  • Experience with LabVIEW for data acquisition and plotting
NEXT STEPS
  • Research the Grubbs outlier method for identifying outliers in calibration data
  • Learn how to calculate uncertainty using standard deviation in calibration processes
  • Explore advanced calibration techniques for turbine flowmeters
  • Investigate best practices for using LabVIEW in flowmeter data analysis
USEFUL FOR

Calibration engineers, quality assurance personnel, and technicians involved in flowmeter calibration and performance optimization in industrial settings.

robsmith82
Hi,

I'm having a lot of trouble trying to set up a process for calibrating turbine flowmeters for use in fuel rigs.

Basically, I have a calibrated "master" flowmeter in series with the uncalibrated turbine flowmeter. I need a process that will give me the line of best fit across the flowmeter's range, and also state its uncertainty to a given confidence level (probably 95%).

At the minute, our process isn't very good. We take 10 points going up the range and 10 coming down, then fit a line of best fit by the least-squares method, which gives you the gain and offset for calibration. We then say the calibration has failed if any point is more than 0.5% of full scale from the line. We allow for outliers by permitting up to 2 points between 0.5% and 1%, provided they aren't at the top or bottom limits and aren't consecutive. I think we should be using the Grubbs outlier test instead.
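To make the arithmetic concrete, here is a minimal Python sketch of that process (the same logic could be built in LabVIEW): a least-squares fit with the 0.5%-of-full-scale residual check, plus a two-sided Grubbs test on the fit residuals. The function names, argument layout, and the 5% significance level are my own illustrative choices, not part of any standard rig procedure.

```python
import numpy as np
from scipy import stats

def calibrate(master, dut, full_scale, tol=0.005):
    """Fit master = gain * dut + offset by least squares and check
    every residual against a tolerance (fraction of full scale)."""
    gain, offset = np.polyfit(dut, master, 1)
    residuals = master - (gain * dut + offset)
    passed = bool(np.all(np.abs(residuals) <= tol * full_scale))
    return gain, offset, residuals, passed

def grubbs_outlier(residuals, alpha=0.05):
    """Two-sided Grubbs test: is the single most extreme residual
    an outlier at significance level alpha?"""
    n = len(residuals)
    mean, sd = residuals.mean(), residuals.std(ddof=1)
    G = np.max(np.abs(residuals - mean)) / sd
    # Critical value from the t-distribution (standard Grubbs formula)
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    G_crit = ((n - 1) / np.sqrt(n)) * np.sqrt(t**2 / (n - 2 + t**2))
    return G > G_crit
```

Note that Grubbs assumes the residuals are roughly normal and, in its basic form, tests only the single worst point; if it flags an outlier, you would remove that point, refit, and re-test.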

What I think I have to do is find the mean and SD at each value up the range, then use the worst-case SD to calculate the uncertainty. This needs to be carried out by shop floor personnel, so it can't be too complicated. We are using LabVIEW to capture the data, plot the line, and show "pass" or "fail".
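The mean-and-SD step above amounts to a Type A uncertainty evaluation at each point. A small Python sketch, assuming n repeated readings per point and a coverage factor taken from the t-distribution (for small n, this is more defensible than the common k = 2 shortcut); the function name and the 5-reading example are illustrative only:

```python
import numpy as np
from scipy import stats

def point_uncertainty(readings, confidence=0.95):
    """Mean and expanded uncertainty of the mean at one calibration
    point, from n repeated readings (Type A evaluation)."""
    readings = np.asarray(readings, dtype=float)
    n = len(readings)
    mean = readings.mean()
    sd = readings.std(ddof=1)            # sample standard deviation
    u = sd / np.sqrt(n)                  # standard uncertainty of the mean
    k = stats.t.ppf(0.5 + confidence / 2, n - 1)  # coverage factor
    return mean, k * u
```

Taking the worst-case expanded uncertainty across all points up the range then gives a single figure to quote at 95% confidence, matching the "worst case SD" idea.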

Help me out guys!
 


A 10 point cal is pretty standard for most flow meters. I do believe that you can go so far as to request viscosity calibrations as well. What exactly is it that you are having trouble with?

We use Cox flow meters (among others). They have a nice cal setup for their meters.
http://www.cox-instruments.com/calibration.html
 


What I'm really struggling with is whether I need to repeat each point a number of times, to get a mean and SD for that point so I can find and state the uncertainty, or whether I just take one reading per point.
 
