Hi, I'm having a lot of trouble trying to set up a process for calibrating turbine flowmeters for use in fuel rigs. Basically, I have a calibrated "master" flowmeter in series with the uncalibrated turbine flowmeter. I need a process that will give me the line of best fit across the flowmeter's range, and also state its uncertainty to a given confidence level (probably 95%).

At the minute, our process isn't very good. We take 10 points going up the range and 10 points coming down, then take the line of best fit using the least-squares method, which gives you your gain and offset for calibration. Then we say that if any point is more than 0.5% of full scale from the line, the calibration has failed. We account for outliers by allowing 2 points to fall between 0.5% and 1%, provided they are not at the top or bottom of the range and not consecutive. I think we should be using Grubbs' outlier test instead.

What I think I have to do is find the mean and standard deviation at each setpoint up the range, then use the worst-case SD to calculate the uncertainty. This needs to be carried out by shop-floor personnel, so it can't be too complicated. We are using LabVIEW to capture the data, plot the line, and show "pass" or "fail".

Help me out guys!
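In case it helps to make the question concrete, here is a minimal sketch of the workflow I'm describing, written in Python rather than LabVIEW just because it's easier to post (the statistics translate directly to LabVIEW VIs). The function names, the 0.5%-of-full-scale limit, and the k = 2 coverage factor for ~95% confidence are my own assumptions, not an established procedure:

```python
import numpy as np
from scipy import stats


def fit_calibration(master, dut):
    """Least-squares line of best fit: dut ~ gain * master + offset."""
    gain, offset = np.polyfit(master, dut, 1)
    return gain, offset


def check_calibration(master, dut, full_scale, limit_frac=0.005):
    """Fit the line, then pass/fail each point against a fraction of full scale
    (0.005 = the 0.5% limit in our current procedure)."""
    gain, offset = fit_calibration(master, dut)
    residuals = dut - (gain * master + offset)
    passed = bool(np.all(np.abs(residuals) <= limit_frac * full_scale))
    return gain, offset, residuals, passed


def grubbs_outlier(residuals, alpha=0.05):
    """Two-sided Grubbs' test on the fit residuals.
    Returns the index of a single detected outlier, or None."""
    r = np.asarray(residuals, dtype=float)
    n = len(r)
    if n < 3:
        return None
    dev = np.abs(r - r.mean())
    g = dev.max() / r.std(ddof=1)
    # Critical value from the Student t distribution at significance alpha.
    t = stats.t.ppf(1.0 - alpha / (2.0 * n), n - 2)
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return int(np.argmax(dev)) if g > g_crit else None


def expanded_uncertainty(repeat_readings, k=2.0):
    """Worst-case standard deviation across setpoints, expanded by a coverage
    factor k of roughly 2 for ~95% confidence. This assumes near-normal data
    and a reasonable number of repeats; for very small samples a t-value
    should replace k."""
    worst_sd = max(np.std(r, ddof=1) for r in repeat_readings)
    return k * worst_sd
```

So a run would fit the 20 points, fail the calibration if any residual exceeds 0.5% of full scale, and use `grubbs_outlier` on the residuals instead of the current "2 points between 0.5% and 1%" rule. Does that look like a sensible structure, or is there a more standard way to combine the fit check with the uncertainty statement?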