Hi all,

Preface: I have run several series of dynamic torque tests on different sizes of 1/4-turn butterfly valves. In each test I measure upstream pressure, downstream pressure, valve angle (90° is open, 0° is closed), and torque on a transducer. I know the flow rate, in GPM. I am using LabVIEW's SignalExpress to import this data and put it into Excel tables. From the upstream and downstream pressures I can calculate the pressure difference, and from that and the flow rate I can calculate the flow coefficient, Cv. LabVIEW is sampling 1,000 samples at 1,000 Hz (a sample every millisecond). The fluid is water at room temperature.

Problem: When I graph valve angle on the x-axis and Cv on the y-axis, it is obvious that I have errors. My upstream pressure sensor appears to drop to 0 at random while the valve is open. Both pressure sensors also fail to give a smooth transition or linear data, and my torque transducer shows huge spikes at random points. This can all be seen in the attached pic, and it troubles me because all the DAQ tools are calibrated to NIST standards. Does anyone know how to eliminate these fluctuations on the graphs?

Attempt at a Solution: My main goal is to have accurate position vs. Cv data. The only thing I can think of is to decrease the sampling rate to 1 data point every 0.1 seconds. I may also hold the valve every 10° to ensure the pressures have stabilized.

Thanks in advance!