Find Function/Transform for signal that minimizes CV of data

AI Thread Summary
The discussion focuses on finding a transformation function "F" for measured values in a dataset to minimize the coefficient of variation (CV) of calculated ratios (R) from .csv files. The goal is to achieve a CV of less than 3%, ideally under 1%, while ensuring that a plot of R values against derived values (VF) shows no discernible trend. Previous attempts, including a square root transformation, have not met the CV requirement, prompting the need for more complex, flexible nonlinear transformations. Suggestions include exploring various mathematical functions such as polynomial, logarithmic, and exponential forms that can provide tunable parameters. The problem emphasizes the importance of avoiding overfitting while ensuring the transformation generalizes well to larger datasets.
johnpjust
Warning...this requires scripting and iteration, and is not theoretical -- it is a real problem I haven't been able to solve, but I'm sure someone here can... :-)

Data: each .csv file is a test recorded at a sampling rate of 7.5 Hz, and each file has 3 columns. The first column is time in seconds, the second column is a multiplier (see formula below), and the third column is the measured value (to be "transformed"). There is also a corresponding value for each log in the "W.csv" file.

Formula (to produce a value for each file): R = [W_log_Val] / [#.log_val], where
  • W_log_Val is the corresponding value for that file, located in the "W.csv" file.
  • #.log_val = ∑ (col2_val) · F(col3_val), summed over the rows of the file.
  • F(col3_val) is the function/transformation of the measured value to be found.
Goal: A function/transform "F" that can be applied on the variable recorded in the third column ("col3_val") such that the coefficient of variation of all the "R" values is minimized. The CV should definitely be less than 3%, but I expect a good solution could easily make it less than 1%.

Additional Constraint: A plot of the R values against VF values should show no trend/pattern, where
  • VF = [#.log_val] / [Ts]
  • Ts = the total time for each log (i.e., the value in the last row of the first column for that log)
See an example of the trend when no transform is applied in attached jpg.

Example:
Applying a square-root transformation to the measured value "col3_val" helps significantly, but the CV is still around 8–9%, which does not meet the requirement.

See example of trend after applying sqrt in attached PDF
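For reference, here is a minimal sketch of how R, its CV, and VF could be computed for a candidate F (sqrt as the baseline mentioned above). The directory name, the assumption that the log files have no header row, and the two-column layout of W.csv (log name, W_log_Val) are guesses about the attached data and may need adjusting.

Python:
import glob
import os
import numpy as np
import pandas as pd

def cv_for_transform(F, data_dir="data"):
    """Coefficient of variation of the R values for a candidate transform F.

    Assumes each log is a 3-column CSV (time [s], multiplier, measured value)
    with no header row, and that W.csv maps each log's base name to its
    W_log_Val; adjust the parsing if the real files differ.
    """
    w = pd.read_csv(os.path.join(data_dir, "W.csv"), index_col=0).iloc[:, 0]
    R, VF = [], []
    for path in sorted(glob.glob(os.path.join(data_dir, "*.csv"))):
        name = os.path.splitext(os.path.basename(path))[0]
        if name == "W":
            continue
        log = pd.read_csv(path, header=None, names=["t", "mult", "x"])
        log_val = np.sum(log["mult"].to_numpy() * F(log["x"].to_numpy()))  # #.log_val
        R.append(w[name] / log_val)              # R = W_log_Val / #.log_val
        VF.append(log_val / log["t"].iloc[-1])   # VF = #.log_val / Ts
    R = np.asarray(R)
    return R.std(ddof=1) / R.mean(), R, np.asarray(VF)

# Baseline: the sqrt transform discussed above.
cv, R, VF = cv_for_transform(np.sqrt)
print(f"CV with sqrt transform: {cv:.2%}")

Plotting R against VF from this helper reproduces the trend check described in the constraint.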
 

Attachments

  • data.zip (71.7 KB)
  • Fit R by VF.jpg (7.6 KB)
  • sqrtPlot.pdf (34.3 KB)
Without additional constraints, the problem has a mathematical solution you clearly don't want: some weird jumping function that works exactly for the dataset used to produce it (and nothing else), based on tuning the function value at the specific col3_val values appearing in your dataset.

To generalize the sqrt attempt, you can take the nth power of the values and see which n works best (for real n).
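The exponent sweep can be automated with a one-dimensional search; a rough sketch, reusing the hypothetical cv_for_transform helper from the earlier sketch and with arbitrary bounds on n:

Python:
import numpy as np
from scipy.optimize import minimize_scalar

# Assumes cv_for_transform from the earlier sketch is defined.
def cv_for_power(n):
    cv, _, _ = cv_for_transform(lambda x: np.power(x, n))
    return cv

# One-dimensional search over real exponents; the bounds are a guess.
res = minimize_scalar(cv_for_power, bounds=(0.05, 2.0), method="bounded")
print(f"best n = {res.x:.3f}, CV = {res.fun:.2%}")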

Without any transform, the relation looks close to $$R \approx \frac{1}{VF} = \frac{Ts}{[\#.log\_val]}.$$ The dependence on #.log_val follows from its definition, $$R=\frac{[W\_log\_Val]}{[\#.log\_val]},$$ which is suspicious.
 
Thanks for your response.

I've tried the obvious options already (including a sweep over powers n). While I agree with your implicit point that one does not want to over-fit the data when generalizing the fit, I'm at the point where I need to step up to another level of complexity, because I know this same issue exists in a larger data set. I'm not sure whether that means a sigmoidal function or something similar, but my experience with the problem suggests that if someone knows of a more flexible nonlinear transformation (more tunable parameters that still produce monotonically increasing values post-transformation), a solution is possible that will generalize well to the larger data set.

Also, I do realize the stipulation/constraint I have looks somewhat suspicious at first, but it does have physical meaning, and the added "W_log_Val" term is what makes R an independent quantity (so it turns out OK).
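To make the "more tunable parameters, still monotone" idea concrete, one option is a piecewise-linear F whose knot values are a cumulative sum of positive increments, so it is increasing by construction. A rough sketch only: the knot count and placement are arbitrary choices here, and it reuses the cv_for_transform helper sketched earlier.

Python:
import numpy as np
from scipy.optimize import minimize

# Assumes cv_for_transform from the earlier sketch is defined.
# Knots should span the observed range of col3_val; placeholders here.
x_min, x_max = 0.0, 1.0   # replace with the actual min/max of col3_val
knots = np.linspace(x_min, x_max, 8)

def make_monotone_F(theta):
    """Piecewise-linear transform whose knot values are cumsum(exp(theta)),
    so F is increasing for any real parameter vector theta."""
    levels = np.cumsum(np.exp(theta))
    return lambda x: np.interp(x, knots, levels)

res = minimize(lambda th: cv_for_transform(make_monotone_F(th))[0],
               x0=np.zeros(len(knots)), method="Nelder-Mead")
print(f"CV = {res.fun:.2%}")

Adding more knots gives more flexibility, at the cost of a greater risk of over-fitting the smaller data set.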
 
- (x+c)^n
- log(x+c)
- e^(cx)
- atan(x/c + d)
- arbitrary, scaled sums of the options above (including n=0 which gives a constant)
- sine, cosine, if there is a motivation to expect oscillations
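As a sketch of fitting a scaled sum of a couple of these forms, here is one possible setup; the particular combination a·(x+c)^n + b·log(x+c) and the exp-based positivity trick are illustrative choices only, again reusing the cv_for_transform helper from earlier.

Python:
import numpy as np
from scipy.optimize import minimize

# Assumes cv_for_transform from the earlier sketch is defined.
# F(x) = a*(x + c)**n + b*log(x + c); a, b, c are kept positive via exp, so the
# transform stays defined and is monotonically increasing as long as n > 0.
def make_F(params):
    log_a, log_b, log_c, n = params
    a, b, c = np.exp([log_a, log_b, log_c])
    return lambda x: a * (x + c) ** n + b * np.log(x + c)

res = minimize(lambda p: cv_for_transform(make_F(p))[0],
               x0=np.array([0.0, 0.0, 0.0, 0.5]), method="Nelder-Mead")
print(f"CV = {res.fun:.2%}, params = {res.x}")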
 