pamparana
Hello everyone,
I have a bit of an issue regarding scaling of an expression. So, the scenario is as follows.
I have a confidence value that can be associated with the solution given by an optimization routine and it is as follows:
C = exp(-A)/(exp(-A) + exp(-B))
where A and B are energy values returned by the optimization routine and C is the confidence, or probability, assigned to solution A. Also, B is always greater than A.
After some simple manipulation, the expression becomes:
C = 1.0 / (1 + exp(A-B))
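In case it helps to see it concretely, here is a minimal sketch of this logistic form (the function name is mine):

```python
import math

def confidence(A, B):
    # C = exp(-A) / (exp(-A) + exp(-B)), rewritten as a logistic in (A - B).
    # Since B > A, the exponent A - B is negative, so C lies in (0.5, 1).
    # This form also avoids the underflow you would get from computing
    # exp(-A) directly when A is in the tens of thousands.
    return 1.0 / (1.0 + math.exp(A - B))
```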
Now, in the beginning my issue was that the values A and B were usually quite large (in the tens of thousands). So this expression was giving values of 0.5 when A and B were very close, and when the difference was a bit larger (in absolute terms), the expression would basically become 1.
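To make the saturation concrete (the specific numbers below are invented, but in the stated range):

```python
import math

def confidence(A, B):
    return 1.0 / (1.0 + math.exp(A - B))

# With energies in the tens of thousands, a tiny gap leaves C near 0.5,
# while a moderate gap already saturates to exactly 1.0 in double precision,
# because exp(-(B - A)) drops below machine epsilon.
near = confidence(10000.0, 10000.05)  # gap of 0.05 -> C just above 0.5
far = confidence(10000.0, 10040.0)    # gap of 40   -> C rounds to 1.0
```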
So, I realized I needed to do some normalization, and the first thing I tried was to divide the energies by A. The expression then becomes:
C = 1.0 / (1 + exp(1-B/A))
Now, typically B/A is somewhere between 1 and 1.01. So I have the opposite problem: exp(1-B/A) is basically 1, and C is basically 0.5.
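That collapse is easy to verify: even at the most extreme typical ratio, the exponent is only -0.01, so (illustrative numbers):

```python
import math

# After dividing the exponents by A, the exponent 1 - B/A lies in [-0.01, 0]
# for B/A between 1 and 1.01, so exp(1 - B/A) is essentially 1 and C ~ 0.5.
ratio = 1.01                             # the largest typical B/A
C = 1.0 / (1.0 + math.exp(1.0 - ratio))  # barely above 0.5
```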
So, what I would like to do is introduce some scaling or normalization into this expression that would let it capture the changes in my data range. I would be grateful for any suggestions anyone might have.
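For what it is worth, one common way to do this kind of scaling is to divide the gap A - B by a temperature-like constant T chosen to match the typical spread of the gaps in the data. This is only a sketch of one option; T is a parameter I am introducing, not something from the original formulation:

```python
import math

def scaled_confidence(A, B, T):
    # T is a hypothetical scale ("temperature") parameter: pick it close to
    # the typical size of B - A in your data so the exponent (A - B) / T
    # lands in a range where the logistic curve is not saturated.
    return 1.0 / (1.0 + math.exp((A - B) / T))

# If the gaps B - A are typically around 50, choosing T = 50 maps a typical
# gap to an exponent of -1, well inside the responsive part of the curve.
```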
Thanks,
Luca