Watts
The following is a crude derivation demonstrating how a distribution such as the normal distribution is simply one member
of a family of similar distributions. I originally planned to post this in the new Independent Research forum, but the
moderator thought it was better suited here. I am looking for feedback and thoughts from readers of this forum, and I
would like to find someone to coauthor a paper on this subject. My technical writing skills aren't great, and I don't
claim to be a professional mathematician, only a recreational one; I do this for fun. I am not an academic or a
professor, so I am not under pressure to publish papers routinely, and I have much more work done than what I show here.
A second goal is to show how several common distributions can be manipulated into a form parameterized by the mean and
variance. Very few distributions are naturally expressed in terms of the mean and variance; most are written with
generic adjustment constants instead. From a practical perspective this is inconvenient, because generic constants
require a certain amount of trial and error to fit a distribution to a data set, whereas distributions such as the
normal distribution require only the mean and variance, which are easily obtained from the data.
Derivation of Normal Intrinsic Distribution
From the equation
[itex]\frac{1}{{\sqrt {2 \cdot \pi } \cdot \sigma }} = \sqrt {\frac{1}{{2 \cdot \pi \cdot \sigma ^2 }}} [/itex]
the normal distribution can be written in the form given here.
[itex]P(q) = \frac{1}{{\sqrt {2 \cdot \pi } \cdot \sigma }} \cdot e^{ - \frac{1}{{2 \cdot \sigma ^2 }} \cdot (q - \mu )^2 }
= \sqrt {\frac{1}{{2 \cdot \pi \cdot \sigma ^2 }}} \cdot e^{ - \pi \cdot \left( {\sqrt {\frac{1}{{2 \cdot \pi \cdot
\sigma ^2 }}} } \right)^2 \cdot (q - \mu )^2 } [/itex]
By letting
[itex]P_{q_1 } = \sqrt {\frac{1}{{2 \cdot \pi \cdot \sigma ^2 }}} [/itex]
the normal distribution takes the form.
[itex]P(q) = P_{q_1 } \cdot e^{ - \pi \cdot \left(P_{q_1 }\right)^{2} \cdot (q - \mu )^2 } [/itex]
Using the integral given here
[itex]\alpha = \int\limits_{ - \infty }^\infty {e^{ - x^{2 \cdot k} } dx} = \frac{1}{k} \cdot \Gamma (\frac{1}{{2 \cdot
k}}),k = 1,2,3,...,\infty[/itex]
and evaluating it at k = 1 produces the integral
[itex]\int\limits_{ - \infty }^\infty {e^{ - q^2 } dq} = \sqrt \pi [/itex]
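For anyone who wants to sanity-check this family of integrals numerically, here is a short Python sketch (the function name alpha is mine, and it assumes SciPy is available):

```python
from math import gamma, exp
from scipy.integrate import quad

def alpha(k):
    """Closed form: alpha = (1/k) * Gamma(1/(2k))."""
    return gamma(1.0 / (2.0 * k)) / k

# Compare the closed form against direct numerical integration
# of exp(-x^(2k)) over the whole real line, for several k.
for k in (1, 2, 3):
    numeric, _ = quad(lambda x, k=k: exp(-x ** (2 * k)),
                      -float("inf"), float("inf"))
    assert abs(numeric - alpha(k)) < 1e-8
```

For k = 1 this recovers alpha(1) = Γ(1/2) = √π, the Gaussian integral quoted above.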
The relationship between the constant π and this integral can now be seen:
[itex]P(q) = P_{q_1 } \cdot e^{ - (\sqrt \pi \cdot P_{q_1 } )^2 \cdot (q - \mu )^2 } = P_{q_1 } \cdot e^{ - \pi
\cdot\left(P_{q_1}\right)^{2} \cdot (q - \mu )^2 } [/itex]
Multiplying both sides of the unevaluated version of the integral by N and substituting the solution,
[itex]N\cdot \alpha = N\cdot \int\limits_{ - \infty }^\infty {e^{ - x^{2 \cdot k} } dx} = N\cdot \frac{1}{k} \cdot \Gamma
(\frac{1}{{2 \cdot k}}),k = 1,2,3,...,\infty[/itex]
produces the equation
[itex]P(q) = P_q \cdot e^{ - \left[ {\frac{1}{k} \cdot \Gamma (\frac{1}{{2 \cdot k}}) \cdot N \cdot P_q (q - \mu )}
\right]^{2 \cdot k} } ,k = 1,2,3,...,\infty . [/itex]
For the case N=1 and [itex]k = 1,2,3,...,\infty[/itex]
[itex]\int\limits_{ - \infty }^\infty {P(q)dq = 1} [/itex]
and for the case N=2 and [itex]k = 1,2,3,...,\infty[/itex]
[itex]\int\limits_{ - \infty }^\infty {P(q)dq = \frac{1}{2}} [/itex]
hence the Intrinsic Distribution can be written, for all cases of [itex]N=1,2,3,...,\infty[/itex], as
[itex]P(q) = \sum\limits_{i = 1}^N {P_{q_i } \cdot e^{ - \left[ {\frac{1}{k} \cdot \Gamma (\frac{1}{{2 \cdot k}}) \cdot N
\cdot P_{q_i } (q - q_i )} \right]^{2 \cdot k} } }= \sum\limits_{i = 1}^N {P_{q_i } \cdot e^{ - \left[ {\alpha \cdot N
\cdot P_{q_i } \cdot \left( {q - q_i } \right)} \right]^{2 \cdot k} } } ,\quad k = 1,2,3,...,\infty ,\quad
\int\limits_{ - \infty }^\infty {P(q)dq} = 1 [/itex]
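The normalization claim can be checked numerically. Below is a Python sketch (the names intrinsic_pdf and peaks are my own, and SciPy is assumed) that evaluates the Intrinsic Distribution for a hypothetical bimodal case N = 2, k = 1 and confirms it integrates to 1:

```python
from math import gamma, exp
from scipy.integrate import quad

def alpha(k):
    # alpha = (1/k) * Gamma(1/(2k)), the integral constant from above
    return gamma(1.0 / (2.0 * k)) / k

def intrinsic_pdf(q, peaks, k):
    """peaks is a list of (P_qi, q_i) pairs; N = len(peaks)."""
    N = len(peaks)
    a = alpha(k)
    return sum(P * exp(-((a * N * P * (q - qi)) ** (2 * k)))
               for P, qi in peaks)

# Hypothetical example values: two peaks at q = -2 and q = 3.
peaks = [(0.8, -2.0), (1.5, 3.0)]
# Passing the peak locations as break points helps quad find the
# narrow bumps; each of the N terms contributes 1/N to the total.
total, _ = quad(lambda q: intrinsic_pdf(q, peaks, 1),
                -20.0, 20.0, points=[-2.0, 3.0])
assert abs(total - 1.0) < 1e-8
```

The substitution u = αN P_i (q − q_i) shows why this works: each term integrates to α/(αN) = 1/N, so the N-term sum integrates to 1 for every k.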
Derivation of Normal Distribution
For the case N=1 and k=1,
[itex]P(q) = \sum\limits_{i = 1}^N {P_{q_i } \cdot e^{ - \left[ {\alpha \cdot N \cdot P_{q_i } \cdot \left( {q - q_i }
\right)} \right]^{2 \cdot k} } } ,k = 1,2,3,...,\infty [/itex].
the equation reduces to
[itex]P(q) = P_{q_1 } \cdot e^{ - \left(\alpha \cdot P_{q_1 }\right)^{2} \cdot (q - q_1 )^2 } [/itex]
Using the integral
[itex]\alpha = \int\limits_{ - \infty }^\infty {e^{ - x^{2 \cdot k} } dx} = \frac{1}{k} \cdot \Gamma (\frac{1}{{2 \cdot
k}}),k = 1,2,3,...,\infty [/itex]
the equation takes the form.
[itex] P(q) = P_{q_1 } \cdot e^{ - \pi \cdot \left(P_{q_1 }\right)^{2} \cdot (q - q_1 )^2 }[/itex]
Using the equation
[itex]\mu = \int\limits_{ - \infty }^\infty {P(q) \cdot q \cdot dq = q_1 } [/itex]
it can be seen that
[itex]P(q) = P_{q_1 } \cdot e^{ - \pi \cdot \left(P_{q_1 }\right)^{2} \cdot (q - \mu )^2 } [/itex]
and from
[itex]\sigma ^2 = \int\limits_{ - \infty }^\infty {P(q) \cdot (q - \mu )^2 \cdot dq} [/itex]
the following table is generated.
[itex]\begin{array}{*{20}c}
{P_{q_1 } = 1} & {\sigma ^2 = \frac{1}{{2 \cdot \pi }}} & {\sigma ^2 = \frac{1}{{2 \cdot 1^2 \cdot \pi }}} \\
{P_{q_1 } = 2} & {\sigma ^2 = \frac{1}{{8 \cdot \pi }}} & {\sigma ^2 = \frac{1}{{2 \cdot 2^2 \cdot \pi }}} \\
{P_{q_1 } = 3} & {\sigma ^2 = \frac{1}{{18 \cdot \pi }}} & {\sigma ^2 = \frac{1}{{2 \cdot 3^2 \cdot \pi }}} \\
{P_{q_1 } = 4} & {\sigma ^2 = \frac{1}{{32 \cdot \pi }}} & {\sigma ^2 = \frac{1}{{2 \cdot 4^2 \cdot \pi }}} \\
{P_{q_1 } = P_{q_1 } } & \Rightarrow & {\sigma ^2 = \frac{1}{{2 \cdot P_{q_1 } ^2 \cdot \pi }}} \\
\end{array} [/itex]
From this table the equation is found.
[itex]\sigma ^2 = \frac{1}{{2 \cdot \pi \cdot (P_{q_1 } )^2 }} [/itex]
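The table relationship is easy to reproduce numerically; here is a Python sketch (SciPy assumed, function name mine) that recomputes the variance for each row:

```python
from math import pi, exp
from scipy.integrate import quad

def pdf(q, P1, q1=0.0):
    """P(q) = P1 * exp(-pi * P1^2 * (q - q1)^2), the k = 1, N = 1 form."""
    return P1 * exp(-pi * P1 ** 2 * (q - q1) ** 2)

# Reproduce the table: sigma^2 = 1/(2*pi*P1^2) for P1 = 1, 2, 3, 4.
for P1 in (1.0, 2.0, 3.0, 4.0):
    var, _ = quad(lambda q, P1=P1: q ** 2 * pdf(q, P1), -10.0, 10.0)
    assert abs(var - 1.0 / (2.0 * pi * P1 ** 2)) < 1e-10
```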
Solving this equation for [itex]P_{q_1 }[/itex] gives
[itex]P_{q_1 } = \sqrt {\frac{1}{{2 \cdot \pi \cdot \sigma ^2 }}} = \frac{1}{{\sqrt {2 \cdot \pi } \cdot \sigma }}
[/itex]
and substituting into the equation
[itex] P(q) = P_{q_1 } \cdot e^{ - \pi \cdot \left(P_{q_1 }\right)^{2} \cdot (q - \mu )^2 } [/itex]
produces the normal distribution.
[itex]P(q) = \frac{1}{{\sqrt {2 \cdot \pi } \cdot \sigma }} \cdot e^{ - \frac{1}{{2 \cdot \sigma ^2 }} \cdot \left( {q -
\mu } \right){}^2} [/itex]
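As a final check that the derived form really is the normal distribution, this Python sketch (names mine, SciPy assumed) confirms it integrates to 1 with mean μ and variance σ²:

```python
from math import pi, sqrt, exp
from scipy.integrate import quad

def normal_pdf(q, mu, sigma):
    """Derived form: P(q) = P1 * exp(-pi * P1^2 * (q - mu)^2),
    with P1 = 1/(sqrt(2*pi)*sigma); note pi * P1^2 = 1/(2*sigma^2)."""
    P1 = 1.0 / (sqrt(2.0 * pi) * sigma)
    return P1 * exp(-pi * P1 ** 2 * (q - mu) ** 2)

mu, sigma = 1.5, 0.7  # hypothetical example values
total, _ = quad(lambda q: normal_pdf(q, mu, sigma), -20.0, 20.0)
mean, _ = quad(lambda q: q * normal_pdf(q, mu, sigma), -20.0, 20.0)
var, _ = quad(lambda q: (q - mu) ** 2 * normal_pdf(q, mu, sigma), -20.0, 20.0)
assert abs(total - 1.0) < 1e-8
assert abs(mean - mu) < 1e-8
assert abs(var - sigma ** 2) < 1e-8
```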
Multiple distributions can be generated in a like manner from multiple values of k. You can also generate multimodal
distributions from different values of N; for example, for N=2 and k=1 a true bimodal normal distribution can be generated.
You are not restricted to Gaussian distributions only. By using the same methods as above you can generate several
different types of Intrinsic Distribution. An example shown here is the Cauchy Intrinsic Distribution
[itex]P(q) = \sum\limits_{i = 1}^N {\frac{{P_{q_i } }}{{((\left( {q - q_i } \right) \cdot P_{q_i } \cdot N \cdot \varsigma
)^{2 \cdot k} + 1)^k }}} ,\varsigma = \int\limits_{ - \infty }^\infty {(\frac{1}{{(x^{2 \cdot k} + 1)^k }})} dx =
\frac{1}{k} \cdot \beta (\frac{1}{{2 \cdot k}},\frac{{2 \cdot k^2 - 1}}{{2 \cdot k}}),k = 1,2,3,...,\infty [/itex]
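The normalizing constant ς can be verified against direct integration with another Python sketch (SciPy assumed; beta here is the Beta function built from Γ):

```python
from math import gamma
from scipy.integrate import quad

def beta(a, b):
    # Beta function via Gamma: B(a, b) = Gamma(a)*Gamma(b)/Gamma(a+b)
    return gamma(a) * gamma(b) / gamma(a + b)

def varsigma(k):
    # Closed form: varsigma = (1/k) * B(1/(2k), (2k^2 - 1)/(2k))
    return beta(1.0 / (2.0 * k), (2.0 * k ** 2 - 1.0) / (2.0 * k)) / k

# Compare against direct numerical integration of 1/(x^(2k)+1)^k;
# k = 1 gives the familiar Cauchy normalizer varsigma = pi.
for k in (1, 2, 3):
    numeric, _ = quad(lambda x, k=k: 1.0 / (x ** (2 * k) + 1.0) ** k,
                      -float("inf"), float("inf"))
    assert abs(numeric - varsigma(k)) < 1e-6
```

The same substitution argument as in the Gaussian case shows each of the N Cauchy terms contributes 1/N, so the sum is again normalized for every k.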
Many distributions are incomplete in this sense and can be completed by putting them into a form that contains
measurable quantities such as the mean and variance. The same method shown above can be used to complete many different
distributions, such as the logistic distribution shown here. The Logistic Distribution typically takes the form
[itex]P(q) = \frac{{e^{ - (q - m)/b} }}{{b \cdot \left[ {1 + e^{ - (q - m)/b} } \right]^2 }} [/itex]
and its distribution function
[itex]D(q) = \frac{1}{{1 + e^{ - (q - m)/b} }} [/itex]
The complete form is
[itex]P(q) = 4 \cdot \sqrt {\frac{{\pi ^2 }}{{\sigma ^2 \cdot 48}}} \cdot \left( {\frac{{e^{\left( {\left( {q - \mu }
\right) \cdot 4 \cdot \sqrt {\frac{{\pi ^2 }}{{\sigma ^2 \cdot 48}}} } \right)} }}{{\left( {1 + e^{\left( {\left( {q - \mu }
\right) \cdot 4 \cdot \sqrt {\frac{{\pi ^2 }}{{\sigma ^2 \cdot 48}}} } \right)} } \right)^2 }}} \right) [/itex]
and its complete distribution function is
[itex] D(q) = 1 - \frac{1}{{\left( {1 + e^{\left( {\frac{1}{3} \cdot \left( {q - \mu } \right) \cdot \sqrt 3
\cdot \pi \cdot \sqrt {\frac{1}{{\sigma ^2 }}} } \right)} } \right)}}[/itex]
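To check that the completed logistic form really has mean μ and variance σ², here is one more Python sketch (function name mine, SciPy assumed):

```python
from math import pi, sqrt, exp
from scipy.integrate import quad

def logistic_pdf(q, mu, sigma):
    """Completed logistic form with c = 4*sqrt(pi^2/(48*sigma^2)),
    which simplifies to c = pi/(sqrt(3)*sigma)."""
    c = 4.0 * sqrt(pi ** 2 / (48.0 * sigma ** 2))
    u = exp(c * (q - mu))
    return c * u / (1.0 + u) ** 2

mu, sigma = -1.0, 2.0  # hypothetical example values
total, _ = quad(lambda q: logistic_pdf(q, mu, sigma), -60.0, 60.0)
var, _ = quad(lambda q: (q - mu) ** 2 * logistic_pdf(q, mu, sigma),
              -60.0, 60.0)
assert abs(total - 1.0) < 1e-6
assert abs(var - sigma ** 2) < 1e-6
```

This matches the standard result that a logistic distribution with scale b has variance π²b²/3, here with b = √3·σ/π.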
Thousands of distributions that have never been seen or studied before can be generated using techniques similar to
those shown above. One such example is the Hyperbolic Distribution below.
[itex]P(q) ={\sqrt {\frac{{\pi ^2 }}{{48 \cdot \sigma ^2 }}} } \cdot \cosh \left( {2 \cdot \sqrt {\frac{{\pi ^2 }}{{48 \cdot
\sigma ^2 }}} \cdot (q - \mu )} \right)^{ - 2} [/itex]
Multivariate Intrinsic Distributions are also achievable. The Multivariate Cauchy Distribution is shown below.
[itex]P(q_1 ,q_2 ,q_3 ,...,q_m ) = \prod\limits_{j = 1}^m {\sum\limits_{i = 1}^N {\frac{{P_{q_{i,j} }
}}{{((N^{\frac{1}{m}} \cdot \varsigma \cdot (q_j - q_{i,j} ) \cdot P_{q_{i,j} } )^{2 \cdot k} + 1)^k }}}
},k = 1,2,3,...,\infty [/itex]
[itex] \varsigma = \int\limits_{ - \infty }^\infty {(\frac{1}{{(x^{2 \cdot k} + 1)^k }})} dx = \frac{1}{k} \cdot \beta
(\frac{1}{{2 \cdot k}},\frac{{2 \cdot k^2 - 1}}{{2 \cdot k}}),k = 1,2,3,...,\infty [/itex]