I have yearly rainfall totals and want to estimate the rainfall amount with a 100-year return period under different assumed distributions. I know how to do this for specific cases such as the normal distribution, but that approach does not generalize to arbitrary PDFs.
A return period is a statistical measure used to estimate the likelihood of a certain event occurring. It represents the average time between occurrences of an event of a certain magnitude.
The return-period amount (the return level) is estimated by fitting a statistical distribution to the data. Concretely, an event with a T-year return period has an annual exceedance probability of 1/T, so the return level is the quantile of the fitted distribution at probability 1 − 1/T. This works for any distribution with an invertible CDF, which is what makes the method general.
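This quantile relationship can be sketched generically in Python with SciPy: any frozen distribution exposes `ppf` (the inverse CDF), so the same function works for every distribution family. The rainfall parameters below (mean 800 mm, standard deviation 150 mm) are hypothetical, purely for illustration.

```python
from scipy import stats

def return_level(dist, return_period):
    """Return level for a frozen scipy.stats distribution: the value
    exceeded on average once per `return_period` years."""
    # Annual exceedance probability p = 1 / T; the return level is
    # the (1 - p) quantile of the fitted distribution.
    p = 1.0 / return_period
    return dist.ppf(1.0 - p)

# Hypothetical example: yearly rainfall modelled as Normal(800 mm, 150 mm).
level_100 = return_level(stats.norm(loc=800, scale=150), 100)
print(level_100)  # roughly 800 + 2.33 * 150, i.e. about 1149 mm
```

Swapping in `stats.lognorm(...)`, `stats.gamma(...)`, or any other frozen distribution requires no change to `return_level`, which addresses the "not general for all pdfs" concern.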
The most commonly used distributions for estimating return period amount are the Normal distribution, the Lognormal distribution, and the Exponential distribution. Other distributions such as the Weibull and Gamma distributions can also be used depending on the type of data and the shape of the distribution.
Different distributions can affect the estimated return period amount by changing the shape of the probability curve. For example, the Normal distribution assumes a symmetrical curve, while the Lognormal distribution assumes a right-skewed curve. This can result in different probabilities and return period amounts for the same data.
The accuracy of the estimated return level can be improved by using robust statistical methods, such as Maximum Likelihood Estimation (MLE), and by incorporating more data points. It is also important to check that the chosen distribution actually fits the data, for example with a goodness-of-fit test, since using an ill-fitting distribution can lead to badly biased estimates, especially in the tail that return periods depend on.
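A simple way to compare candidate distributions is to fit each by MLE and score the fits with the Kolmogorov-Smirnov statistic (smaller is better). This is a rough screening tool, not a full model-selection procedure, and the synthetic data below are again only a stand-in for real rainfall records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
data = rng.gamma(shape=8.0, scale=100.0, size=300)  # hypothetical rainfall (mm)

candidates = {"norm": stats.norm, "lognorm": stats.lognorm, "gamma": stats.gamma}
results = {}
for name, dist in candidates.items():
    params = dist.fit(data)  # MLE fit of all distribution parameters
    # KS statistic measures the largest gap between empirical and fitted CDFs.
    results[name] = stats.kstest(data, name, args=params).statistic

best = min(results, key=results.get)
print(results, best)
```

Whichever distribution minimizes the KS statistic is the most defensible choice for the return-level calculation; for tail-sensitive estimates it is also worth eyeballing a Q-Q plot of the upper quantiles.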