Calculate mean/stddev on log and real scales

  • Thread starter gummz
  • Tags: Log
In summary, the logarithm of an object changes from day to day according to a normal distribution with median 0.00065 and standard deviation 0.0038, and the task is to find the mean and standard deviation after 249 days on both the real and logarithmic scales. The logarithmic scale means taking the natural log of the real-scale value. The final log after 249 days is the sum of the daily log changes, so its mean and standard deviation follow from the given daily values. The final value on the real scale is the product of the daily multiplicative factors, i.e. the exponential of the summed log changes. The term "lognormal distribution" describes the resulting real-scale distribution and is worth looking up.
  • #1
gummz

Homework Statement


The logarithm of an object changes from day to day according to a normal distribution with median 0.00065 and standard deviation 0.0038. Calculate the mean and standard deviation after 249 days on both the real and logarithmic scales.

Homework Equations


The standard formulas for the mean and standard deviation, as far as I am aware. This type of problem is unfamiliar to me and is not covered in my textbook.

The Attempt at a Solution


I know the formula for the real scale, but I don't know what they mean by a logarithmic scale. Do they mean to just take log() of the std dev for the real scale?
 
  • #2
gummz said:

I know the formula for the real scale, but I don't know what they mean by a logarithmic scale. Do they mean to just take log() of the std dev for the real scale?

If ##X## is the variable on the real scale and ##Y## is the variable on the logarithmic scale, they just mean that ##Y = \ln X## (assuming natural logs). Beyond that, the question is ambiguous because it is poorly worded. One interpretation is that ##Y_i## is the change of the log on day ##i##, so that the final log after ##N## days is ##W = \sum_{i=1}^N Y_i##. Then, of course, it matters whether the ##Y_i## are independent or not, and you did not say whether that is the case. Assuming they ARE independent, computing the mean and standard deviation of ##W## is straightforward, because you are told the mean and standard deviation of each ##Y_i## separately.

On the non-logarithmic scale, the final value after ##N## days is ##U = X_1 X_2 \cdots X_N = \prod_{i=1}^N e^{Y_i}##, because ##\ln U = W = \sum Y_i = \sum \ln(X_i)##. Remember: that is just my interpretation of a not-well-stated question.

As for the rest: Google 'lognormal distribution'.
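Below is a minimal simulation sketch (an editorial addition, not part of the original reply) following the independence interpretation above, with the values quoted in the problem statement: daily log-change median (equal to the mean, for a normal) 0.00065, standard deviation 0.0038, and 249 days. The number of simulated paths and the random seed are arbitrary choices. The "theory" lines use the standard facts that a sum of ##N## independent normals has mean ##N\mu## and standard deviation ##\sqrt{N}\,\sigma##, and that if ##W \sim N(m, s^2)## then ##e^W## is lognormal with mean ##e^{m + s^2/2}## and variance ##(e^{s^2}-1)e^{2m+s^2}##.

```python
import numpy as np

# Editorial sketch, not from the thread: simulate 249 days of independent
# normal daily log-changes and compare simulation with theory.
mu, sigma, n_days = 0.00065, 0.0038, 249   # values quoted in the problem
rng = np.random.default_rng(0)

# 50,000 simulated paths of daily log-changes Y_i ~ N(mu, sigma^2)
Y = rng.normal(mu, sigma, size=(50_000, n_days))
W = Y.sum(axis=1)    # final value on the log scale, W = sum of the Y_i
U = np.exp(W)        # final value on the real scale (lognormal)

m, s2 = n_days * mu, n_days * sigma**2
print("log scale : simulated mean/std =", W.mean(), W.std())
print("            theory             =", m, np.sqrt(s2))
print("real scale: simulated mean/std =", U.mean(), U.std())
print("            lognormal theory   =", np.exp(m + s2 / 2),
      np.sqrt((np.exp(s2) - 1) * np.exp(2 * m + s2)))
```

The simulated and theoretical values should agree closely, which is a quick way to check whichever formulas you end up using.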
 

1. What is the purpose of calculating mean and standard deviation on log and real scales?

Calculating the mean and standard deviation on both scales lets you summarize data sets that span a wide range of values. The log scale down-weights extreme values and describes multiplicative, skewed data more faithfully, while the real scale keeps the summary in the original units, making it easier to relate to the actual values and their spread.

2. How do you calculate the mean on log and real scales?

To calculate the mean on a log scale, take the log of each data point and average those log values; taking the antilog of that average converts it back to the original units, where it is known as the geometric mean. The mean on the real scale is the ordinary arithmetic mean: add up all the data points and divide by the number of data points.
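As a small illustration (an editorial addition using a made-up positive data set), the two means can be computed as follows:

```python
import numpy as np

# Made-up positive data set for illustration only.
x = np.array([1.2, 3.5, 0.8, 10.0, 2.2])

arithmetic_mean = x.mean()                 # mean on the real scale (= 3.54)
geometric_mean = np.exp(np.log(x).mean())  # antilog of the mean of the logs (about 2.36)

print("arithmetic mean:", arithmetic_mean)
print("geometric mean :", geometric_mean)
```

The geometric mean is smaller because the single large value (10.0) pulls the arithmetic mean up much more strongly than it pulls the average of the logs.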

3. What is the difference between mean and standard deviation on log and real scales?

The mean on the log scale summarizes the typical multiplicative size of the data (back-transformed, it is the geometric mean), while the mean on the real scale is the ordinary arithmetic average of the values themselves. Likewise, the standard deviation on the log scale measures relative, multiplicative spread around that typical value, while the standard deviation on the real scale measures absolute spread, in the original units, around the arithmetic mean.

4. How do you interpret the mean and standard deviation on log and real scales?

The mean on a log scale can be interpreted, after back-transforming, as the geometric mean of the data, which is a better summary of skewed data with extreme values. The standard deviation on a log scale back-transforms to the geometric standard deviation, a dimensionless factor describing multiplicative spread. The mean and standard deviation on a real scale are interpreted in the usual way, as the arithmetic centre of the data and the absolute spread around it.
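A brief sketch of that interpretation (again an editorial addition with made-up data; the roughly-two-thirds rule assumes the data are approximately lognormal):

```python
import numpy as np

# Made-up positive data set for illustration only.
x = np.array([1.2, 3.5, 0.8, 10.0, 2.2])

log_x = np.log(x)
geo_mean = np.exp(log_x.mean())      # geometric mean
geo_sd = np.exp(log_x.std(ddof=1))   # geometric standard deviation (multiplicative factor)

# For roughly lognormal data, about two thirds of values fall between
# geo_mean / geo_sd and geo_mean * geo_sd.
print("geometric mean:", geo_mean)
print("geometric sd  :", geo_sd)
print("1-sigma band  :", geo_mean / geo_sd, "to", geo_mean * geo_sd)
```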

5. When should you use mean and standard deviation on log and real scales?

Mean and standard deviation on log and real scales should be used when analyzing data sets with a wide range of values, especially when there are extreme values. These measures can provide a more accurate and comprehensive understanding of the data, allowing for more meaningful comparisons and conclusions to be drawn.
