Standard deviation change problem

tzx9633

Homework Statement


A supplier converts the weights on the cement packages she sends out from ounces to kilograms (1 kg = 35.27 oz).

How does this affect the standard deviation of the weights?

Homework Equations

The Attempt at a Solution


My answer is that the standard deviation remains constant, because standard deviation is used to measure how 'spread out' the data is from the mean. Since all the values will now decrease by a factor of 35.27, the standard deviation stays the same. Correct me if I am wrong.
 
tzx9633 said:

Homework Statement


A supplier converts the weights on the cement packages she sends out from ounces to kilograms (1 kg = 35.27 oz).

How does this affect the standard deviation of the weights?

Homework Equations

The Attempt at a Solution


My answer is that the standard deviation remains constant, because standard deviation is used to measure how 'spread out' the data is from the mean. Since all the values will now decrease by a factor of 35.27, the standard deviation stays the same.
Your reasoning is at odds with your conclusion. Since all the values are changing, the mean will be different. Keep in mind that the units of the mean and standard deviation are the same as those of the data, so if the units change, the values of both the mean and standard deviation will change as well.
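As a quick made-up illustration of that point (the two weights here are invented for the example): packages weighing ##70.54## oz and ##141.08## oz have a mean of ##105.81## oz and sit ##\pm 35.27## oz away from it; in kilograms the same packages weigh ##2## kg and ##4## kg, with a mean of ##3## kg and deviations of only ##\pm 1## kg. Both the mean and the spread carry the new units.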
tzx9633 said:
Correct me if I am wrong.
 
Your argument is completely heuristic and unfortunately wrong. I suggest you look at the definition of the standard deviation and think about what would happen if all your numbers were divided by 35.27.
 
Orodruin said:
Your argument is completely heuristic and unfortunately wrong. I suggest you look at the definition of the standard deviation and think about what would happen if all your numbers were divided by 35.27.
In statistics, the standard deviation (SD, also represented by the Greek letter sigma σ or the Latin letter s) is a measure that is used to quantify the amount of variation or dispersion of a set of data values.[1] A low standard deviation indicates that the data points tend to be close to the mean (also called the expected value) of the set, while a high standard deviation indicates that the data points are spread out over a wider range of values.

So, when all the values are divided by 35.27, the magnitude of the measure of dispersion is still the same, right?
 
tzx9633 said:
In statistics, the standard deviation (SD, also represented by the Greek letter sigma σ or the Latin letter s) is a measure that is used to quantify the amount of variation or dispersion of a set of data values.[1] A low standard deviation indicates that the data points tend to be close to the mean (also called the expected value) of the set, while a high standard deviation indicates that the data points are spread out over a wider range of values.

So, when all the values are divided by 35.27, the magnitude of the measure of dispersion is still the same, right?
I am talking about the mathematical definition of standard deviation. You will not get very far with a verbal description from Wikipedia.
 
Orodruin said:
I am talking about the mathematical definition of standard deviation. You will not get very far with a verbal description from Wikipedia.
It is a measure that is used to quantify the amount of variation or dispersion of a set of data values. So, is my concept correct?
So, when all the values are divided by 35.27, the magnitude of the measure of dispersion is still the same, right?
 
tzx9633 said:
So, is my concept correct?
So, when all the values are divided by 35.27, the magnitude of the measure of dispersion is still the same, right?
No, you are wrong, and you will continue to be wrong until you make the effort to look up the actual definition of standard deviation and think about how it changes when you divide all the values.
 
Orodruin said:
No, you are wrong, and you will continue to be wrong until you make the effort to look up the actual definition of standard deviation and think about how it changes when you divide all the values.
What is the actual definition of standard deviation? Isn't it that it is a measure used to quantify the amount of variation or dispersion of a set of data values?

Can you tell me why my idea is wrong?
 
tzx9633 said:
Isn't it that it is a measure used to quantify the amount of variation or dispersion of a set of data values?

Can you tell me why my idea is wrong?
That is just a verbal description that in no way tells you how the standard deviation is constructed. You need the actual definition, which you should be able to find later on the Wikipedia page.
 
  • #10
Also, consider the following: What would be the dispersion of the values if you multiplied everything by zero? By your reasoning, it should remain the same.
 
  • #11
Orodruin said:
Also, consider the following: What would be the dispersion of the values if you multiplied everything by zero? By your reasoning, it should remain the same.
Multiplying by 0? All values will become 0, and so will the standard deviation. What are you trying to say here?
 
  • #12
tzx9633 said:
Multiplying by 0? All values will become 0, and so will the standard deviation. What are you trying to say here?
By your argumentation, the standard deviation should be the same since it does not matter if you multiply the values by a number. Therefore, clearly, in the limit of dividing by a big number, your argument for the standard deviation remaining constant fails.
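To make the limiting argument concrete with some made-up numbers (chosen only for illustration): the values ##\{10, 20, 30\}## sit ##\pm 10## on either side of their mean ##20##; divided by ##1000## they become ##\{0.01, 0.02, 0.03\}##, sitting only ##\pm 0.01## on either side of the new mean ##0.02##. The dispersion clearly shrinks by the same factor of ##1000##.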
 
  • #13
Orodruin said:
By your argumentation, the standard deviation should be the same since it does not matter if you multiply the values by a number. Therefore, clearly, in the limit of dividing by a big number, your argument for the standard deviation remaining constant fails.
Well, I can understand this, but I am still not convinced that when all the values in the data set change, the standard deviation changes too. Can you provide more examples?
 
  • #14
Have you looked up the actual definition of the standard deviation yet? Until you have done that, there is really nothing to discuss.
 
  • #15
Orodruin said:
Have you looked up the actual definition of the standard deviation yet? Until you have done that, there is really nothing to discuss.
The Standard Deviation is a measure of how spread out numbers are.
 
  • #16
tzx9633 said:
The Standard Deviation is a measure of how spread out numbers are.
You are not going to get anywhere just repeating the same thing. Please find and quote the actual definition. Until you do, I will not answer again.
 
  • #17
Orodruin said:
You are not going to get anywhere just repeating the same thing. Please find and quote the actual definition. Until you do, I will not answer again.
I really have no idea how to get it. Can you give some hints on where to find the actual definition of standard deviation?
 
  • #18
tzx9633 said:
The Standard Deviation is a measure of how spread out numbers are.
That merely describes standard deviation, but isn't the definition of it. The mean is a measure of where the "middle" of a set of numbers is, but that only describes the mean; it doesn't define it. By definition, the mean of a sample of size N, usually denoted ##\bar x##, is defined to be ##\frac 1 N \sum_{i = 1}^N x_i##.
If you are taking a class in statistics, your textbook will have a definition for both the population standard deviation and the sample standard deviation. If your class isn't statistics, do a web search for "standard deviation."
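For reference, the two definitions being pointed to are usually written as follows, with ##\mu## the population mean and ##\bar x## the sample mean:
$$
\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^N (x_i - \mu)^2} \quad\text{(population)}, \qquad
s = \sqrt{\frac{1}{N-1}\sum_{i=1}^N (x_i - \bar x)^2} \quad\text{(sample)}.
$$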
 
  • #19
tzx9633 said:
I really have no idea how to get it. Can you give some hints on where to find the actual definition of standard deviation?

Do you honestly need to be told the following?
(1) Look in your textbook.
(2) Google "standard deviation".
 
  • #20
Mark44 said:
That merely describes standard deviation, but isn't the definition of it. The mean is a measure of where the "middle" of a set of numbers is, but that only describes the mean; it doesn't define it. By definition, the mean of a sample of size N, usually denoted ##\bar x##, is defined to be ##\frac 1 N \sum_{i = 1}^N x_i##.
If you are taking a class in statistics, your textbook will have a definition for both the population standard deviation and the sample standard deviation. If your class isn't statistics, do a web search for "standard deviation."
I believe that this question does not deal with a sample as much as a distribution. However, the issue is still the same: the expectation value of a distribution ##X## is defined as
$$
E[X] = \sum_i X_i P(X_i)
$$
where ##X_i## are the possible outcomes and ##P(X_i)## is the probability of the outcome ##X_i##.
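The corresponding definitions of the variance and standard deviation of ##X## are
$$
\operatorname{Var}[X] = E\left[(X - E[X])^2\right], \qquad \sigma_X = \sqrt{\operatorname{Var}[X]},
$$
and for any constant ##c## one has ##E[cX] = c\,E[X]##, ##\operatorname{Var}[cX] = c^2\operatorname{Var}[X]##, and hence ##\sigma_{cX} = |c|\,\sigma_X##, which is the scaling behaviour at issue in this thread.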
 
  • #21
Ray Vickson said:
Do you honestly need to be told the following?
(1) Look in your textbook.
(2) Google "standard deviation".
It is not even that hard; it is only a matter of reading further than the lead paragraph on the Wikipedia page he has already found...
 
  • #22
Orodruin said:
I believe that this question does not deal with a sample as much as a distribution. However, the issue is still the same: the expectation value of a distribution ##X## is defined as
$$
E[X] = \sum_i X_i P(X_i)
$$
where ##X_i## are the possible outcomes and ##P(X_i)## is the probability of the outcome ##X_i##.
Yes, I think you're right. However, as the OP didn't seem to understand the basics of standard deviations, I wasn't sure he/she would understand the concepts of expectation of a random variable (E[X]) or its variance, Var[X].
 
  • #23
tzx9633 said:
I really have no idea how to get it. Can you give some hints on where to find the actual definition of standard deviation?
Really? You can't type "standard deviation" into your web browser? It should take you no more than a minute to find the mathematical definition of standard deviation. Whatever...
Consider N samples and let each sample value be represented by ##x_i##, where i is an index into the sequence of values. The mean of the N samples is $$\bar x =\frac {1}{N}\sum_{i=1}^N x_i$$
and the standard deviation is $$\sigma=\sqrt {\frac {\sum_{i=1}^N (x_i - \bar x)^2}{N-1}} $$
Now do an experiment. Make up ten values and find the mean and standard deviation. Now divide all the values by ten and find the mean and standard deviation. Math requires calculation, not wishful conjecture.
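A minimal Python sketch of that experiment, using ten made-up weights and the problem's conversion factor of 35.27 instead of ten (the weights below are arbitrary, chosen only for illustration):

```python
import statistics

# Ten made-up package weights in ounces (arbitrary illustrative values).
weights_oz = [32.1, 35.0, 33.7, 36.2, 34.8, 31.9, 35.5, 33.0, 34.1, 36.0]

# Mean and sample standard deviation (the N-1 formula quoted above).
mean_oz = statistics.mean(weights_oz)
sd_oz = statistics.stdev(weights_oz)

# Convert every value from ounces to kilograms and recompute.
weights_kg = [w / 35.27 for w in weights_oz]
mean_kg = statistics.mean(weights_kg)
sd_kg = statistics.stdev(weights_kg)

print(f"ounces:    mean = {mean_oz:.3f} oz, sd = {sd_oz:.3f} oz")
print(f"kilograms: mean = {mean_kg:.4f} kg, sd = {sd_kg:.4f} kg")
print(f"sd ratio:  {sd_oz / sd_kg:.2f}")  # comes out to about 35.27
```

Running it shows that both the mean and the standard deviation come out 35.27 times smaller in kilograms.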
 
  • #24
Mark44 said:
Yes, I think you're right. However, as the OP didn't seem to understand the basics of standard deviations, I wasn't sure he/she would understand the concepts of expectation of a random variable (E[X]) or its variance, Var[X].
Agreed, which is the root of the problem here anyway, as (s)he is trying to solve a problem involving the basics of standard deviations ...

Fred Wright said:
Really? You can't type "standard deviation" into your web browser? It should take you no more than a minute to find the mathematical definition of standard deviation. Whatever...
Consider N samples and let each sample value be represented by ##x_i##, where i is an index into the sequence of values. The mean of the N samples is $$\bar x =\frac {1}{N}\sum_{i=1}^N x_i$$
and the standard deviation is $$\sigma=\sqrt {\frac {\sum_{i=1}^N (x_i - \bar x)^2}{N-1}} $$
Now do an experiment. Make up ten values and find the mean and standard deviation. Now divide all the values by ten and find the mean and standard deviation. Math requires calculation, not wishful conjecture.
Again, that is the estimate of the standard deviation of a sample. I still believe that the question calls for the standard deviation of a statistical distribution. However, as already stated, they are related and for this problem should be equivalent.
 
  • #25
Assume your initial mean is ##\mu## and one of the values is ##x##. Then the difference between the two is ##\mu - x##.
Now, multiply everything by ##35.27##. Then the difference between the two is ##35.27\mu - 35.27x = 35.27(\mu - x) > \mu - x##...
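The original conversion goes the other way, dividing by 35.27 to go from ounces to kilograms, but the same scaling applies. Carrying it through the sample formula quoted earlier, and writing ##\sigma_{\text{oz}}## and ##\sigma_{\text{kg}}## for the two standard deviations:
$$
\sigma_{\text{kg}}
= \sqrt{\frac{1}{N-1}\sum_{i=1}^N \left(\frac{x_i}{35.27} - \frac{\bar x}{35.27}\right)^2}
= \frac{1}{35.27}\sqrt{\frac{1}{N-1}\sum_{i=1}^N (x_i - \bar x)^2}
= \frac{\sigma_{\text{oz}}}{35.27},
$$
so the standard deviation does not stay the same; it is divided by the same factor of 35.27 as the individual weights.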
 