Stephen Tashi
It gets too confusing to discuss more than one or two issues per post. Let's start with the first passage you mentioned.
Look at the statement ##P(A) = \lim_{n \rightarrow \infty}\frac{n_A}{n}##. This is an intellectually dishonest statement. It treats a stochastic outcome ##n_A## as if it were an ordinary real-valued variable, and it invites the reader to think that ##\lim_{n \rightarrow \infty}## refers to the type of limit defined in elementary calculus.
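To make the point concrete, here is a minimal simulation sketch (the helper name relative_frequency and the Bernoulli(p) setup are just for illustration, assuming independent trials): for a fixed ##n##, each run of the experiment yields a different value of ##\frac{n_A}{n}##, so there is no single sequence of real numbers whose ordinary limit one could take.

```python
import random

def relative_frequency(n, p=0.5, seed=None):
    """Relative frequency n_A / n of an event A in n independent Bernoulli(p) trials."""
    rng = random.Random(seed)
    n_A = sum(1 for _ in range(n) if rng.random() < p)
    return n_A / n

# For the same n, n_A/n is itself a random outcome: each run of the
# experiment produces a different value, so "lim n_A/n" cannot be the
# ordinary calculus limit of one fixed sequence of real numbers.
for seed in range(5):
    print(f"run {seed}: n_A/n = {relative_frequency(10_000, seed=seed):.4f}")
```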
Look for other sources to find a respectable statement of what "the relative frequency interpretation of probability" is.
You should also study the Law of Large Numbers. You like to mention it - figure out what it really says.
The limits involved in the weak and strong law of large numbers do not have the same definition as a limit in introductory calculus. These limits have modifiers like "in probability" or "almost surely". Until you work through how these limits are defined, you don't understand the law of large numbers.
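As a sketch of what those modifiers do, here are the two statements specialized to the relative frequency ##\frac{n_A}{n}## (assuming independent, identically distributed trials).

The weak law asserts convergence "in probability": for every ##\epsilon > 0##,
$$\lim_{n \rightarrow \infty} P\!\left(\left|\frac{n_A}{n} - P(A)\right| > \epsilon\right) = 0.$$
The limit here is an ordinary calculus limit, but it is taken of the sequence of probabilities, not of ##\frac{n_A}{n}## itself.

The strong law asserts convergence "almost surely":
$$P\!\left(\lim_{n \rightarrow \infty} \frac{n_A}{n} = P(A)\right) = 1.$$
An ordinary limit does appear here, but only inside a probability statement: the set of outcomes for which the limit exists and equals ##P(A)## has probability 1, which is weaker than saying the limit holds for every outcome.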
(By the way, the von Mises approach to probability theory shouldn't be completely written off. In browsing the web, I see that people may have fixed it up to be rigorous, and it apparently has a connection to attempts to connect randomness with computability.)