DrummingAtom said:
I'm just confused why definitions don't need a proof.
From the point of view of proof and logic, definitions are arbitrary. They are simply conventions. So there is nothing to prove about a definition. For example, if a book has a passage that says "Let X be a random variable" there is no reason the book needs to
prove that X is a random variable. It is simply adopting the convention that "X" denotes a random variable.
Definitions can be right or wrong from a sociological and cultural point of view. Someone might criticize a definition by saying "That's not how most people define a ...". However, that type of controversy is subjective, so you can't use a proof to settle such matters.
Occasionally books make definitions that only make sense if certain claims are already proven. For example, if we say "The number 0 is defined to be the number such that for any real number x, x + 0 = x", this use of the word "the" could be taken to mean that there is one and only one number 0. However, the fact that there is only one number with the properties of 0 does need a proof. In rigorous mathematical books, "a" zero is first defined and then a proof is given that there is only one zero, which justifies speaking of "the" number zero.
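A quick sketch of that uniqueness argument (using nothing beyond commutativity of addition): if 0 and 0' both satisfy the defining property, then

0' = 0' + 0 = 0 + 0' = 0,

so any two candidates coincide, and speaking of "the" zero is legitimate.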
Likewise, defining the conditional probability P(A|B) to be \frac{P(A \cap B)}{P(B)} describes the thing defined as unique ("the") and as a "probability", but the definition itself is not a proof that the quantity \frac{P(A \cap B)}{P(B)} is unique, or that it is a number in the interval [0,1] as a probability ought to be. So that aspect of the definition does need some proof. However, I think those things are easy to establish.
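For instance, here is a sketch of why the quantity lands in [0,1] (assuming P(B) > 0, which the definition needs anyway to make sense): since A \cap B \subseteq B, monotonicity of P gives 0 \le P(A \cap B) \le P(B), and dividing by P(B) yields

0 \le \frac{P(A \cap B)}{P(B)} \le 1.

Checking that A \mapsto P(A|B) is countably additive works the same way, by applying the additivity of P to the sets A_i \cap B, so P(\cdot|B) really is a probability measure.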