MHB What is the probability the second orange is sour if the first one is sour?

Summary
The discussion centers on calculating the probability that the second orange is sour given that the first one is sour, based on two farmers' oranges. Farmer A has 10% sour oranges, while Farmer B has 4%. The client randomly selects a farmer and then two oranges, leading to the application of Bayes' Theorem for the calculation. The probability of both oranges being sour is derived from the individual probabilities of selecting each farmer and their respective sour orange rates. The final computed probability of the second orange being sour, given the first is sour, is 29/350.
evinda
Hello! (Wave)

A farmer $A$ has oranges, $10\%$ of which are sour. A farmer $B$ has oranges, $4\%$ of which are sour. A client chooses a farmer at random (each with probability $\frac{1}{2}$) and then two of his oranges.
What is the probability that the second orange he chooses is also sour, given that the first one is sour?

I drew the following diagram: View attachment 3911. Could this help? (Thinking)
 

Attachments

  • Tbiwo.jpg (22.4 KB)
Hi! (Thinking)

That should help, but I think a couple of levels are missing in the decision tree.

I think the first decision the client makes (or rather, the first probability event that occurs) is whether he chooses farmer A or farmer B. (Thinking)
So the root of the tree should have 2 children: A and B, each with probability 1/2.

What do you think the second decision or event is? (Wondering)
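
For reference, here is a minimal Python sketch (not from the thread) of how the leaves of such a three-level tree could be enumerated — farmer choice, then first orange, then second orange — assuming each orange from the chosen farmer is sour independently with that farmer's rate:

```python
from itertools import product

# Assumed rates: probability that an orange from each farmer is sour.
farmers = {"A": 0.10, "B": 0.04}

p_first_sour = 0.0   # P(1st sour)
p_both_sour = 0.0    # P(1st sour and 2nd sour)

for farmer, p_sour in farmers.items():
    p_farmer = 0.5   # the client picks either farmer with probability 1/2
    for first, second in product([True, False], repeat=2):
        # Probability of this leaf of the tree: farmer choice times
        # the (assumed independent) outcomes of the two oranges.
        p_leaf = p_farmer
        p_leaf *= p_sour if first else 1 - p_sour
        p_leaf *= p_sour if second else 1 - p_sour
        if first:
            p_first_sour += p_leaf
            if second:
                p_both_sour += p_leaf

print(p_first_sour)                 # ≈ 0.07   (= 7/100)
print(p_both_sour)                  # ≈ 0.0058 (= 29/5000)
print(p_both_sour / p_first_sour)   # ≈ 0.0829 (= 29/350)
```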
 
Hello, evinda!

Farmer $A$ has oranges, 10% of which are sour.
Farmer $B$ has oranges, 4% of which are sour.
A client chooses a farmer at random, then two of his oranges.
What is the probability, if the first orange is sour,
that the second is also sour?
Bayes' Theorem: $\:P(\text{2nd sour }|\text{ 1st sour}) \;=\;\dfrac{P(\text{both sour})}{P(\text{1st sour})}$

(Each orange from a given farmer is taken to be sour independently, with that farmer's rate.)

$P(A) = \frac{1}{2}$
$P(A \wedge \text{1st sour}) = \frac{1}{2}\frac{10}{100}$
$P(A \wedge \text{both sour}) = \frac{1}{2}(\frac{10}{100})^2$

$P(B) = \frac{1}{2}$
$P(B \wedge \text{1st sour}) = \frac{1}{2}\frac{4}{100}$
$P(B \wedge \text{both sour}) = \frac{1}{2}(\frac{4}{100})^2$

$P(\text{both sour}) \:=\: \frac{1}{2}(\frac{10}{100})^2 + \frac{1}{2}(\frac{4}{100})^2 \:=\:\frac{116}{20,000} \:=\:\frac{29}{5000}$

$P(\text{1st sour}) \:=\:\frac{10}{200} + \frac{4}{200} \:=\:\frac{14}{200} \:=\:\frac{7}{100}$

Therefore: $\:P(\text{2nd sour }|\text{ 1st sour}) \;=\; \dfrac{\frac{29}{5000}}{\frac{7}{100}} \;=\;\dfrac{29}{350}$
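
As a quick sanity check (not part of the original post), a short Monte Carlo simulation under the same independence assumption should give an empirical conditional frequency close to $29/350 \approx 0.0829$:

```python
import random

# Assumed model: pick a farmer uniformly at random, then each of his two
# oranges is sour independently with that farmer's rate.
rates = {"A": 0.10, "B": 0.04}

first_sour = both_sour = 0
for _ in range(1_000_000):
    p = rates[random.choice(["A", "B"])]
    first = random.random() < p
    second = random.random() < p
    if first:
        first_sour += 1
        if second:
            both_sour += 1

print(both_sour / first_sour)   # empirical estimate, typically within ~0.003 of the exact value
print(29 / 350)                 # exact answer: 0.0828571...
```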
 