
Minimax Error (decision theory problem)

  Jul 20, 2010 #1
    1. The problem statement, all variables and given/known data

    I've been trying to teach myself some Decision Theory using "Pattern Classification" by Duda, Hart, and Stork. It's a great book, but the homework problems are giving me a lot of trouble. This one, concerning minimax, is particularly difficult:

    "Assume we have one-dimensional Gaussian distributions [tex]p(x|\omega_{i})=N(\mu_{i},\sigma_{i}^{2})[/tex] for i=1,2, but completely unknown prior probabilities. Use the minimax criterion to find the optimal decision point x* in terms of [tex]\mu_{i}[/tex] and [tex]\sigma_{i}[/tex] under a zero-one risk."

    Just to clarify, the zero-one risk means the loss function is 1 for an incorrect decision, and 0 for a correct decision.
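
    In symbols, the zero-one loss is

    [tex]cost(a_{i}|\omega_{j}) = \begin{cases} 0 & \text{if } i = j \\ 1 & \text{if } i \neq j \end{cases}[/tex]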

    2. The attempt at a solution

    Now, my understanding of the minimax criterion is that we want to minimize the maximum Bayes error over all possible priors. So I was thinking that the goal is to maximize the conditional Bayes risk. In the following:

    [tex]a_{i}[/tex] = the action of deciding class [tex]i[/tex]
    [tex]\omega_{j}[/tex] = the so-called "state of nature" (i.e., the event that class [tex]j[/tex] is the true one)
    [tex]x[/tex] = the data
    [tex]c[/tex] = the number of states of nature

    [tex]R(a_i|x) = \sum_{j=1}^{c} cost(a_{i}|\omega_{j}) \cdot P(\omega_{j}|x)[/tex]

    Since the loss function is just 1 if [tex]i \neq j[/tex] and 0 if [tex]i=j[/tex],

    [tex]R(a_{i}|x) = \sum_{j \neq i} P(\omega_{j}|x)[/tex]
    [tex]= 1 - P(\omega_{i}|x)[/tex]
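
    (Just to convince myself the formula behaves, here's a quick numerical check in Python. The means, variances, and priors below are made-up example values, not part of the problem:)

    [code]
    import numpy as np
    from scipy.stats import norm

    # Made-up example parameters, just for illustration
    mu     = np.array([0.0, 2.0])   # means mu_1, mu_2
    sigma  = np.array([1.0, 2.0])   # std devs sigma_1, sigma_2
    priors = np.array([0.3, 0.7])   # P(omega_1), P(omega_2) - arbitrary

    x = 0.5                         # an arbitrary observation

    # Class-conditional densities p(x|omega_j), then posteriors via Bayes' rule
    likelihoods = norm.pdf(x, mu, sigma)
    posteriors  = likelihoods * priors / np.sum(likelihoods * priors)

    # Conditional risk under zero-one loss, computed two ways:
    # R(a_i|x) = sum_{j != i} P(omega_j|x)  and  R(a_i|x) = 1 - P(omega_i|x)
    loss = 1 - np.eye(2)            # zero-one loss matrix
    print(loss @ posteriors)        # [R(a_1|x), R(a_2|x)]
    print(1 - posteriors)           # same numbers
    [/code]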

    OK, so the risk expressions above are already developed in the book. I figure all I have to do is maximize the conditional error [tex]R(a_{i}|x)[/tex], and then I'll have the "decision point" x* - the point of maximal Bayes risk - which should (I think) turn out to be independent of the prior probabilities [tex]P(\omega_{i})[/tex]. So that's what I did:

    (here I'll use the abbreviation [tex]N_{i} = N(\mu_{i},\sigma_{i}^2)[/tex] = normal distribution with mean [tex]\mu_{i}[/tex] and variance [tex]\sigma_{i}^{2}[/tex]):

    [tex]R(a_{i}|x) = 1 - P(\omega_{i}|x)[/tex]
    [tex] = 1 - \frac{p(x|\omega_{i}) \cdot P(\omega_{i})}{p(x)}[/tex] (Bayes' Rule)
    [tex] = 1 - \frac{N_{i} \cdot P(\omega_{i})}{N_{i} \cdot P(\omega_{i})+N_{j} \cdot P(\omega_{j})} [/tex]
    [tex] = 1 - \frac{1}{1+\frac{P(\omega_{j})}{P(\omega_{i})} \cdot \frac{N_{j}}{N_{i}}} [/tex]

    To maximize this, we maximize [tex]N_{j}/N_{i}[/tex] (the prior ratio [tex]P(\omega_{j})/P(\omega_{i})[/tex] is a positive constant in [tex]x[/tex], so the maximizer is independent of the priors). I maximize the ratio just by taking the first derivative and setting it equal to 0:

    [tex]0 = \frac{N_{i} \cdot N_{j}' - N_{j} \cdot N_{i}'}{N_{i}^2}[/tex]
    [tex]\Rightarrow N_{i} \cdot N_{j}' = N_{j} \cdot N_{i}'[/tex]
    [tex]\Rightarrow N_{i} \cdot \frac{-(x-\mu_{j})}{\sigma_{j}^{2}} \cdot N_{j} = N_{j} \cdot \frac{-(x-\mu_{i})}{\sigma_{i}^{2}} \cdot N_{i}[/tex] (using [tex]N_{k}' = \frac{-(x-\mu_{k})}{\sigma_{k}^{2}} \cdot N_{k}[/tex], the derivative of a normal density)
    [tex]\Rightarrow \frac{x-\mu_{j}}{\sigma_{j}^{2}} = \frac{x-\mu_{i}}{\sigma_{i}^{2}}[/tex]
    [tex]\Rightarrow x = \frac{\sigma_{i}^{2} \cdot \mu_{j} - \sigma_{j}^{2} \cdot \mu_{i}}{\sigma_{i}^{2} - \sigma_{j}^{2}}[/tex]

    So that's what I think x* is. The trouble is that it doesn't make much sense (what happens when [tex]\sigma_{i} = \sigma_{j}[/tex] and the denominator vanishes?), and I'm not totally sure this is the right way of thinking about things anyway.
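
    (Here's a quick numerical sanity check of that algebra in Python, again with arbitrary made-up parameters. It locates the stationary point of [tex]N_{j}/N_{i}[/tex] on a grid, compares it with the closed form, and tests whether it's actually a maximum:)

    [code]
    import numpy as np
    from scipy.stats import norm

    # Arbitrary example parameters for classes i and j
    mu_i, sigma_i = 0.0, 1.0
    mu_j, sigma_j = 2.0, 2.0

    # Candidate stationary point from the algebra above
    x_star = (sigma_i**2 * mu_j - sigma_j**2 * mu_i) / (sigma_i**2 - sigma_j**2)

    # Numerically locate the stationary point of log(N_j / N_i) on a fine grid
    xs = np.linspace(-10.0, 10.0, 200001)
    log_ratio = norm.logpdf(xs, mu_j, sigma_j) - norm.logpdf(xs, mu_i, sigma_i)
    x_num = xs[np.argmin(np.abs(np.gradient(log_ratio, xs)))]
    print(x_star, x_num)   # these should agree

    # Check whether the stationary point is a local max or min of the ratio
    r = lambda x: norm.pdf(x, mu_j, sigma_j) / norm.pdf(x, mu_i, sigma_i)
    print("local max?", r(x_star) > max(r(x_star - 0.5), r(x_star + 0.5)))
    [/code]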

    Thanks for any help you can provide, even if it's just to tell me I'm completely wrong about my approach.
  Jul 21, 2010 #2
    You might want to look at expectimax, since you are dealing with probabilities.
  Sep 2, 2012 #3
    I don't think you actually used the minimax criterion when solving this problem, so you can't be sure that the answer is independent of the prior probabilities...