
Dependencies of Inference on Information Theory.

  1. Jul 2, 2014 #1
    I understand how classical or Bayesian statistical inference is often very helpful for solving information theory problems, or for improving data management or the tuning of learning algorithms. But the other way around (using information theory to guide inference) isn't clear to me. Is information theory knowledge necessary (or at least recommended) for solving inference problems, such as parameter estimation?
     
  3. Jul 2, 2014 #2

    marcusl

    Science Advisor
    Gold Member

    There is a huge literature on using information theory and Bayesian inference to perform parameter estimation. In many (most) problems, the number of hypotheses that must be tested is astronomically large, precluding a direct solution. The literature is full, therefore, of approximations and compromises to make an estimation problem practical.
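    As a hypothetical illustration of why the hypothesis space blows up: a brute-force Bayesian estimate evaluates the posterior at every candidate parameter value, and with d parameters and N grid points each the count grows as N**d. A minimal sketch in one dimension (toy problem, assumed Gaussian data and a flat prior, not from the thread):

    ```python
    import numpy as np

    # Toy brute-force Bayesian estimation: recover the mean of Gaussian data
    # by scoring every hypothesis on a grid. With d parameters this grid
    # would have N**d points -- the combinatorial blow-up mentioned above.
    rng = np.random.default_rng(1)
    data = rng.normal(loc=2.5, scale=1.0, size=100)

    grid = np.linspace(-10.0, 10.0, 2001)        # candidate means (hypotheses)
    log_like = -0.5 * ((data[:, None] - grid[None, :]) ** 2).sum(axis=0)
    log_post = log_like                          # flat prior: posterior ∝ likelihood
    estimate = grid[np.argmax(log_post)]         # MAP estimate, near the true 2.5
    ```

    Even this one-parameter grid needs 2001 likelihood evaluations; ten parameters at the same resolution would need 2001**10, which is why the literature leans on approximations.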

    Sometimes an exact solution is possible. One example is in detecting the presence of a radar return in noise. Cook and Bernfeld's text "Radar Signals" shows that in this case, the same optimal detector design results from a) maximizing the output signal-to-noise ratio, b) applying statistical decision theory, and c) solving the problem using Bayesian inverse probability.
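    The equivalence can be checked numerically in the simplest case: for a known signal in additive white Gaussian noise, the Bayesian log-likelihood ratio is an affine function of the matched-filter (max-SNR) output, so the two detectors rank observations identically. A minimal sketch (assumed AWGN model, illustrative signal, not from the text):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    sigma = 1.0
    s = np.sin(2 * np.pi * 0.1 * np.arange(64))   # known signal template

    def matched_filter(x, s):
        # Max-SNR detector: correlate the observation with the template.
        return s @ x

    def log_likelihood_ratio(x, s, sigma):
        # log p(x | signal present) - log p(x | noise only) for AWGN:
        # affine in the matched-filter output s @ x.
        return (s @ x) / sigma**2 - (s @ s) / (2 * sigma**2)

    x = s + sigma * rng.standard_normal(s.size)   # noisy observation
    mf = matched_filter(x, s)
    llr = log_likelihood_ratio(x, s, sigma)
    assert np.isclose(llr, mf / sigma**2 - (s @ s) / (2 * sigma**2))
    ```

    Since the LLR is a monotonically increasing function of the matched-filter statistic, thresholding either one yields the same decisions, which is the equivalence the Cook and Bernfeld result describes for this case.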
     
    Last edited: Jul 2, 2014