
Relative Entropy or Kullback Leibler divergence

  1. May 13, 2015 #1
    1. The problem statement, all variables and given/known data
    I am supposed to calculate the relative entropy between two sets of data:
    Base set
    Set 1:
    A C G T
    0 0 0 10
    0 0 0 10
    0 0 10 0
    0 10 0 0
    10 0 0 0
    * * * * //Randomized
    0 0 0 10
    0 10 0 0

    Set 2:
    A C G T
    0 0 0 10
    0 0 0 10
    0 0 10 0
    0 10 0 0
    10 0 0 0
    1 4 1 4
    0 0 0 10
    0 10 0 0


    These are frequency-of-occurrence matrices. Set 2 is a matrix created after a variable number of characters was mutated. In this case only one character in the third row from the bottom was mutated; that's why this row has no 10s. Every other position didn't mutate, so it has the same counts as Set 1. I have 70 other sets of this data with various numbers of mutations and lengths.
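    As a sketch of how such a calculation might go (this is an illustration, not necessarily the method the poster eventually used): convert each row of counts into a probability distribution, add a pseudocount so no entry is exactly zero (log(0) is undefined), compute the KL divergence per position, and sum over positions. The pseudocount value of 1 and the uniform background substituted for the randomized `* * * *` row of Set 1 are assumptions made for this sketch.

    ```python
    import math

    # Count matrices from the post (rows = positions, columns = A, C, G, T).
    # The randomized row of Set 1 has unknown counts ("* * * *"), so as an
    # assumption we use a uniform background at that position.
    set1 = [
        [0, 0, 0, 10],
        [0, 0, 0, 10],
        [0, 0, 10, 0],
        [0, 10, 0, 0],
        [10, 0, 0, 0],
        None,  # randomized row, counts unknown
        [0, 0, 0, 10],
        [0, 10, 0, 0],
    ]
    set2 = [
        [0, 0, 0, 10],
        [0, 0, 0, 10],
        [0, 0, 10, 0],
        [0, 10, 0, 0],
        [10, 0, 0, 0],
        [1, 4, 1, 4],
        [0, 0, 0, 10],
        [0, 10, 0, 0],
    ]

    def to_probs(counts, pseudocount=1.0):
        """Turn raw counts into probabilities, adding a pseudocount so
        that no entry is exactly zero."""
        smoothed = [c + pseudocount for c in counts]
        total = sum(smoothed)
        return [c / total for c in smoothed]

    def kl_divergence(p, q):
        """D(P || Q) = sum_i p_i * log2(p_i / q_i), in bits."""
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q))

    total = 0.0
    for row1, row2 in zip(set1, set2):
        if row1 is None:          # unknown base row: assume uniform background
            row1 = [1, 1, 1, 1]
        d = kl_divergence(to_probs(row2), to_probs(row1))
        total += d
        print(f"{row2} -> D = {d:.4f} bits")
    print(f"total relative entropy: {total:.4f} bits")
    ```

    With the pseudocount, positions whose counts are identical in both sets contribute exactly zero, so only the mutated row adds to the total; larger pseudocounts pull both distributions toward uniform and shrink the divergence.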

    I am trying to read about this online but the information is convoluted and often seems to actively avoid defining variables. Can someone walk me through the process?

    2. Relevant equations
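
    For reference (this was left blank in the original post), the standard definition of relative entropy between two discrete distributions ##P## and ##Q## over the same alphabet is:

    $$D_{KL}(P \| Q) = \sum_i p_i \log_2 \frac{p_i}{q_i}$$

    Using ##\log_2## gives the answer in bits. By convention a term with ##p_i = 0## contributes 0, while ##q_i = 0## with ##p_i > 0## makes the divergence infinite, which is why zero counts are usually smoothed with a pseudocount before computing it.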


    3. The attempt at a solution
     
  3. May 13, 2015 #2
    Nevermind, I've got it!
     