Calculus of Variations on Kullback-Leibler Divergence

  • Thread starter Master1022
  • #1
Summary:
How to use calculus of variations on KL-divergence
Hi,

This isn't a homework question, but a side task given in a machine learning class I am taking.

Question: Using variational calculus, prove that one can minimize the KL-divergence by choosing ##q## to be equal to ##p##, given a fixed ##p##.

Attempt:

Unfortunately, I have never seen calculus of variations before (we were told to teach ourselves). I have been watching some videos online, but I mostly see references to the Euler-Lagrange equations, which I don't think are relevant here (please correct me if I am wrong), and not much explanation of functional derivatives.

Nonetheless, I don't think this should be too hard, but I am struggling to understand how to use the tools.

If we start with the definition of the KL-divergence we get:
[tex] \text{KL}[p \| q] = \int p(x) \log\left(\frac{p(x)}{q(x)}\right) \, dx = I [/tex]
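As a quick numerical sanity check of the claim (for discrete distributions, where the integral becomes a sum; the helper name `kl` and the particular `p` below are just illustrative):

```python
import numpy as np

def kl(p, q):
    # Discrete KL divergence; assumes p and q are strictly positive and sum to 1.
    return np.sum(p * np.log(p / q))

p = np.array([0.5, 0.3, 0.2])
print(kl(p, p))  # → 0.0, the minimum, attained at q = p

# Any other valid distribution q gives a strictly larger value (Gibbs' inequality).
rng = np.random.default_rng(0)
for _ in range(5):
    q = rng.dirichlet(np.ones(3))
    assert kl(p, q) > 0
```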

Would it be possible for anyone to help me get started on the path? I am not really sure how to proceed after writing down ## \frac{\delta I}{\delta q} ##.

Thanks in advance
 

Answers and Replies

  • #2
Office_Shredder
Staff Emeritus
Science Advisor
Gold Member
The Euler-Lagrange equation is what you want, but you also have to account for the constraints on ##q## that come from it being a probability distribution, namely that its integral is 1 and it is nonnegative everywhere. I think the integral constraint is the important part.

http://liberzon.csl.illinois.edu/teaching/cvoc/node38.html

has some notes on how to add constraints to the Euler-Lagrange equations.
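To sketch how that plays out here (treating ##p## as fixed and enforcing only the normalization constraint with a Lagrange multiplier ##\lambda##), form the augmented functional

[tex] J[q] = \int p(x) \log\left(\frac{p(x)}{q(x)}\right) dx + \lambda \left( \int q(x) \, dx - 1 \right). [/tex]

Since the integrand contains no derivatives of ##q##, the Euler-Lagrange equation reduces to setting the ordinary partial derivative of the integrand with respect to ##q(x)## to zero:

[tex] \frac{\delta J}{\delta q(x)} = -\frac{p(x)}{q(x)} + \lambda = 0 \quad \Rightarrow \quad q(x) = \frac{p(x)}{\lambda}. [/tex]

Imposing ##\int q(x) \, dx = 1## then forces ##\lambda = 1##, hence ##q = p##. Conveniently, the nonnegativity constraint is inactive here: ##q = p / \lambda \geq 0## holds automatically because ##p \geq 0##.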
 
