KL Divergence

 
 
The Kullback–Leibler (KL) divergence measures how one probability distribution ''P'' differs from a second distribution ''Q''. Although it is often used as a notion of distance between distributions, it is not a true metric: it is not symmetric, and it does not satisfy the triangle inequality.

For probability distributions ''P'' and ''Q'' of a [[discrete random variable]], their K–L divergence is defined to be
 
:<math>D_{\mathrm{KL}}(P\|Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}. \!</math>
 

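As a quick illustration, here is a minimal Python sketch of this definition (not part of the original page; the function name <code>kl_divergence</code> and the example distributions are made up for illustration). It uses the natural logarithm and the usual conventions that <math>0 \log 0 = 0</math> and that the divergence is infinite when ''P'' assigns probability to an outcome where ''Q'' assigns none.

<syntaxhighlight lang="python">
import math

def kl_divergence(p, q):
    """D_KL(P || Q) for discrete distributions given as
    aligned sequences of probabilities that each sum to 1."""
    total = 0.0
    for p_i, q_i in zip(p, q):
        if p_i == 0.0:
            continue  # convention: 0 * log(0 / q) = 0
        if q_i == 0.0:
            return float("inf")  # P has mass where Q has none
        total += p_i * math.log(p_i / q_i)
    return total

# KL divergence is asymmetric: the two orderings generally differ.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # ~0.5108 nats
print(kl_divergence(q, p))  # ~0.3681 nats
</syntaxhighlight>

The example also makes the asymmetry concrete: <math>D_{\mathrm{KL}}(P\|Q) \ne D_{\mathrm{KL}}(Q\|P)</math> in general.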