KL Divergence
KL divergence is a measure of the difference between two probability distributions ''P'' and ''Q''. Although it is often loosely described as a distance, it is not a true metric: it is not symmetric in ''P'' and ''Q'' and does not satisfy the triangle inequality. It is always non-negative, and equals zero if and only if ''P'' = ''Q''.

For probability distributions ''P'' and ''Q'' of a [[discrete random variable]], their K–L divergence is defined to be

:<math>D_{\mathrm{KL}}(P\|Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}.</math>
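For concreteness, here is a minimal sketch of this sum in Python (assuming NumPy is available; the function name <code>kl_divergence</code> and the example distributions are illustrative and not part of the original page):

<pre>
import numpy as np

def kl_divergence(p, q):
    # D_KL(P || Q) for discrete distributions given as
    # arrays of probabilities that each sum to 1.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Terms with P(i) = 0 contribute 0, by the convention 0 * log 0 = 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Example: two distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # positive, and 0 only when p == q
</pre>

Note that the sum is finite only when ''Q''(''i'') > 0 wherever ''P''(''i'') > 0; otherwise the divergence is taken to be infinite.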