Suppose I have two pdfs, for example:
from scipy import stats
pdf_y = stats.beta(5, 9).pdf
pdf_x = stats.beta(9, 5).pdf
I want to compute the KL divergence between them. Before I reinvent the wheel, is there any built-in function in the PyData ecosystem that does this?
Best Answer
The KL divergence is available as scipy.stats.entropy. From the docstring:
stats.entropy(pk, qk=None, base=None)
Calculate the entropy of a distribution for given probability values.
If only probabilities `pk` are given, the entropy is calculated as
``S = -sum(pk * log(pk), axis=0)``.
If `qk` is not None, then compute a relative entropy (also known as
Kullback-Leibler divergence or Kullback-Leibler distance)
``S = sum(pk * log(pk / qk), axis=0)``.
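Note that stats.entropy operates on discrete probability vectors, not on density functions, so to apply it to the continuous pdfs above you would first evaluate them on a grid. A minimal sketch along those lines (the grid resolution and the quad-based cross-check are my own additions, not part of the original answer):

import numpy as np
from scipy import stats, integrate

p = stats.beta(5, 9)
q = stats.beta(9, 5)

# Evaluate both densities on a uniform grid over the shared support (0, 1),
# skipping the endpoints where both pdfs vanish.
x = np.linspace(0, 1, 1001)[1:-1]

# stats.entropy normalizes its inputs to sum to 1, and on a uniform grid the
# bin width cancels in pk/qk, so this approximates the continuous integral
# KL(p || q) = integral of p(x) * log(p(x) / q(x)) dx.
kl_grid = stats.entropy(p.pdf(x), q.pdf(x))

# Cross-check by numerically integrating the definition directly.
kl_quad, _ = integrate.quad(lambda t: p.pdf(t) * np.log(p.pdf(t) / q.pdf(t)), 0, 1)

print(kl_grid, kl_quad)  # both should come out near 2.54 nats for this pair

The quad call follows the definition directly, while the grid approach generalizes to any pair of densities you can evaluate, at the cost of a discretization error controlled by the grid resolution.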
Related question on Stack Overflow: python - KL divergence of continuous pdfs: https://stackoverflow.com/questions/22097409/