KL Divergence

Posted on 03-29-2022

Intro

KL Divergence:

$$D(p\Vert q)=\sum_x p(x)\log{p(x)\over q(x)}$$

The KL divergence measures the information lost when the distribution q(x) is used to approximate p(x): the less faithfully q(x) captures the information contained in p(x), the larger the KL divergence becomes (a short numerical sketch is given after the reference below).

Reference

https://zhuanlan.zhihu.com/p/95687720
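To make the formula concrete, here is a minimal sketch in Python (assuming NumPy is available; the two example distributions are made up purely for illustration) that evaluates D(p‖q) for discrete distributions:

```python
import numpy as np

def kl_divergence(p, q):
    """D(p||q) = sum_x p(x) * log(p(x) / q(x)) for discrete distributions.

    Terms with p(x) == 0 contribute 0 by convention; q(x) must be > 0
    wherever p(x) > 0, otherwise the divergence is infinite.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical example: q is a rough approximation of p.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(kl_divergence(p, q))  # ~0.0253 nats
print(kl_divergence(q, p))  # ~0.0258 nats -- KL divergence is not symmetric
```

Note that the two directions give different values: KL divergence is not a distance metric, and D(p‖q) grows quickly when q assigns little probability to outcomes that p considers likely.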