Topic modeling: how to interpret the Kullback-Leibler divergence

After getting probability distributions for different documents from Mallet, I used the following code to compute the KL divergence between the first and second documents:

        double kl = Maths.klDivergence(d1, d2);

How should I interpret the result? For example, I get 12.3640. What does this mean? Are the two distributions close or far apart?

1 answer

The KL divergence measures how much one probability distribution differs from another. It is always non-negative, and it equals 0 only when the two distributions are identical; the larger the value, the more the distributions differ. Note that it is not symmetric (in general, KL(p, q) ≠ KL(q, p)), so it is not a true distance metric. A value like 12.364 therefore indicates that the two document distributions are quite far apart.
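For intuition, here is a standalone sketch of the computation. Mallet's actual implementation is `cc.mallet.util.Maths.klDivergence`; this self-contained version (including the log-base-2 convention) is an illustrative assumption, not Mallet's code:

```java
public class KlDemo {
    // D(p || q) = sum_i p[i] * log2(p[i] / q[i])
    // Sketch only: assumes p and q are same-length, normalized distributions.
    static double klDivergence(double[] p, double[] q) {
        double sum = 0.0;
        for (int i = 0; i < p.length; i++) {
            if (p[i] == 0.0) continue; // 0 * log(0 / q) is taken as 0
            if (q[i] == 0.0) return Double.POSITIVE_INFINITY; // q has no mass where p does
            sum += p[i] * (Math.log(p[i] / q[i]) / Math.log(2));
        }
        return sum;
    }

    public static void main(String[] args) {
        double[] p = {0.5, 0.5};
        double[] identical = {0.5, 0.5};
        double[] skewed = {0.9, 0.1};
        System.out.println(klDivergence(p, identical)); // identical distributions -> 0.0
        System.out.println(klDivergence(p, skewed));    // different distributions -> positive value
    }
}
```

Running this shows that identical distributions give exactly 0, while distributions that disagree give a strictly positive value that grows with the disagreement — which is why 12.364 signals two very different distributions.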


Source: https://habr.com/ru/post/1531313/
