Testing MathJax support in Hugo

Feb 6, 2024 | 1 min read

This is the formula for the Kullback-Leibler (hence the KL) divergence between two probability distributions. Essentially, it's a way to measure the difference between the predicted probability distribution $\hat{y}$ and the true probability distribution $y$.


$$
KL(\hat{y} \,\|\, y) = \sum_{c=1}^{M} \hat{y}_c \log \frac{\hat{y}_c}{y_c}
$$

$$
JS(\hat{y} \,\|\, y) = \frac{1}{2} \left( KL\!\left(y \,\Big\|\, \frac{y + \hat{y}}{2}\right) + KL\!\left(\hat{y} \,\Big\|\, \frac{y + \hat{y}}{2}\right) \right)
$$
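As a quick sanity check of the formulas, here is a minimal NumPy sketch. The distributions `y_hat` and `y` are made-up example values over three classes, not anything from a real model:

```python
import numpy as np

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions given as arrays of probabilities."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum only over classes where p > 0; the 0 * log(0) terms are treated as 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def js_divergence(p, q):
    """JS(p || q): symmetrized KL against the mixture m = (p + q) / 2."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = (p + q) / 2
    return 0.5 * (kl_divergence(p, m) + kl_divergence(q, m))

# Hypothetical predicted vs. true distribution over M = 3 classes.
y_hat = [0.7, 0.2, 0.1]
y = [0.5, 0.3, 0.2]
print(kl_divergence(y_hat, y))  # KL(y_hat || y)
print(js_divergence(y_hat, y))  # JS(y_hat || y), symmetric in its arguments
```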

There is a slight issue with what was written above: the inline math isn't rendering the way it should. Specifically, the $\hat{y}$ symbol should look the same inline as it does in the display formula, which renders just fine.