This is the formula for Kullback-Leibler (hence the KL) divergence between two probability distributions. Essentially, it's a way to measure the difference between the predicted probability distribution \(\hat{y}\) and the true probability distribution \(y\).
\begin{aligned}
KL(\hat{y} || y) &= \sum_{c=1}^{M}\hat{y}_c \log{\frac{\hat{y}_c}{y_c}} \\
JS(\hat{y} || y) &= \frac{1}{2}(KL(y||\frac{y+\hat{y}}{2}) + KL(\hat{y}||\frac{y+\hat{y}}{2}))
\end{aligned}