What is the maximum value of the Kullback-Leibler (KL) divergence?

I am going to use the KL divergence in my Python code, and I found this tutorial.

In that tutorial, implementing the KL divergence is quite simple:

import numpy as np

kl = (model * np.log(model / actual)).sum()


As I understand it, model and actual are probability distributions, so each of their entries should be <= 1 (and each array should sum to 1).
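
For example, with two small made-up distributions (the numbers here are placeholders, not taken from the tutorial), the snippet runs like this:

import numpy as np

model = np.array([0.1, 0.4, 0.5])    # hypothetical predicted distribution (sums to 1)
actual = np.array([0.2, 0.3, 0.5])   # hypothetical reference distribution (sums to 1)

kl = (model * np.log(model / actual)).sum()
print(kl)   # ≈ 0.046, small because the two distributions are close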

My question is: what is the maximum possible value of kl? I need to know the maximum possible value of the KL divergence so that I can use it as an upper bound in my code.

The KL divergence is unbounded. It is infinite whenever the supports differ, since $p(x)\log\bigl(p(x)/q(x)\bigr)$ blows up wherever $q(x) = 0$ but $p(x) > 0$. Or even with the same support, when one distribution has a much fatter tail than the other. Take

$$\operatorname{KL}(p \,\|\, q) = \int p(x) \log\frac{p(x)}{q(x)} \,\mathrm{d}x$$

when

$$p(x) = \frac{1}{\pi (1 + x^2)} \quad \text{(Cauchy)}, \qquad q(x) = \frac{e^{-x^2/2}}{\sqrt{2\pi}} \quad \text{(Normal)},$$

then

$$\log\frac{p(x)}{q(x)} = \frac{x^2}{2} - \log\left(1 + x^2\right) + \log\frac{\sqrt{2\pi}}{\pi} \;\longrightarrow\; \infty \quad \text{as } x \to \pm\infty,$$

and

$$\operatorname{KL}(p \,\|\, q) = \infty.$$

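To see this numerically, here is a rough sketch (it assumes scipy is available; the window sizes are arbitrary) that integrates the Cauchy-vs-Normal integrand over widening intervals $[-L, L]$; the partial integrals keep growing instead of converging:

import numpy as np
from scipy.stats import cauchy, norm

# Integrate p(x) * log(p(x)/q(x)) over [-L, L] for growing L.
# Working with log-pdfs avoids overflow in the ratio for large |x|.
for L in [10, 100, 1000, 10000]:
    x = np.linspace(-L, L, 400001)
    integrand = cauchy.pdf(x) * (cauchy.logpdf(x) - norm.logpdf(x))
    dx = x[1] - x[0]
    print(L, integrand.sum() * dx)   # grows roughly like L / pi: no finite limit
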
There do exist other distances that remain bounded, such as

• the $L^1$ distance, which is equivalent to the total variation distance,
• the Wasserstein distances (bounded when the underlying space is bounded),
• the Hellinger distance.
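
As a rough illustration (made-up two-point distributions; the value of eps is arbitrary), the total variation and Hellinger distances stay in $[0, 1]$ even as the KL divergence blows up:

import numpy as np

eps = 1e-12                       # push one probability toward zero
p = np.array([1 - eps, eps])
q = np.array([eps, 1 - eps])

kl = (p * np.log(p / q)).sum()     # ~ log(1/eps): unbounded as eps -> 0
tv = 0.5 * np.abs(p - q).sum()     # total variation: always in [0, 1]
hell = np.sqrt(0.5 * ((np.sqrt(p) - np.sqrt(q)) ** 2).sum())  # Hellinger with 1/sqrt(2) normalization: in [0, 1]

print(kl, tv, hell)   # ≈ 27.6, ≈ 1.0, ≈ 1.0

So if your code needs a finite upper bound, one of these bounded distances may be a better fit than the KL divergence.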