This measure of distance between probabilities is well known in the field of information theory. Let $P$ and $Q$ denote two probability measures over the same discrete alphabet; the Kullback-Leibler divergence is given by:
$$ D(P \,\|\, Q) \;=\; \sum_{x} P(x) \log \frac{P(x)}{Q(x)} $$
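As a minimal illustration of the formula above (not code from any cited work), the divergence can be computed directly from two probability vectors over the same support; terms where $P(x) = 0$ are conventionally taken to contribute zero:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) between two discrete
    distributions given as sequences of probabilities over the same
    support. Terms with p_i == 0 contribute nothing; q_i must be
    strictly positive wherever p_i > 0, otherwise D is infinite."""
    total = 0.0
    for p_i, q_i in zip(p, q):
        if p_i > 0.0:
            total += p_i * math.log(p_i / q_i)
    return total

# Identical distributions have zero divergence; distinct ones a positive value.
print(kl_divergence([0.5, 0.5], [0.5, 0.5]))
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))
```

Note that $D(P \| Q)$ is not symmetric in $P$ and $Q$, which is why we speak of a divergence rather than a true metric.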
This distance measure was used in [51], together with Laplace's law of succession, to correct corrupted texts using a variable-length Markov model.
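Laplace's law of succession serves to avoid zero-probability estimates, which would make the divergence infinite. A minimal sketch of the estimator (the function name is ours, not from [51]):

```python
def laplace_estimate(counts, alphabet_size):
    """Laplace's law of succession: estimate symbol probabilities from
    observed counts as (n_i + 1) / (N + k), where N is the total count
    and k the alphabet size. Every symbol receives nonzero probability,
    even those never observed."""
    total = sum(counts) + alphabet_size
    return [(n + 1) / total for n in counts]

# Symbol counts [3, 0, 1] over a 3-letter alphabet: the unseen symbol
# still gets probability 1/7 rather than 0.
print(laplace_estimate([3, 0, 1], 3))
```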
We also note that the sum in the computation of the error measure ranges over a large number of terms, so computing the distance measure is computationally expensive.