The Kullback–Leibler (KL) divergence is used to quantify the dissimilarity between posteriors of the autoregressive model coefficients, in order to assess how sensitive the coefficient posteriors are to different choices of prior. (Note that the KL divergence is not a true distance, as it is asymmetric and does not satisfy the triangle inequality.)
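The source does not specify how the posteriors are summarized, so as an illustrative sketch, suppose each posterior over the AR coefficients is approximated by a multivariate Gaussian (e.g., from MCMC sample moments). The KL divergence between two Gaussians has a closed form, and the function and example values below are hypothetical:

```python
import numpy as np

def kl_mvn(m0, S0, m1, S1):
    """Closed-form KL( N(m0, S0) || N(m1, S1) ) for multivariate Gaussians:
    0.5 * [ tr(S1^-1 S0) + (m1-m0)^T S1^-1 (m1-m0) - k + ln(det S1 / det S0) ]."""
    k = m0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = m1 - m0
    return 0.5 * (
        np.trace(S1_inv @ S0)
        + diff @ S1_inv @ diff
        - k
        + np.log(np.linalg.det(S1) / np.linalg.det(S0))
    )

# Hypothetical Gaussian summaries of AR(2) coefficient posteriors
# obtained under two different priors (prior A vs. prior B).
m_a = np.array([0.55, -0.20])
S_a = np.array([[0.010, 0.002],
                [0.002, 0.008]])
m_b = np.array([0.50, -0.15])
S_b = np.array([[0.012, 0.001],
                [0.001, 0.009]])

# A small KL value indicates the posterior is insensitive to the prior choice.
print(kl_mvn(m_a, S_a, m_b, S_b))
```

Because the divergence is asymmetric, `kl_mvn(m_a, S_a, m_b, S_b)` generally differs from `kl_mvn(m_b, S_b, m_a, S_a)`; a symmetrized version (the average of the two directions) is sometimes reported instead.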