Kullback-Leibler (KL) divergence is used to quantify the discrepancy between the posterior distributions of the autoregressive model order obtained under different priors, in order to assess how sensitive the model identification is to the choice of prior. (Note that KL divergence is not a true distance, as it is asymmetric and does not satisfy the triangle inequality.)
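As a minimal sketch of this comparison, the snippet below computes the discrete KL divergence between two posteriors over the AR model order. The posterior values are illustrative placeholders, not results from the paper:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence D(p || q) = sum_i p_i * log(p_i / q_i)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Normalise so both vectors are proper probability mass functions.
    p = p / p.sum()
    q = q / q.sum()
    # eps guards against log(0); terms with p_i == 0 contribute zero.
    return float(np.sum(np.where(p > 0, p * np.log(p / (q + eps)), 0.0)))

# Hypothetical posteriors over AR order p in {1, ..., 5} obtained
# under two different priors (illustrative numbers only).
posterior_prior_a = [0.05, 0.10, 0.60, 0.20, 0.05]
posterior_prior_b = [0.10, 0.15, 0.50, 0.15, 0.10]

d = kl_divergence(posterior_prior_a, posterior_prior_b)
print(f"KL(posterior_a || posterior_b) = {d:.4f}")
```

A small divergence indicates that the posterior over the model order is robust to the change of prior; a large one flags prior sensitivity. Because KL is asymmetric, swapping the arguments generally gives a different value.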