Analysis of families of divergences to compare Gaussian processes modeled by sums of complex exponentials disturbed by additive white noises
Language
EN
Journal article
Published in
Digital Signal Processing. 2022-04, vol. 123, p. 103436
English abstract
The purpose of this paper is first to derive the expressions of various divergences that can be expressed from the Chernoff coefficient in order to compare two probability density functions of vectors storing k consecutive samples of a sum of complex exponentials disturbed by an additive white noise. This includes, for instance, the Chernoff divergence and the α-divergence. The Tsallis, reversed Tsallis and Sharma-Mittal divergences are also addressed, as well as the β-, γ- and αγ-divergences. The behaviors of the divergences are studied as k increases and tends to infinity. Depending on the divergence used, either the divergence rate or the asymptotic normalized increment is considered, and expressions that encompass these quantities are also given. Comments and illustrations comparing random processes are then provided. This study makes it possible to show the advantages of the Kullback-Leibler divergence when studying this type of process.
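To make the setting of the abstract concrete, the following minimal numpy sketch compares two zero-mean Gaussian processes of the kind described: each covariance is that of k samples of a single complex exponential with random phase plus additive white noise, and the comparison uses the standard Kullback-Leibler divergence formula for zero-mean circularly-symmetric complex Gaussian vectors. The function names and parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def noisy_exponential_covariance(k, amp, omega, sigma2):
    # Hermitian Toeplitz covariance of k consecutive samples of
    # amp * exp(i*(omega*n + phase)) + white noise, phase uniform:
    # R[m, n] = amp^2 * exp(i*omega*(m - n)) + sigma2 * delta[m - n]
    n = np.arange(k)
    R = (amp ** 2) * np.exp(1j * omega * (n[:, None] - n[None, :]))
    return R + sigma2 * np.eye(k)

def kl_complex_gaussian(S0, S1):
    # KL divergence between zero-mean circularly-symmetric complex
    # Gaussians with covariances S0, S1:
    #   KL = tr(S1^{-1} S0) - k - ln det(S1^{-1} S0)
    k = S0.shape[0]
    M = np.linalg.solve(S1, S0)          # S1^{-1} S0
    _, logdet = np.linalg.slogdet(M)      # real since M has positive eigenvalues
    return float(np.real(np.trace(M)) - k - logdet)

# Illustrative parameters (not from the paper): two processes that
# differ in amplitude and frequency, same noise variance.
for k in (16, 64, 256):
    S0 = noisy_exponential_covariance(k, 1.0, 0.30, 0.5)
    S1 = noisy_exponential_covariance(k, 1.2, 0.35, 0.5)
    print(k, kl_complex_gaussian(S0, S1) / k)  # normalized divergence
```

Printing the divergence normalized by k for growing k gives a numerical feel for the asymptotic behavior (divergence rate) that the paper studies analytically.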
English keywords
Kullback-Leibler divergence
Chernoff coefficient
Divergence rate
Process comparison
Sum of noisy complex exponentials
Research units