An Information Theoretic Condition for Perfect Reconstruction
RIOUL, Olivier
Communications Numériques [COMNUM]
Département Communications & Electronique [COMELEC]
BÉGUINOT, Julien
Communications Numériques [COMNUM]
Département Communications & Electronique [COMELEC]
RABIET, Victor
Centre Sciences des Processus Industriels et Naturels [SPIN-ENSMSE]
École des Mines de Saint-Étienne [Mines Saint-Étienne MSE]
Laboratoire Ondes et Matière d'Aquitaine [LOMA]
Université de Bordeaux [UB]
SOULOUMIAC, Antoine
Laboratoire Instrumentation Intelligente Distribuée et Embarquée (CEA, LIST) [LIIDE (CEA, LIST)]
Language
en
Journal article
Published in
Entropy. 2024-01-19, vol. 26, n° 1, p. 86
MDPI
Abstract (English)
A new information theoretic condition is presented for reconstructing a discrete random variable X based on the knowledge of a set of discrete functions of X. The reconstruction condition is derived from Shannon's 1953 lattice theory with two entropic metrics of Shannon and Rajski. Because this theoretical material is relatively little known and dispersed across different references, we first provide a synthetic description (with complete proofs) of its concepts, such as total, common, and complementary information. The definitions and properties of the two entropic metrics are also fully detailed and shown to be compatible with the lattice structure. A new geometric interpretation of this lattice structure is then investigated, which leads to a necessary (and sometimes sufficient) condition for reconstructing the discrete random variable X given a set {X1,…,Xn} of elements in the lattice generated by X. Intuitively, the components X1,…,Xn of the original source of information X should not be globally "too far away" from X in the entropic distance for X to be reconstructable. In other words, these components should not overall have too low a dependence on X; otherwise, reconstruction is impossible. These geometric considerations constitute a starting point for a possible novel "perfect reconstruction theory", which needs to be further investigated and improved along these lines. Finally, this condition is illustrated in five specific examples of perfect reconstruction problems: the reconstruction of a symmetric random variable from the knowledge of its sign and absolute value, the reconstruction of a word from a set of linear combinations, the reconstruction of an integer from its prime signature (fundamental theorem of arithmetic) and from its remainders modulo a set of coprime integers (Chinese remainder theorem), and the reconstruction of the sorting permutation of a list from a minimal set of pairwise comparisons.
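As a minimal illustration (not taken from the paper), perfect reconstruction of X from components X1,…,Xn is equivalent to the conditional entropy H(X | X1,…,Xn) being zero. The sketch below, under the assumption of a uniform symmetric X on {-2, -1, 1, 2}, checks the abstract's first example: X is recoverable from (sign(X), |X|), but not from sign(X) alone.

```python
from collections import Counter
from math import log2

def cond_entropy(pairs):
    """H(X | Y) in bits from a list of equally weighted (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)                      # joint counts of (x, y)
    py = Counter(y for _, y in pairs)         # marginal counts of y
    # H(X|Y) = sum_{x,y} p(x,y) * log2( p(y) / p(x,y) )
    return sum(c / n * log2(py[y] / c) for (x, y), c in pxy.items())

# Hypothetical example: X uniform on a symmetric support.
xs = [-2, -1, 1, 2]

# Observing both sign and absolute value determines X exactly:
both = [(x, ((1 if x > 0 else -1), abs(x))) for x in xs]
print(cond_entropy(both))   # 0.0 -> perfect reconstruction possible

# Observing the sign alone leaves one bit of uncertainty about X:
sign_only = [(x, (1 if x > 0 else -1)) for x in xs]
print(cond_entropy(sign_only))   # 1.0 -> reconstruction impossible
```

This numeric criterion (zero residual conditional entropy) is the information-theoretic counterpart of the geometric "not too far away in entropic distance" condition described above.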
English keywords
information lattice
common information
complementary information
Rajski distance
Shannon distance
dependency coefficient
relative redundancy
convex envelope
perfect reconstruction
Signal processing
Origin
Imported from HAL