Conference paper, Year: 2019

Improved Estimation of the Distance between Covariance Matrices

Abstract

A wide range of machine learning and signal processing applications involve data discrimination through covariance matrices. A broad family of metrics, among which the Frobenius, Fisher, and Bhattacharyya distances, as well as the Kullback-Leibler and Rényi divergences, are regularly exploited. Not being directly accessible, these metrics are usually assessed through empirical sample covariances. We show here that, for large dimensional data, these approximations lead to dramatically erroneous distance and divergence estimates. In this article, based on advanced random matrix considerations, we provide a novel and versatile consistent estimate of these covariance matrix distances and divergences. While theoretically developed for both large and numerous data, practical simulations demonstrate its large performance gains over the standard approach even for very small dimensions. Particular emphasis is placed on the Fisher information metric, and a concrete application to covariance-based spectral clustering is investigated.

Index Terms: covariance distance, random matrix theory, Fisher information metric.
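For illustration only, the Python sketch below shows the standard "plug-in" approach that the abstract describes as biased in high dimension: the Fisher (affine-invariant) distance between two covariance matrices, computed once from the true covariances and once from empirical sample covariances. The dimensions, covariance models, and function names are assumptions made here for the example; the paper's improved estimator is not reproduced.

```python
import numpy as np
from scipy.linalg import eigvalsh

def fisher_distance(C1, C2):
    """Fisher (affine-invariant) distance between SPD matrices:
    d(C1, C2) = sqrt(sum_i log^2 lambda_i), with lambda_i the
    eigenvalues of C1^{-1} C2 (generalized eigenvalues of (C2, C1))."""
    lam = eigvalsh(C2, C1)               # generalized eigenvalues
    return np.sqrt(np.sum(np.log(lam) ** 2))

# Naive plug-in estimate: replace the population covariances by
# empirical sample covariances (illustrative setup, not from the paper).
rng = np.random.default_rng(0)
p, n = 50, 200                           # dimension and sample size
C1 = np.eye(p)
C2 = np.diag(np.linspace(1.0, 3.0, p))
X1 = rng.multivariate_normal(np.zeros(p), C1, size=n)
X2 = rng.multivariate_normal(np.zeros(p), C2, size=n)
hatC1 = X1.T @ X1 / n                    # sample covariance estimates
hatC2 = X2.T @ X2 / n

print("true Fisher distance   :", fisher_distance(C1, C2))
print("plug-in (sample) value :", fisher_distance(hatC1, hatC2))
# When p is not negligible compared to n, the plug-in value is noticeably
# biased; this is the regime the paper's consistent estimator targets.
```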
Main file: couillet_Fisher_distance_ICASSP.pdf (293.93 KB)
Origin: files produced by the author(s)

Dates and versions

hal-02355321, version 1 (19-05-2020)

Identifiers

Cite

Malik Tiomoko, Romain Couillet, Eric Moisan, Steeve Zozor. Improved Estimation of the Distance between Covariance Matrices. ICASSP 2019 - IEEE International Conference on Acoustics, Speech and Signal Processing, May 2019, Brighton, United Kingdom. pp.7445-7449, ⟨10.1109/ICASSP.2019.8682621⟩. ⟨hal-02355321⟩
