Journal article in IEEE Transactions on Neural Networks and Learning Systems, 2018

Discriminative Transfer Learning Using Similarities and Dissimilarities

Abstract

Correctly estimating the discrepancy between two data distributions has always been an important task in machine learning. Cuturi recently proposed the Sinkhorn distance, which uses an approximate optimal transport cost between two distributions to measure their discrepancy. Although it has since been successfully adopted in various machine learning applications (e.g., in natural language processing and computer vision), the Sinkhorn distance suffers from two non-negligible limitations: first, it only approximates the true Wasserstein distance; second, the `divide by zero' problem often occurs during matrix scaling when the entropy regularization coefficient is set to a small value. In this paper, we introduce a new Brenier approach for calculating a more accurate Wasserstein distance between two discrete distributions. This approach avoids both limitations of the Sinkhorn distance and offers an alternative way to estimate distribution discrepancy.
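For context, below is a minimal NumPy sketch of Cuturi's standard Sinkhorn matrix-scaling iterations, i.e., the baseline the abstract contrasts with, not the Brenier approach proposed in the paper. The toy distributions and parameter values are illustrative assumptions; the second call shows how a very small entropy regularization coefficient makes the Gibbs kernel underflow, triggering the `divide by zero' problem mentioned above.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iters=200):
    """Entropy-regularized OT cost via Sinkhorn matrix scaling (Cuturi, 2013).

    a, b : source/target histograms (each summing to 1), shapes (n,) and (m,)
    C    : cost matrix, shape (n, m)
    eps  : entropy regularization coefficient; for tiny eps the kernel
           K = exp(-C/eps) underflows to zero and the scaling steps divide by zero.
    """
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)                # column scaling
        u = a / (K @ v)                  # row scaling; denominator -> 0 when eps is tiny
    P = u[:, None] * K * v[None, :]      # approximate transport plan
    return np.sum(P * C)                 # regularized (approximate) OT cost

# Toy example: two discrete distributions on the real line (assumed values).
x = np.linspace(0.0, 1.0, 5)
y = np.linspace(0.0, 1.0, 5) + 0.5
C = (x[:, None] - y[None, :]) ** 2       # squared-distance cost matrix
a = np.full(5, 0.2)
b = np.full(5, 0.2)

print(sinkhorn(a, b, C, eps=0.1))        # finite, but only approximates the Wasserstein cost
print(sinkhorn(a, b, C, eps=1e-4))       # kernel underflows -> NaN/inf (the 'divide by zero' issue)
```

Both limitations are visible here: even when the iterations converge, the returned value is the entropy-regularized cost rather than the exact Wasserstein distance, and shrinking eps to reduce that bias quickly breaks the matrix scaling numerically.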

Dates and versions

hal-02351587, version 1 (06-11-2019)

Identifiers

Cite

Ying Lu, Liming Chen, Alexandre Saidi, Emmanuel Dellandréa, Yunhong Wang. Discriminative Transfer Learning Using Similarities and Dissimilarities. IEEE Transactions on Neural Networks and Learning Systems, 2018, pp.1-14. ⟨10.1109/TNNLS.2017.2705760⟩. ⟨hal-02351587⟩