Virtual sample-based deep metric learning using discriminant analysis

Dae Ha Kim, Byung Cheol Song

Research output: Contribution to journal › Article › peer-review

12 Scopus citations

Abstract

Deep metric learning (DML) is designed to maximize the inter-class variance, i.e., the distance between embedding features belonging to different classes. Because conventional DML techniques either ignore the statistical characteristics of the embedding space or compute similarity using only the given features, they find it difficult to adaptively reflect the characteristics of the feature distribution during learning. This paper proposes a virtual metric loss (VML) that combines embedding features with virtual samples produced through linear discriminant analysis (LDA). This study is valuable in that it proposes a new metric that can learn the inter-class variance of embedding features by integrating discriminant analysis and metric learning, two techniques that share the common goal of inter-class variance analysis. In addition, we theoretically analyze the associated eigenvalue problem and the degree of stabilization in the embedding space. We verified the performance of the proposed VML through extensive experiments on large-scale and few-shot retrieval datasets. For example, on the CUB200-2011 dataset, VML achieved a recall rate about 0.7% higher than a state-of-the-art method. We also explored a new similarity measure based on virtual samples and adjusted the difficulty of embedding features, confirming the potential of virtual samples in various fields of pattern recognition.
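The abstract does not include reference code, so the following is only a minimal sketch of the general idea under stated assumptions: it estimates the within- and between-class scatter matrices, solves the LDA generalized eigenvalue problem S_b w = λ S_w w, synthesizes hypothetical "virtual samples" by shifting class means along the leading discriminant directions, and plugs them into a triplet-style margin loss. The helper names (make_virtual_samples, virtual_metric_loss), the virtual-sample recipe, and the loss form are illustrative assumptions, not the authors' exact VML.

```python
# Illustrative sketch only: hypothetical LDA-based virtual samples and a
# triplet-style "virtual metric loss". Not the paper's exact formulation.
import numpy as np
from scipy.linalg import eigh

def lda_directions(X, y):
    """Solve the generalized eigenproblem S_b w = lambda S_w w."""
    d = X.shape[1]
    mu = X.mean(axis=0)
    S_w = np.zeros((d, d))
    S_b = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mu_c = Xc.mean(axis=0)
        S_w += (Xc - mu_c).T @ (Xc - mu_c)
        diff = (mu_c - mu)[:, None]
        S_b += Xc.shape[0] * (diff @ diff.T)
    S_w += 1e-4 * np.eye(d)            # regularize so S_w is positive definite
    evals, evecs = eigh(S_b, S_w)      # eigenvalues in ascending order
    return evecs[:, ::-1]              # most discriminative direction first

def make_virtual_samples(X, y, n_dirs=2, step=1.0):
    """Hypothetical virtual samples: class means shifted along the
    leading LDA directions (an assumption, not the paper's recipe)."""
    W = lda_directions(X, y)[:, :n_dirs]
    virtual, v_labels = [], []
    for c in np.unique(y):
        mu_c = X[y == c].mean(axis=0)
        for k in range(W.shape[1]):
            virtual.append(mu_c + step * W[:, k])
            v_labels.append(c)
    return np.stack(virtual), np.array(v_labels)

def virtual_metric_loss(X, y, margin=0.2):
    """Triplet-style margin loss in which positives/negatives are virtual
    samples of the same/different class (illustrative loss form)."""
    V, vy = make_virtual_samples(X, y)
    loss = 0.0
    for i in range(len(X)):
        d_pos = np.min(np.linalg.norm(V[vy == y[i]] - X[i], axis=1))
        d_neg = np.min(np.linalg.norm(V[vy != y[i]] - X[i], axis=1))
        loss += max(0.0, d_pos - d_neg + margin)
    return loss / len(X)

# Toy usage: two well-separated classes of 3-D "embeddings".
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0, 1, (20, 3)), rng.normal(3, 1, (20, 3))])
y = np.array([0] * 20 + [1] * 20)
print(virtual_metric_loss(X, y))
```

In an actual DML pipeline this loss would be computed per mini-batch on the network's embedding outputs and differentiated end to end; the NumPy version above is only meant to make the scatter-matrix and virtual-sample mechanics concrete.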

Original language: English
Article number: 107643
Journal: Pattern Recognition
Volume: 110
DOIs
State: Published - Feb 2021

Bibliographical note

Publisher Copyright:
© 2020 Elsevier Ltd

Keywords

  • Deep metric learning
  • Linear discriminant analysis
  • Retrieval task
