Abstract

Existing maximum-margin support vector machines (SVMs) generate a hyperplane that produces the clearest separation between positive and negative feature vectors. These SVMs are effective when datasets are large. However, when few training samples are available, the hyperplane is easily influenced by outliers that are geometrically located in the opposite class. We propose a modified SVM that weights feature vectors to reflect the local density of support vectors and quantifies classification uncertainty in terms of the local classification capability of each training sample. We derive a primal formulation of an SVM that incorporates these modifications, and implement an RC-margin SVM of the simplest form. We evaluated our model on the recognition of handwritten numerals and obtained a higher recognition rate than a standard maximum-margin SVM, a weighted SVM, or an SVM that accounts for classification uncertainty.
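The weighting idea in the abstract can be illustrated with a minimal sketch. This is not the paper's RC-margin formulation; it only shows the general mechanism of assigning per-sample weights from local density and passing them to a standard soft-margin SVM. The `density_weights` helper, the k-nearest-neighbour density estimate, and the synthetic two-Gaussian data are all assumptions for illustration, using scikit-learn's `SVC`, which accepts per-sample weights via `sample_weight`.

```python
# Hypothetical sketch of density-based sample weighting for an SVM.
# NOT the paper's method: the weighting rule here (inverse mean k-NN
# distance) is an illustrative stand-in for the local-density weights
# described in the abstract.
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import NearestNeighbors

def density_weights(X, k=5):
    """Weight each sample by the inverse of the mean distance to its k
    nearest neighbours, so samples in dense regions get larger weights
    and isolated outliers get smaller ones."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    dist, _ = nn.kneighbors(X)            # dist[:, 0] is the sample itself
    mean_dist = dist[:, 1:].mean(axis=1)  # drop self-distance column
    return 1.0 / (mean_dist + 1e-8)

# Synthetic two-class data: two well-separated Gaussian clusters.
rng = np.random.default_rng(0)
X_pos = rng.normal(loc=1.0, scale=0.5, size=(30, 2))
X_neg = rng.normal(loc=-1.0, scale=0.5, size=(30, 2))
X = np.vstack([X_pos, X_neg])
y = np.array([1] * 30 + [-1] * 30)

# Down-weighting low-density samples reduces the pull of outliers
# on the separating hyperplane.
w = density_weights(X)
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y, sample_weight=w)
acc = clf.score(X, y)
```

In this sketch an outlier lying deep inside the opposite class would sit far from its same-class neighbours, receive a small weight, and therefore contribute little to the margin, which is the effect the abstract describes for small training sets.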

Original language: English
Title of host publication: 2016 IEEE International Conference on Image Processing, ICIP 2016 - Proceedings
Publisher: IEEE Computer Society
Pages: 4438-4442
Number of pages: 5
ISBN (Electronic): 9781467399616
DOIs
State: Published - 3 Aug 2016
Event: 23rd IEEE International Conference on Image Processing, ICIP 2016 - Phoenix, United States
Duration: 25 Sep 2016 – 28 Sep 2016

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
Volume: 2016-August
ISSN (Print): 1522-4880

Conference

Conference: 23rd IEEE International Conference on Image Processing, ICIP 2016
Country/Territory: United States
City: Phoenix
Period: 25/09/16 – 28/09/16

Bibliographical note

Publisher Copyright:
© 2016 IEEE.

Keywords

  • Classification uncertainty
  • Reduced convex hulls
  • Weighted SVM
