Abstract
Existing maximum-margin support vector machines (SVMs) generate a hyperplane that produces the clearest separation between positive and negative feature vectors. These SVMs are effective when datasets are large; however, when few training samples are available, the hyperplane is easily influenced by outliers that lie geometrically within the opposite class. We propose a modified SVM that weights feature vectors to reflect the local density of support vectors and quantifies classification uncertainty in terms of the local classification capability of each training sample. We derive a primal formulation of an SVM incorporating these modifications and implement an RC-margin SVM of the simplest form. We evaluated our model on the recognition of handwritten numerals and obtained a higher recognition rate than a standard maximum-margin SVM, a weighted SVM, or an SVM that accounts for classification uncertainty.
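The paper's RC-margin formulation is not reproduced here, but the core idea of down-weighting isolated samples can be illustrated with a minimal sketch: the `local_density_weights` helper below is a hypothetical stand-in that weights each training sample by the inverse of its mean distance to its k nearest neighbours, and the weights are passed to scikit-learn's `SVC` via `sample_weight`. This is an assumption-laden illustration, not the authors' method.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC

def local_density_weights(X, k=5):
    """Hypothetical density weighting (not the paper's scheme): samples in
    dense regions get larger weights, isolated points (likely outliers
    sitting near the opposite class) get smaller weights."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    dist, _ = nn.kneighbors(X)
    mean_dist = dist[:, 1:].mean(axis=1)  # drop column 0 (self, distance 0)
    return 1.0 / (mean_dist + 1e-12)

# Two well-separated Gaussian clusters as toy training data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.5, (40, 2)), rng.normal(2.0, 0.5, (40, 2))])
y = np.array([0] * 40 + [1] * 40)

w = local_density_weights(X)
# sample_weight scales each sample's slack penalty in the soft-margin SVM.
clf = SVC(kernel="linear").fit(X, y, sample_weight=w)
preds = clf.predict([[-2.0, -2.0], [2.0, 2.0]])
print(preds)
```

With a linear kernel, scikit-learn's `sample_weight` multiplies each sample's contribution to the hinge-loss term, so low-weight outliers have less pull on the separating hyperplane.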
| Field | Value |
|---|---|
| Original language | English |
| Title of host publication | 2016 IEEE International Conference on Image Processing, ICIP 2016 - Proceedings |
| Publisher | IEEE Computer Society |
| Pages | 4438-4442 |
| Number of pages | 5 |
| ISBN (Electronic) | 9781467399616 |
| DOIs | |
| State | Published - 3 Aug 2016 |
| Event | 23rd IEEE International Conference on Image Processing, ICIP 2016 - Phoenix, United States. Duration: 25 Sep 2016 → 28 Sep 2016 |
Publication series
| Field | Value |
|---|---|
| Name | Proceedings - International Conference on Image Processing, ICIP |
| Volume | 2016-August |
| ISSN (Print) | 1522-4880 |
Conference
| Field | Value |
|---|---|
| Conference | 23rd IEEE International Conference on Image Processing, ICIP 2016 |
| Country/Territory | United States |
| City | Phoenix |
| Period | 25/09/16 → 28/09/16 |
Bibliographical note
Publisher Copyright: © 2016 IEEE.
Keywords
- Classification uncertainty
- Reduced convex hulls
- Weighted SVM