RESEARCH ARTICLE


Margin Based Dimensionality Reduction and Generalization



Jing Peng *,1, Stefan Robila 1, Wei Fan 2, Guna Seetharaman 3
1 Computer Science Department, Montclair State University, Montclair, NJ 07043, USA
2 IBM T.J. Watson Research, Hawthorne, NY 10532, USA
3 Computing Technology Applications Branch, Air Force Research Laboratory, Ohio, USA


© 2017 Peng et al.

open-access license: This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 International Public License (CC-BY 4.0), a copy of which is available at: https://creativecommons.org/licenses/by/4.0/legalcode. This license permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

* Address correspondence to this author at the Computer Science Department, Montclair State University, Montclair, NJ 07043, USA; Tel: 973-655-7975; Fax: 973-655-4164; E-mail: peng@pegasus.montclair.edu


Abstract

Linear discriminant analysis (LDA) for dimensionality reduction has been applied to a wide variety of problems such as face recognition. However, it suffers from a major computational difficulty when the number of dimensions exceeds the sample size. In this paper, we propose a margin-based criterion for linear dimensionality reduction that addresses this problem. We establish an error bound for the proposed technique by showing its relation to least squares regression. In addition, the proposed criterion can be optimized with well-established numerical procedures such as semi-definite programming. We demonstrate the efficacy of our proposal and compare it against competing techniques on a number of examples.
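As a rough illustration only (not the paper's exact formulation), the following sketch shows one well-known margin-style criterion for linear dimension reduction, maximizing tr(W^T (S_b - S_w) W) over orthonormal projections W. Unlike classical LDA, it requires no inverse of the within-class scatter matrix, so it remains well defined when the dimensionality exceeds the sample size; the function name and interface are illustrative assumptions.

```python
# Minimal sketch of a margin-style linear dimension reduction criterion:
# maximize tr(W^T (S_b - S_w) W). This is a generic illustration, not the
# authors' proposed method.
import numpy as np

def margin_projection(X, y, k):
    """Return a d x k projection maximizing tr(W^T (S_b - S_w) W)."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    S_b = np.zeros((d, d))   # between-class scatter
    S_w = np.zeros((d, d))   # within-class scatter
    for c in classes:
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        diff = (mean_c - mean_all).reshape(-1, 1)
        S_b += Xc.shape[0] * diff @ diff.T
        S_w += (Xc - mean_c).T @ (Xc - mean_c)
    # Top-k eigenvectors of the symmetric matrix (S_b - S_w);
    # no matrix inversion is needed, unlike S_w^{-1} S_b in classical LDA.
    vals, vecs = np.linalg.eigh(S_b - S_w)
    return vecs[:, np.argsort(vals)[::-1][:k]]

# Usage: W = margin_projection(X_train, y_train, k=2); Z = X_train @ W
```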

Keywords: Dimensionality reduction, Linear discriminant analysis, Margin criterion, Semi-definite programming, Small sample size problem.