Random Forests and Regularized Least Squares Classifiers

Kari Torkkola, Motorola Labs, torkkola@motorola.com
Eugene Tuv, Intel, eugene.tuv@intel.com

Recent work by Poggio, Smale, et al. has shown that the Regularized Least Squares Classifier (RLSC), an old combination of the square loss function with regularization in a reproducing kernel Hilbert space, is still a viable option for binary classification problems. We apply RLSC with Gaussian kernels to the datasets of the NIPS 2003 feature selection challenge. Since RLSC is somewhat sensitive to noise variables, variable selection is applied first: a Random Forest (RF) is trained on the classification task, and an importance measure for each variable is derived from the forest. RLSC then uses only the best-ranked variables. Although regularization and bagging have been shown to be equivalent, we found that stochastic ensembles of least squares classifiers generally outperform any single RLSC. This combination produced competitive results in the NIPS 2003 feature selection challenge.
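As a rough illustration of the pipeline described above (a minimal sketch, not the authors' implementation), the following Python code ranks variables with a Random Forest importance measure, fits an RLSC with a Gaussian kernel on the top-ranked variables, and averages a stochastic ensemble of such classifiers. It assumes scikit-learn's RandomForestClassifier and rbf_kernel; the kernel width sigma, regularization parameter lam, ensemble settings, and the cutoff of 200 variables are hypothetical choices, not values from the paper.

```python
# Hypothetical sketch of the RF-ranking + RLSC pipeline (not the authors' code).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics.pairwise import rbf_kernel

def rf_rank_variables(X, y, n_trees=100, random_state=0):
    """Rank variables by Random Forest impurity-based importance, best first."""
    rf = RandomForestClassifier(n_estimators=n_trees, random_state=random_state)
    rf.fit(X, y)
    return np.argsort(rf.feature_importances_)[::-1]

def rlsc_fit(X, y, sigma=1.0, lam=1e-3):
    """Fit an RLSC: solve (K + lam*n*I) c = y for a Gaussian kernel K."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma=1.0 / (2.0 * sigma**2))
    return np.linalg.solve(K + lam * n * np.eye(n), y.astype(float))

def rlsc_predict(X_train, X_test, c, sigma=1.0):
    """Predict labels in {-1, +1} as the sign of the kernel expansion."""
    K = rbf_kernel(X_test, X_train, gamma=1.0 / (2.0 * sigma**2))
    return np.sign(K @ c)

def rlsc_ensemble_predict(X_train, y_train, X_test, n_models=25, frac=0.5,
                          sigma=1.0, lam=1e-3, seed=0):
    """Stochastic ensemble: average RLSCs trained on random subsamples."""
    rng = np.random.default_rng(seed)
    n = X_train.shape[0]
    votes = np.zeros(X_test.shape[0])
    for _ in range(n_models):
        idx = rng.choice(n, size=int(frac * n), replace=False)
        c = rlsc_fit(X_train[idx], y_train[idx], sigma, lam)
        votes += rlsc_predict(X_train[idx], X_test, c, sigma)
    return np.sign(votes)

# Usage (labels y coded as -1/+1; keep only the top-k ranked variables):
# ranking = rf_rank_variables(X_train, y_train)
# top = ranking[:200]  # hypothetical cutoff
# y_pred = rlsc_ensemble_predict(X_train[:, top], y_train, X_test[:, top])
```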