Boxiang Wang (University of Iowa)

Friday, January 11, 2019, 15:30 to 16:30
Burnside Hall Room 1104, 805 rue Sherbrooke Ouest, Montreal, QC, H3A 0B9, CA

Title: Magic Cross-Validation Theory for Large-Margin Classification

Abstract:

Cross-validation (CV) is perhaps the most widely used tool for tuning supervised machine learning algorithms to achieve a better generalization error rate. In this paper, we focus on leave-one-out cross-validation (LOOCV) for the support vector machine (SVM) and related algorithms. We first address two widespread misconceptions about LOOCV. We show that LOOCV, ten-fold CV, and five-fold CV are in fact well-matched in estimating the generalization error, and that LOOCV is not necessarily slower to compute than ten-fold or five-fold CV. We further present a magic CV theory with a surprisingly simple recipe that allows users to tune the SVM very efficiently. We then apply the magic CV theory to demonstrate a straightforward way to prove the Bayes risk consistency of the SVM. We have implemented our algorithms in a publicly available R package, magicsvm, which is much faster than state-of-the-art SVM solvers. We demonstrate our methods through extensive simulations and benchmark examples.
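For intuition, the naive LOOCV recipe that the magic CV theory accelerates can be written in a few lines of R. The sketch below uses the general-purpose e1071 solver on simulated data; it illustrates plain leave-one-out tuning only and is not the magicsvm API, whose function names are not given in this announcement.

library(e1071)

## Simulated two-class data (illustrative only).
set.seed(1)
n <- 100; p <- 2
x <- matrix(rnorm(n * p), n, p)
y <- factor(ifelse(x[, 1] + x[, 2] + rnorm(n, sd = 0.5) > 0, 1, -1))

## Naive LOOCV: refit the SVM n times, holding out one observation each time.
loocv_error <- function(cost) {
  errs <- vapply(seq_len(n), function(i) {
    fit  <- svm(x[-i, , drop = FALSE], y[-i], kernel = "linear", cost = cost)
    pred <- predict(fit, x[i, , drop = FALSE])
    as.numeric(pred != y[i])          # 1 if the held-out point is misclassified
  }, numeric(1))
  mean(errs)                          # leave-one-out estimate of the generalization error
}

## Tune the cost parameter over a small grid.
costs     <- 2^(-3:3)
cv_errors <- sapply(costs, loocv_error)
best_cost <- costs[which.min(cv_errors)]

This naive recipe refits the model n times for every candidate cost; the appeal of the magic CV theory described in the abstract is that it delivers the leave-one-out estimates far more cheaply.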

Speaker

Boxiang Wang is an Assistant Professor in the Department of Statistics and Actuarial Science at the University of Iowa. His research interests lie in statistical learning, statistical computing, high-dimensional data analysis, and optimal design.
