BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//132.216.177.160//NONSGML kigkonsult.se iCalcreator 2.18//
BEGIN:VEVENT
UID:20190422T224356EDT-93521fEJvC@132.216.177.160
DTSTAMP:20190423T024356Z
DESCRIPTION:Title: Magic Cross-Validation Theory for Large-Margin Classific
ation\n\n\n Abstract:\n\n\nCross-validation (CV) is perhaps the most widely
 used tool for tuning supervised machine learning algorithms in order to ac
 hieve a better generalization error rate. In this paper\, we focus on leave
-one-out cross-validation (LOOCV) for the support vector machine (SVM) and
 related algorithms. We first address two widespread misconceptions about L
 OOCV. We show that LOOCV\, ten-fold\, and five-fold CV are actually well
 -matched in estimating the generalization error\, and computing LOOCV is n
 ot necessarily slower than computing ten-fold or five-fold C
V. We further present a magic CV theory with a surprisingly simple recipe
which allows users to very efficiently tune the SVM. We then apply the mag
ic CV theory to demonstrate a straightforward way to prove the Bayes risk
consistency of the SVM. We have implemented our algorithms in a publicly a
vailable R package magicsvm\, which is much faster than the state-of-the-a
rt SVM solvers. We demonstrate our methods on extensive simulations and be
nchmark examples.\n\n\n Speaker\n\n\nBoxiang Wang is an Assistant Professor
 in the Department of Statistics and Actuarial Science at the University o
 f Iowa. His research interests lie in statistical learning\, statistical c
 omputing\, high-dimensional data analysis\, and optimal design.\n
DTSTART:20190111T203000Z
DTEND:20190111T213000Z
LOCATION:Room 1104\, Burnside Hall\, 805 rue Sherbrooke Ouest\, Montreal\, Q
 C\, H3A 0B9\, CA
SUMMARY:Boxiang Wang (University of Iowa)
URL:https://mcgill.ca/channels/channels/event/boxiang-wang-university-iowa-
293009
END:VEVENT
END:VCALENDAR