Notes on Support Vector Machines
Abstract
The most basic idea of support vector machines is to find a line (or hyperplane) that separates classes of data and to use that line to classify new examples. We don’t want just any separating line, however; we want the one that maximizes the distance between the classes (the best line). This optimal line turns out to be a separating hyperplane that is equidistant from the supporting hyperplanes that ‘support’ the sets making up each distinct class. The notes that follow discuss the concepts of supporting and separating hyperplanes and inner products as they relate to support vector machines (SVMs). Using simple examples, considerable detail is given to the mathematical notation used to represent hyperplanes, as well as to how SVM classification works.
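The geometric picture in the abstract can be sketched in a few lines of code. This is a minimal illustration, not the notes’ own example: the two support vectors, the helper names, and the points classified below are all hypothetical, chosen so the separating hyperplane is easy to verify by hand.

```python
# A minimal sketch of maximum-margin classification in 2-D, assuming one
# hand-picked support vector per class. All names and points here are
# illustrative assumptions, not taken from the notes themselves.

def dot(u, v):
    """Inner product of two 2-D points."""
    return u[0] * v[0] + u[1] * v[1]

# Hypothetical support vectors: one from the +1 class, one from the -1 class.
sv_pos = (2.0, 2.0)
sv_neg = (0.0, 0.0)

# The normal vector w points from the -1 support vector toward the +1 one,
# so the hyperplane is perpendicular to the segment joining them.
w = (sv_pos[0] - sv_neg[0], sv_pos[1] - sv_neg[1])

# The separating hyperplane w.x + b = 0 passes through the midpoint of that
# segment, making it equidistant from the two supporting hyperplanes.
midpoint = ((sv_pos[0] + sv_neg[0]) / 2, (sv_pos[1] + sv_neg[1]) / 2)
b = -dot(w, midpoint)

def classify(x):
    """Label a new point by which side of w.x + b = 0 it falls on."""
    return 1 if dot(w, x) + b > 0 else -1

print(classify((3.0, 1.0)))  # lands on the +1 side of the hyperplane
print(classify((0.0, 1.0)))  # lands on the -1 side of the hyperplane
```

With these points, w = (2, 2) and b = -4, so the decision rule reduces to the sign of 2x₁ + 2x₂ - 4: the inner product with w is doing all the work, which is why inner products feature so centrally in the notes.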
Suggested Citation
Matt Bogard. 2012. "Notes on Support Vector Machines" The Selected Works of Matt Bogard
Available at: http://works.bepress.com/matt_bogard/20