Monday, May 28, 2012

Notes to 'Support' an Understanding of Support Vector Machines

Some time ago I wrote a short article highlighting the similarities among tools used in economics, bioinformatics, and machine learning (Mathematical Themes in Economics, Machine Learning, and Bioinformatics). In this followup I expand on the details 'supporting' support vector machines. While not appropriate for making the kinds of inferences econometricians are so often fond of, they are powerful tools for classification, and they are certainly an artifact of the data mining culture. While I don't cover all of the details (for instance, I don't discuss kernel methods or duality), I cover some of the basic concepts that are likely to trip up someone new to SVMs, such as how a line in R2, or a hyperplane in higher dimensions, can be expressed using inner products as in <w,x> + b = 0.
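To make the inner product notation concrete, here is a minimal sketch in Python. The particular line (2x1 + 3x2 = 6, i.e. w = (2, 3), b = -6) is a made-up example, not one from the notes; the point is just that any line in R2 can be written as <w,x> + b = 0, and the sign of <w,x> + b tells us which side of the line a point falls on.

```python
# Hypothetical example: the line 2*x1 + 3*x2 = 6 written in the
# inner product form <w, x> + b = 0, with w = (2, 3) and b = -6.

def inner(w, x):
    """Inner product <w, x> of two equal-length vectors."""
    return sum(wi * xi for wi, xi in zip(w, x))

w = (2.0, 3.0)
b = -6.0

# A point on the line satisfies <w, x> + b = 0 exactly:
on_line = (3.0, 0.0)            # 2*3 + 3*0 - 6 = 0
print(inner(w, on_line) + b)    # 0.0

# Points off the line give a nonzero value whose sign indicates
# which side of the line the point lies on:
above = (3.0, 1.0)              # 2*3 + 3*1 - 6 = 3  (> 0)
below = (0.0, 0.0)              # 2*0 + 3*0 - 6 = -6 (< 0)
print(inner(w, above) + b)      # 3.0
print(inner(w, below) + b)      # -6.0
```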

Notes on Support Vector Machines

Matt Bogard, Western Kentucky University



The most basic idea of support vector machines is to find a line (or hyperplane) that separates classes of data, and use this information to classify new examples. But we don't want just any line; we want the line that maximizes the distance between the classes (the best line). This line turns out to be a separating hyperplane that is equidistant between the supporting hyperplanes that 'support' the sets making up each distinct class. The notes that follow discuss the concepts of supporting and separating hyperplanes and inner products as they relate to support vector machines (SVMs). Using simple examples, particular attention is given to the mathematical notation used to represent hyperplanes, as well as to how SVM classification works.
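The classification step described above can be sketched in a few lines of Python. This is a hedged illustration, not code from the notes: it assumes a trained separating hyperplane (w, b) is already given (the numbers below are made up), and it classifies a new point by the sign of <w,x> + b. It also computes the perpendicular distance |<w,x> + b| / ||w||, which is the quantity the maximum-margin line maximizes for the closest (support) points.

```python
import math

# Minimal sketch of SVM-style classification, assuming a trained
# hyperplane (w, b) is already available. The values below are
# hypothetical, chosen so the hyperplane is the line x1 = x2.

def inner(w, x):
    """Inner product <w, x> of two equal-length vectors."""
    return sum(wi * xi for wi, xi in zip(w, x))

def classify(w, b, x):
    """Assign +1 or -1 by which side of <w, x> + b = 0 the point x falls on."""
    return 1 if inner(w, x) + b >= 0 else -1

def distance_to_hyperplane(w, b, x):
    """Perpendicular distance from x to the hyperplane <w, x> + b = 0."""
    return abs(inner(w, x) + b) / math.sqrt(inner(w, w))

w, b = (1.0, -1.0), 0.0          # hypothetical hyperplane: x1 - x2 = 0

print(classify(w, b, (2.0, 1.0)))            # 1  (x1 > x2 side)
print(classify(w, b, (1.0, 2.0)))            # -1 (x1 < x2 side)
print(distance_to_hyperplane(w, b, (2.0, 1.0)))  # 1/sqrt(2) ~ 0.707
```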

Suggested Citation


Matt Bogard. 2012. "Notes on Support Vector Machines." The Selected Works of Matt Bogard.
Available at:
