Volume 10, Issue 1
A Flexible Implementation for Support Vector Machines
Regression Analysis with SVMs

So far we have considered SVMs as a tool for pattern recognition only. It is also possible to use the SVM framework for regression problems. Consider a function to be approximated; for example, a quadratic.
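As a stand-in for the article's (omitted) input cell, one might generate noisy samples of a quadratic like this; the variable names here are our own illustration, not from the package:

```mathematica
(* Noisy samples of a quadratic target function *)
SeedRandom[42];
xdata = Range[-2., 2., 0.1];
ydata = Map[#^2 - 1 + RandomReal[{-0.2, 0.2}] &, xdata];
data = Transpose[{xdata, ydata}];
ListPlot[data]  (* visualize the raw samples *)
```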
We can adapt the SVM method to the regression setting by using an ε-insensitive loss function,

    L_ε(y, f(x)) = max(0, |y − f(x)| − ε),

where f(x) is the SVM approximation to the regression function y(x). This loss function determines how much a deviation of f(x) from the true y is penalized; for deviations less than ε, no penalty is incurred. Plotted against the deviation y − f(x), the loss is zero on the interval (−ε, ε) and grows linearly outside it.
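The loss itself is easy to define and plot directly; this snippet is our own illustration rather than part of the MathSVM package:

```mathematica
(* epsilon-insensitive loss as a function of the deviation d = y - f(x) *)
epsInsensitiveLoss[d_, eps_] := Max[0, Abs[d] - eps]

Plot[epsInsensitiveLoss[d, 0.5], {d, -2, 2},
  AxesLabel -> {"y - f(x)", "loss"}]
```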
Using this idea, the regression problem is transformed into a classification problem: any data point (x, y) such that |y − f(x)| ≤ ε may be considered "correctly classified." MathSVM solves such problems using the RegressionSVM function, parameterized by ε and a penalty constant C. Here we again try a polynomial kernel.
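The text states only that RegressionSVM is parameterized by ε and C; the exact call signature and kernel option below are a guess, so consult the MathSVM documentation for the actual interface:

```mathematica
(* Assumed signature -- argument order and the kernel option are our guess *)
eps = 0.2;
c = 10.;
svm = RegressionSVM[data, eps, c,
  KernelFunction -> PolynomialKernel[2]]  (* degree-2 polynomial kernel *)
```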
The function RegressionSVMPlot provides convenient plotting of the resulting regression function. As with SVMPlot, the kernel type used is supplied as a parameter. Note how support vectors in this case are chosen as the data points that are furthest away from the regression line.
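A sketch of such a plotting call, again with an assumed argument order:

```mathematica
(* Assumed usage: the fitted model, the data, and the kernel used in training *)
RegressionSVMPlot[svm, data,
  KernelFunction -> PolynomialKernel[2]]
```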
We can, of course, also obtain the analytical expression of the estimated regression function.
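Although the article's output cell is not reproduced here, the estimated function in ε-insensitive SVM regression generally takes the standard kernel-expansion form

    f(x) = Σ_i (α_i − α_i*) K(x_i, x) + b,

where the sum runs over the support vectors x_i, the α_i and α_i* are the dual coefficients, and K is the chosen kernel (here, a polynomial).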
Two-Dimensional Example

We can use SVM regression with domains of any dimension (that is the main advantage). Here is a simple two-dimensional example.
Here is the regression function.
There are no specialized 3D plots for regression in the MathSVM package. Here is the usual Plot3D visualization.
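For a two-dimensional input domain, the fitted surface can be shown with the built-in Plot3D; the function f below is a placeholder for the estimated regression function, however it is extracted from the package:

```mathematica
(* f[x1, x2] stands in for the SVM regression estimate *)
f[x1_, x2_] := x1^2 + x2^2  (* placeholder; substitute the fitted function *)

Plot3D[f[x1, x2], {x1, -2, 2}, {x2, -2, 2},
  AxesLabel -> {"x1", "x2", "f"}]
```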

