On a Kernel-Based Method for Pattern Recognition, Regression, Approximation, and Operator Inversion |
| |
Authors: | A. J. Smola, B. Schölkopf |
| |
Affiliation: | (1) GMD FIRST, Rudower Chaussee 5, 12489 Berlin, Germany. smola@first.gmd.de; (2) Max Planck Institut für biologische Kybernetik, Spemannstrasse 38, 72076 Tübingen, Germany. bs@mpik-tueb.mpg.de |
| |
Abstract: | We present a kernel-based framework for pattern recognition, regression estimation, function approximation, and multiple
operator inversion. Adopting a regularization-theoretic framework, the above are formulated as constrained optimization problems.
Previous approaches such as ridge regression, support vector methods, and regularization networks are included as special
cases. We show connections between the cost function and properties previously believed to be exclusive to support vector machines.
For appropriately chosen cost functions, the optimal solution of all the problems described above can be found by solving
a simple quadratic programming problem.
Received January 31, 1997; revised June 1, 1997, and July 7, 1997. |
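The abstract notes that ridge regression arises as a special case of the framework. As an illustration of that special case (not the paper's general QP formulation), the following is a minimal sketch of kernel ridge regression with a Gaussian RBF kernel; all function names and parameter values here are illustrative assumptions, not from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix: K[i, j] = exp(-gamma * ||x_i - y_j||^2)
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def kernel_ridge_fit(X, y, lam=1e-2, gamma=1.0):
    # Regularized least squares in the kernel-induced feature space:
    # solve (K + lam * n * I) alpha = y for the expansion coefficients.
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def kernel_ridge_predict(X_train, alpha, X_test, gamma=1.0):
    # Prediction is a kernel expansion over the training points.
    return rbf_kernel(X_test, X_train, gamma) @ alpha
```

With a squared-error cost this linear system replaces the quadratic program; other cost functions (e.g. the epsilon-insensitive loss of support vector regression) lead to the QP formulation the abstract refers to.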
| |
Keywords: | Kernels, Support vector machines, Regularization, Inverse problems, Regression, Pattern recognition |
|