Looking for an English paper on SVM
Posted by a user
Posted: 2022-05-24 14:47
Answered by a user
Time: 2023-10-17 14:23
The main idea of SVM can be summarized in two points: (1) it analyzes the linearly separable case; for linearly non-separable data, a nonlinear mapping transforms the samples from the low-dimensional input space, where they cannot be separated linearly, into a high-dimensional feature space where they become linearly separable, which makes it possible to apply a linear algorithm in that feature space to analyze the samples' nonlinear characteristics; (2) building on the theory of structural risk minimization, it constructs the optimal separating hyperplane in the feature space, so that the learner is globally optimal and the expected risk over the whole sample space satisfies a certain upper bound with some probability.
When studying this method, first understand how it frames the problem, which means starting from the simplest case, the linearly separable one; do not rush into more complicated settings such as linear non-separability before the basic principle is clear. Designing a support vector machine requires solving a constrained extremum problem, and hence Lagrange multiplier theory. For most people, the familiar setting is constraints expressed as equalities, whereas here the constraints that must be satisfied are inequalities; for this purpose it is enough to know the relevant conclusions of Lagrangian theory.
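The constrained extremum problem mentioned above is usually solved through its Lagrangian dual, but the soft-margin linear SVM can also be sketched directly by sub-gradient descent on the hinge loss. The following is a minimal illustrative sketch in pure Python, not taken from any of the packages discussed here; the function names, hyperparameters, and toy data are invented for this example.

```python
# Minimal sketch (illustrative only): linear soft-margin SVM trained in the
# primal by batch sub-gradient descent on  lam/2*||w||^2 + mean hinge loss.

def svm_train(points, labels, lam=0.01, lr=0.1, epochs=200):
    """Fit w, b for 2-D inputs; labels are +1 / -1."""
    w = [0.0, 0.0]
    b = 0.0
    n = len(points)
    for _ in range(epochs):
        # gradient of the regularizer
        gw = [lam * w[0], lam * w[1]]
        gb = 0.0
        for (x1, x2), y in zip(points, labels):
            margin = y * (w[0] * x1 + w[1] * x2 + b)
            if margin < 1:  # hinge loss is active inside the margin
                gw[0] -= y * x1 / n
                gw[1] -= y * x2 / n
                gb -= y / n
        w[0] -= lr * gw[0]
        w[1] -= lr * gw[1]
        b -= lr * gb
    return w, b

def svm_predict(w, b, x):
    """Sign of the decision function w.x + b."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1

# Linearly separable toy data: class +1 upper-right, class -1 lower-left.
pts = [(2, 2), (3, 3), (2, 3), (-2, -2), (-3, -3), (-2, -3)]
ys = [1, 1, 1, -1, -1, -1]
w, b = svm_train(pts, ys)
print([svm_predict(w, b, p) for p in pts])  # recovers the training labels
```

Real implementations such as LIBSVM or SVMlight instead solve the dual quadratic program, which is what makes kernels (the nonlinear mapping described above) possible.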
Major software packages:
Lush -- a Lisp-like interpreted/compiled language with C/C++/Fortran interfaces and packages that interface to a number of different SVM implementations. Interfaces to LASVM, LIBSVM, mySVM, SVQP, and SVQP2 (SVQP3 in future) are available. These can be combined with Lush's other interfaces for machine learning, hidden Markov models, numerical libraries (LAPACK, BLAS, GSL), and its built-in vector/matrix/tensor engine.
SVMlight -- a popular implementation of the SVM algorithm by Thorsten Joachims; it can be used to solve classification, regression and ranking problems.
LIBSVM -- A Library for Support Vector Machines, Chih-Chung Chang and Chih-Jen Lin
YALE -- a powerful machine learning toolbox containing wrappers for SVMLight, LibSVM, and MySVM in addition to many evaluation and preprocessing methods.
LS-SVMLab - Matlab/C SVM toolbox - well-documented, many features
Gist -- implementation of the SVM algorithm with feature selection.
Weka -- a machine learning toolkit that includes an implementation of an SVM classifier; Weka can be used either interactively through a graphical interface or as a software library. (One of them is called "SMO". In the GUI Weka explorer, it is under the "Classify" tab if you "Choose" an algorithm.)
OSU SVM - Matlab implementation based on LIBSVM
Torch - C++ machine learning library with SVM
Shogun - Large Scale Machine Learning Toolbox with interfaces to Octave, Matlab, Python, R
Spider - Machine learning library for Matlab
kernlab - Kernel-based Machine Learning library for R
e1071 - Machine learning library for R
SimpleSVM - SimpleSVM toolbox for Matlab
SVM and Kernel Methods Matlab Toolbox
PCP -- C program for supervised pattern classification. Includes LIBSVM wrapper.
TinySVM -- a small SVM implementation, written in C++
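As a practical note, several of the packages above (SVMlight, LIBSVM, TinySVM) read training data in the same sparse text format: one example per line, the class label followed by index:value pairs, with zero-valued features omitted. The feature values below are made up for illustration:

```
+1 1:0.708 3:1.0 9:0.5
-1 2:0.583 4:1.0
```

This shared format makes it easy to benchmark the same dataset across those implementations.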