Journal Radioengineering №2, 2013
Article in issue:
Selection of kernel parameters and method parameters for nonlinear classifiers
Authors:
L.I. Dvoiris, V.A. Geraschenkov
Abstract:
The article presents the results of theoretical investigations of the generalization capability of learning algorithms and, in particular, of support vector machines. It considers the possibility of choosing the method parameters and the kernel parameters for signal features in the time and frequency domains. A procedure for determining the optimal values of these parameters is described: a search for the minimum error of the nonlinear classifier over a grid of parameter values ranging from 2^-11 to 2^11 in power-of-2 increments for the radial basis function (RBF) kernel. Five nonlinear SVM algorithms were selected for the numerical experiments: the classical SVM, the Lagrangian SVM, the smoothed SVM, the proximal SVM and the potential SVM. The results of the search for the optimal parameter values are given in the article. By adjusting the appropriate kernel parameters and method parameters for each classification algorithm, the errors of the first and second kind can be minimized. The computational experiments were carried out in the MATLAB environment.
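The parameter search described in the abstract amounts to evaluating the classifier error over a 23 x 23 grid of power-of-2 values for the method parameter and the kernel parameter. The sketch below illustrates such a search for the classical SVM with an RBF kernel only; it is written in Python with scikit-learn on a synthetic dataset purely for illustration (an assumption), since the article's own experiments were performed in MATLAB on signal features in the time and frequency domains.

# Illustrative power-of-2 grid search for an RBF-kernel SVM, in the spirit of
# the procedure described in the abstract. scikit-learn and the synthetic
# dataset are assumptions; the article's experiments were run in MATLAB.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic two-class data standing in for the article's signal features
# (hypothetical placeholder).
X, y = make_classification(n_samples=400, n_features=10, random_state=0)

# Candidate values 2^-11, 2^-10, ..., 2^11 for the regularization parameter C
# (method parameter) and the RBF width gamma (kernel parameter).
exponents = np.arange(-11, 12)
param_grid = {"C": 2.0 ** exponents, "gamma": 2.0 ** exponents}

# Cross-validated search for the (C, gamma) pair that minimizes the
# classification error of the classical SVM.
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5, scoring="accuracy")
search.fit(X, y)

print("best C, gamma:", search.best_params_)
print("cross-validated error:", 1.0 - search.best_score_)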
Pages: 83-86
References
  1. Vapnik V. Statistical Learning Theory. John Wiley and Sons. New York. 1998.
  2. Vapnik V., Chapelle O. Bounds on error expectation for support vector machines // Neural Computation. 2000. V. 12. № 9. P. 2013-2036.
  3. Machine learning algorithms in detection and recognition problems: research report, code «OBRAZ-2». Vol. 2 / scientific supervisor L.I. Dvojjris; responsible executor V.A. Gerashhenkov. Kaliningrad. 2010. 143 p.
  4. A Novel SVM Algorithm and Experiment // International Conference on Computer Science and Electronics Engineering. Hangzhou, Zhejiang, China. March 23-25, 2012.