
Related knowledge of libsvm support vector machine

2022-06-26 14:12:00 Orange tea must be ^ -^

https://www.esat.kuleuven.be/sista/lssvmlab/

 

You can download the LS-SVM toolbox from the official site above. Put the downloaded files in MATLAB's toolbox folder and add the folder to the path; the SVM toolbox is then ready to use. MATLAB's built-in toolbox can only be used for classification, not for prediction.

A support vector machine (SVM) is fundamentally a binary classification model, but libsvm can now also be used for prediction (regression), which is mainly what I use it for.

LS-SVM, the least squares support vector machine, is an improvement on the standard SVM and can be used for prediction. libsvm, developed by professors at National Taiwan University, supports both classification and prediction. To use libsvm you need to install Visual Studio, because libsvm was originally developed in C++ and you need a compiler to build it.

gam: after choosing the RBF function as the kernel, this is one of the parameters to set, related to the Gaussian function (normal distribution). sig2 (σ²) is the RBF (radial basis) kernel's own bandwidth parameter; its value is set by the user.

The RBF kernel is defined as:

K(x_i, x_j) = exp(−‖x_i − x_j‖² / (2σ²)) = exp(−γ‖x_i − x_j‖²), where γ = 1/(2σ²).

If gamma is set too large, σ becomes very small; a Gaussian with very small σ is tall and narrow, so the kernel only acts in the immediate neighborhood of the support vector samples, and classification of unknown samples is very poor. If gamma is too small, the kernel over-smooths: the model cannot reach particularly high accuracy even on the training set, which in turn hurts accuracy on the test set.

In summary:

The larger gamma is, the higher the dimension of the mapping, the better the fit on the training set, and the fewer the support vectors, but the more likely overfitting becomes, i.e. generalization ability is low.

The smaller gamma is, the more support vectors there are. The number of support vectors affects training and prediction speed, and it also affects accuracy.

(These are the over-fitting and under-fitting phenomena.)
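To make the effect of gamma concrete, here is a minimal pure-Python sketch (not the MATLAB toolbox; the function name is illustrative) showing how the RBF kernel value between two fixed points collapses toward zero as gamma grows, so a large-gamma kernel only "sees" points that are very close to a support vector:

```python
import math

def rbf_kernel(x, y, gamma):
    """RBF kernel K(x, y) = exp(-gamma * ||x - y||^2)."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

x, y = (0.0, 0.0), (1.0, 1.0)   # squared distance ||x - y||^2 = 2
for gamma in (0.1, 1.0, 10.0):
    print(gamma, rbf_kernel(x, y, gamma))
# gamma = 0.1  -> ~0.819  (wide Gaussian: distant points still interact)
# gamma = 1.0  -> ~0.135
# gamma = 10.0 -> ~2e-9   (narrow Gaussian: influence vanishes away from the sample)
```

With gamma = 10 the two points are effectively invisible to each other, which is why a very large gamma fits only the training samples themselves and generalizes poorly.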

C is the penalty factor: it determines how heavily the loss on outlier samples is weighted. Some sources describe it as "tolerance for error", but the "error" in that phrasing really means margin violations rather than prediction error. The concrete effect is:

the larger C is, the stricter the classification (margin violations are barely tolerated), and the easier it is to overfit;

the smaller C is, the greater the tolerance for margin violations, and the easier it is to underfit.
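The role of C can be seen directly in the soft-margin objective, 0.5·‖w‖² + C·Σξᵢ, where the ξᵢ are the per-sample margin violations (slacks). The sketch below uses made-up numbers purely to show how C re-weights the two terms:

```python
def svm_objective(w, slacks, C):
    """Soft-margin SVM objective: 0.5 * ||w||^2 + C * sum of slack variables."""
    margin_term = 0.5 * sum(wi ** 2 for wi in w)  # rewards a wide margin (small ||w||)
    slack_term = C * sum(slacks)                   # penalizes margin violations
    return margin_term + slack_term

w = [1.0, -2.0]            # hypothetical weight vector, ||w||^2 = 5
slacks = [0.0, 0.3, 1.2]   # hypothetical per-sample margin violations
for C in (0.1, 1.0, 100.0):
    print(C, svm_objective(w, slacks, C))
```

With C = 0.1 the margin term dominates, so the optimizer prefers a smooth, wide-margin boundary even if a few points are misclassified (risking underfit); with C = 100 the slack term dominates, so almost no violation is tolerated (risking overfit).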

The steps for prediction are similar to those for a BP neural network (BPNN):

1. Import the data into MATLAB.

2. Split the input data into training data and test data, and normalize the data.

3. Set the parameters; if you use the RBF kernel, set parameters such as gam, sig2, and C.

4. Declare the function, choose the kernel, and train. Training follows a fixed pattern:

Use the trainlssvm() function to train on the training data, then simlssvm() to evaluate the trained model:

% type = 'f' for function estimation (regression) or 'c' for classification
[alpha,b] = trainlssvm({train_Input', train_Output', type, gam, sig2, 'RBF_kernel'});
% evaluate the trained model on the training inputs
SVMtrain_Output = simlssvm({train_Input', train_Output', type, gam, sig2, 'RBF_kernel', 'preprocess'}, {alpha,b}, train_Input');
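As a language-neutral illustration of the normalization in step 2, here is a pure-Python stand-in for MATLAB's mapminmax (the helper name is made up for this sketch), scaling a column of data linearly into [−1, 1]:

```python
def minmax_normalize(values, lo=-1.0, hi=1.0):
    """Linearly scale a list of numbers into [lo, hi], like MATLAB's mapminmax."""
    vmin, vmax = min(values), max(values)
    span = vmax - vmin
    if span == 0:
        return [lo] * len(values)  # constant column: map everything to the lower bound
    return [lo + (hi - lo) * (v - vmin) / span for v in values]

train_input = [2.0, 4.0, 6.0, 10.0]
print(minmax_normalize(train_input))  # [-1.0, -0.5, 0.0, 1.0]
```

In practice the same min/max computed from the training data should also be applied to the test data, so that both sets are scaled consistently.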


 

Original source

Copyright notice: this article was written by [Orange tea must be ^ -^]; please include the original link when reposting. Thank you.
https://yzsam.com/2022/02/202202170509534215.html