Scikit-learn LinearSVC - how to get the support vectors from a trained SVM?

I use LinearSVC from the scikit-learn library, and I wonder whether it is possible to extract the support vectors my model uses after training to make predictions. I have been trying for a while, but with no luck. Does anybody know?

+6
3 answers

Unfortunately, there seems to be no way to do this. LinearSVC calls liblinear (see the relevant code), but it does not retrieve the support vectors, only the coefficients and the intercept.

One option would be to use SVC with a 'linear' kernel (it is based on libsvm instead of liblinear, and also supports the poly , rbf and sigmoid kernels):

 from sklearn import svm
 X = [[0, 0], [1, 1]]
 y = [0, 1]
 clf = svm.SVC(kernel='linear')
 clf.fit(X, y)
 print(clf.support_vectors_)

Output:

 [[0. 0.]
  [1. 1.]]

liblinear scales better to a large number of samples, but otherwise the two are mostly equivalent.
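For a linear kernel the two views are connected: scikit-learn computes `coef_` as the dual coefficients times the support vectors, so the weight vector can be reconstructed from `support_vectors_` and `dual_coef_`. A quick sketch of that check, using the same toy data as above:

```python
import numpy as np
from sklearn import svm

X = [[0, 0], [1, 1]]
y = [0, 1]

clf = svm.SVC(kernel='linear')
clf.fit(X, y)

# With a linear kernel, the primal weight vector is a linear
# combination of the support vectors: w = dual_coef_ @ support_vectors_
w = clf.dual_coef_ @ clf.support_vectors_
print(np.allclose(w, clf.coef_))  # True
```

This is why LinearSVC can get away with storing only `coef_` and `intercept_`: for prediction, the individual support vectors are no longer needed.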

+4

This may help you:

 clf = svm.SVC(kernel='rbf', C=0.05)
 clf.fit(traindata, y)
 print(clf.support_vectors_)

This page may contain additional information: http://scikit-learn.org/stable/modules/svm.html
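Note that with a non-linear kernel such as 'rbf', `support_vectors_` is the only representation of the model available: `coef_` exists only for `kernel='linear'` and raises an AttributeError otherwise. A small sketch with made-up toy data:

```python
from sklearn import svm

X = [[0, 0], [1, 1], [2, 0]]
y = [0, 1, 0]

clf = svm.SVC(kernel='rbf', C=0.05)
clf.fit(X, y)

# support_vectors_ is available for any kernel
print(clf.support_vectors_.shape)

# ...but coef_ only exists when kernel='linear'
try:
    clf.coef_
except AttributeError as e:
    print(e)
```

Predictions with an rbf kernel are computed from kernel evaluations against the stored support vectors, so there is no single weight vector to expose.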

+3

I'm not sure if this helps, but I was looking for something similar and concluded that if:

 clf = svm.LinearSVC() 

Then this:

 clf.decision_function(x) 

is equal to:

 clf.coef_.dot(x) + clf.intercept_ 
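This identity is easy to verify numerically. A minimal sketch, with made-up training data (for a batch of samples the matrix form is `x @ coef_.T + intercept_`):

```python
import numpy as np
from sklearn import svm

X = [[0, 0], [1, 1], [2, 2]]
y = [0, 1, 1]

clf = svm.LinearSVC()
clf.fit(X, y)

x = np.array([[0.5, 0.5]])

# decision_function is just the linear form w.x + b
manual = x @ clf.coef_.T + clf.intercept_
print(np.allclose(clf.decision_function(x), manual))  # True
```

So even though LinearSVC keeps no support vectors, the full decision function is recoverable from `coef_` and `intercept_` alone.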
+2

Source: https://habr.com/ru/post/976444/
