"Subscript indices must either be real positive integers or logicals" in svmclassify (MATLAB)

I am using MATLAB's SVM classification functions. My training and test data have the following sizes:

    >> size(TrainV)
    ans = 99192   705
    >> size(TestV)
    ans = 246   705

I have a function that trains one-versus-one classifiers for 10 classes (45 binary classifiers). The models are trained by calling the function below:

    Models = SVM_multitrain(TrainV(:, 2:end), TrainV(:, 1), 10);

I pass in the feature vectors (`TrainV(:, 2:end)`) and the labels (`TrainV(:, 1)`), and ask the function to train classifiers for all pairs of the 10 classes (45 binary classifiers). The function works fine, and after training I can inspect the models. For example, here are the models for the 3rd and 45th binary classifiers:

    >> Models(3)
    ans = 
              SupportVectors: [9x704 double]
                       Alpha: [9x1 double]
                        Bias: -2.3927 - 0.0001i
              KernelFunction: @linear_kernel
          KernelFunctionArgs: {}
                  GroupNames: [20117x1 double]
        SupportVectorIndices: [9x1 double]
                   ScaleData: [1x1 struct]
               FigureHandles: []

    >> Models(45)
    ans = 
              SupportVectors: [10x704 double]
                       Alpha: [10x1 double]
                        Bias: -2.7245 + 0.0000i
              KernelFunction: @linear_kernel
          KernelFunctionArgs: {}
                  GroupNames: [22087x1 double]
        SupportVectorIndices: [10x1 double]
                   ScaleData: [1x1 struct]
               FigureHandles: []

The problem occurs when I call `svmclassify` to classify the test vectors, for example with the first binary classifier:

    >> TestAttribBin = svmclassify(Models(1), TestV(:,2:end))
    Subscript indices must either be real positive integers or logicals.
    Error in svmclassify (line 140)
    outclass = glevels(outclass(~unClassified),:);

What could be the problem? When I apply the same classification procedure to feature vectors extracted in another way, this problem does not occur.

1 answer

The likely cause of this error is passing complex-valued data to `svmclassify`. `svmclassify` accepts only real-valued feature vectors. Passing complex data causes `outclass` to become complex, and complex values cannot be used for indexing, as the error message indicates. The complex `Bias` values in your trained models (e.g. `-2.3927 - 0.0001i`) suggest that complex values were already present in the training data.
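A quick way to confirm this, as a sketch against your own `TrainV`/`TestV` matrices, is to test the data for complex values before training or classifying:

```matlab
% isreal returns false if the matrix stores an imaginary part,
% which is exactly what breaks the indexing inside svmclassify.
if ~isreal(TrainV) || ~isreal(TestV)
    warning('Feature data contains complex values; svmclassify requires real inputs.');
end
```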

One option is to encode the imaginary part of your data as additional features, for example by doubling the length of the feature vectors.
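A minimal sketch of that encoding, assuming the features really are meaningfully complex: split each complex feature into its real and imaginary parts, which doubles the feature dimension (here from 704 to 1408). The same transformation must be applied to both training and test data.

```matlab
% X is an N-by-704 complex feature matrix; Xenc is N-by-1408 and real.
X = TestV(:, 2:end);
Xenc = [real(X), imag(X)];
```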

In fact, the vast majority of machine learning models assume that feature vectors are real-valued (artificial neural networks, regression trees, SVMs, etc.), although extensions to complex inputs exist in some cases.
