Evaluating the performance of neural-network embeddings with a kNN classifier

I am solving a classification problem. I train unsupervised neural-network embeddings for a set of objects (using the skip-gram architecture).

The way I evaluate is: for each validation point, I find its k nearest neighbors among the training points. I then take a weighted sum of the neighbors' labels (weights based on distance) and use that as the score for the validation point.
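A minimal sketch of this evaluation, assuming scikit-learn is available; the data here is synthetic and all names are illustrative, since the question does not show code. `weights="distance"` reproduces the distance-weighted label sum described above:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
# Stand-ins for skip-gram embeddings of training and validation objects.
X_train = rng.normal(size=(200, 16))
y_train = (X_train[:, 0] > 0).astype(int)
X_val = rng.normal(size=(100, 16))
y_val = (X_val[:, 0] > 0).astype(int)

def knn_auc(k):
    # Distance-weighted kNN: each neighbor's label is weighted by 1/distance.
    knn = KNeighborsClassifier(n_neighbors=k, weights="distance")
    knn.fit(X_train, y_train)
    scores = knn.predict_proba(X_val)[:, 1]  # weighted vote for class 1
    return roc_auc_score(y_val, scores)

for k in (1, 5, 15, 50):
    print(k, round(knn_auc(k), 3))
```

Sweeping k like this is what produces the AUC-vs-k curves in the observation below.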

Observation: as I increase the number of epochs (model 1: 600 epochs, model 2: 1400 epochs, model 3: 2000 epochs), my AUC improves at lower values of k, but the three models saturate to the same values at larger k.

What could be a possible explanation for this behavior?


[Sent from CrossValidated]

1 answer

To cross-check whether class imbalance is the problem, try fitting an SVM model. If it gives better classification (which may happen if your ANN is not very deep), you can conclude that the classes should be balanced first.

Also, try a few kernel functions to see whether the transformation makes the data linearly separable.
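A hedged sketch of this suggestion, again assuming scikit-learn; the ring-shaped data is synthetic and chosen so that a linear kernel fails while an RBF kernel succeeds, and `class_weight="balanced"` addresses the imbalance concern from the previous paragraph:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 2))
# Ring-shaped classes: not linearly separable, but separable after an
# RBF kernel transformation.
y = (np.linalg.norm(X, axis=1) > 1.0).astype(int)
X_tr, y_tr, X_te, y_te = X[:200], y[:200], X[200:], y[200:]

results = {}
for kernel in ("linear", "rbf"):
    clf = SVC(kernel=kernel, class_weight="balanced")
    clf.fit(X_tr, y_tr)
    scores = clf.decision_function(X_te)
    results[kernel] = roc_auc_score(y_te, scores)
print(results)
```

A large gap between the RBF and linear scores suggests the embedding space is not linearly separable and a kernel (or a deeper network) is worth trying.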


Source: https://habr.com/ru/post/1242353/

