Multilayer Perceptrons in EmguCV

I am trying to implement a multi-layer perceptron (MLP) neural network using EmguCV 3.1 (a .NET wrapper for the OpenCV library) in C# (Windows Forms). As a first exercise with this library, I decided to implement the logical OR operation with an MLP.

I create the MLP in an Initialize method and train it in a Train method, as shown below:

private void Initialize()
{
    // symmetric sigmoid activation; with the default parameters the
    // outputs lie in the range [-1.7159, 1.7159]
    NETWORK.SetActivationFunction(
        ANN_MLP.AnnMlpActivationFunction.SigmoidSym);

    // classic backpropagation training
    NETWORK.SetTrainMethod(ANN_MLP.AnnMlpTrainMethod.Backprop);

    // layer sizes: 2 inputs, two hidden layers of 2 neurons each, 1 output
    Matrix<double> layers = new Matrix<double>(new Size(4, 1));
    layers[0, 0] = 2;
    layers[0, 1] = 2;
    layers[0, 2] = 2;
    layers[0, 3] = 1;
    NETWORK.SetLayerSizes(layers);
}

private void Train()
{
    // providing data for input

    Matrix<float> input = new Matrix<float>(4, 2);
    input[0, 0] = MIN_ACTIVATION_FUNCTION; input[0, 1] = MIN_ACTIVATION_FUNCTION;
    input[1, 0] = MIN_ACTIVATION_FUNCTION; input[1, 1] = MAX_ACTIVATION_FUNCTION;
    input[2, 0] = MAX_ACTIVATION_FUNCTION; input[2, 1] = MIN_ACTIVATION_FUNCTION;
    input[3, 0] = MAX_ACTIVATION_FUNCTION; input[3, 1] = MAX_ACTIVATION_FUNCTION;

    //providing data for output
    Matrix<float> output = new Matrix<float>(4, 1);
    output[0, 0] = MIN_ACTIVATION_FUNCTION;
    output[1, 0] = MAX_ACTIVATION_FUNCTION;
    output[2, 0] = MAX_ACTIVATION_FUNCTION;
    output[3, 0] = MAX_ACTIVATION_FUNCTION;


    // mixing input and output for training
    TrainData mixedData = new TrainData(
        input,
        Emgu.CV.ML.MlEnum.DataLayoutType.RowSample,
        output);

    // stop condition = 1 million iterations
    NETWORK.TermCriteria = new MCvTermCriteria(1000000);

    // training
    NETWORK.Train(mixedData);
}
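
As a side note, MCvTermCriteria also has a constructor that combines an iteration limit with an epsilon threshold, so training can stop early once the error change becomes small. A minimal sketch (the epsilon value 1e-6 is my own choice, not something from the original code):

    // hypothetical stop condition: at most 1,000,000 iterations,
    // or earlier if the change in error drops below 1e-6
    NETWORK.TermCriteria = new MCvTermCriteria(1000000, 1.0e-6);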

Here MIN_ACTIVATION_FUNCTION and MAX_ACTIVATION_FUNCTION are equal to -1.7159 and 1.7159, respectively (according to the OpenCV documentation: with the default parameters, the symmetric sigmoid computes y = 1.7159 * tanh(2x/3), so the outputs lie in [-1.7159, 1.7159]). After 1,000,000 iterations (the stop condition you can see in my code), I test the network for prediction using the Predict method, as shown below:

private void Predict()
{
    Matrix<float> input = new Matrix<float>(1, 2);
    input[0, 0] = MIN_ACTIVATION_FUNCTION;
    input[0, 1] = MIN_ACTIVATION_FUNCTION;

    Matrix<float> output = new Matrix<float>(1, 1);

    NETWORK.Predict(input, output);
    MessageBox.Show(output[0, 0].ToString());

    //////////////////////////////////////////////

    input[0, 0] = MIN_ACTIVATION_FUNCTION;
    input[0, 1] = MAX_ACTIVATION_FUNCTION;

    NETWORK.Predict(input, output);
    MessageBox.Show(output[0, 0].ToString());

    //////////////////////////////////////////////

    input[0, 0] = MAX_ACTIVATION_FUNCTION;
    input[0, 1] = MIN_ACTIVATION_FUNCTION;

    NETWORK.Predict(input, output);
    MessageBox.Show(output[0, 0].ToString());

    ////////////////////////////////////////////////

    input[0, 0] = MAX_ACTIVATION_FUNCTION;
    input[0, 1] = MAX_ACTIVATION_FUNCTION;

    NETWORK.Predict(input, output);
    MessageBox.Show(output[0, 0].ToString());
}
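
For completeness, NETWORK and the two constants used above are declared roughly as follows (a sketch, since the exact declarations are not shown here):

    // assumed field declarations; required namespaces:
    //   using System.Drawing;        // Size
    //   using System.Windows.Forms;  // MessageBox
    //   using Emgu.CV;               // Matrix<>
    //   using Emgu.CV.ML;            // ANN_MLP, TrainData
    //   using Emgu.CV.Structure;     // MCvTermCriteria
    private readonly ANN_MLP NETWORK = new ANN_MLP();
    private const float MIN_ACTIVATION_FUNCTION = -1.7159f;
    private const float MAX_ACTIVATION_FUNCTION = 1.7159f;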

However, the outputs of my NETWORK are nowhere near the expected values. The network returns:

-0.00734469
-0.03184918
0.02080269
-0.006674092

while the expected outputs (the OR truth table encoded with the activation-function limits) are:

-1.7
+1.7
+1.7
+1.7

Why does the trained network return values close to zero instead of the expected outputs?

Note: I get the same kind of result when I use 0 and 1 instead of MIN_ACTIVATION_FUNCTION and MAX_ACTIVATION_FUNCTION.

Update 1: With some other configurations that I tried, Predict even returns NaN.


The problem seems to be with the specific EmguCV build being used (Emgu.CV-3.1.0-r16.12). With a different EmguCV 3.1.0 build instead of Emgu.CV-3.1.0-r16.12, the same code trains and predicts correctly.
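
If you want to verify which native OpenCV build your EmguCV package is actually loading, you can print the OpenCV build information (a sketch, assuming CvInvoke.BuildInformation is available in your EmguCV version):

    // show the build information of the loaded OpenCV binaries
    MessageBox.Show(Emgu.CV.CvInvoke.BuildInformation);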


I am not sure this is the cause, but try changing your output and input encoding. Treat the task as a 2D classification problem with two classes: keep 2 inputs, but use 2 outputs, where (1, 0) is class "True" and (0, 1) is class "False". OR is essentially a classification task, i.e. each input pair belongs to one of the two classes, so the network may train better this way; see the sketch below.
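
A minimal sketch of what that encoding could look like for the training data (my own illustration of the suggestion above, using 0/1 values rather than the activation-function limits):

    // inputs: the four combinations of the OR truth table
    Matrix<float> input = new Matrix<float>(4, 2);
    input[0, 0] = 0; input[0, 1] = 0;
    input[1, 0] = 0; input[1, 1] = 1;
    input[2, 0] = 1; input[2, 1] = 0;
    input[3, 0] = 1; input[3, 1] = 1;

    // outputs: one column per class, (1, 0) = "True", (0, 1) = "False"
    Matrix<float> output = new Matrix<float>(4, 2);
    output[0, 0] = 0; output[0, 1] = 1;   // 0 OR 0 = False
    output[1, 0] = 1; output[1, 1] = 0;   // 0 OR 1 = True
    output[2, 0] = 1; output[2, 1] = 0;   // 1 OR 0 = True
    output[3, 0] = 1; output[3, 1] = 0;   // 1 OR 1 = True

    // the last element of the layer-size matrix must then be 2 instead of 1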


Source: https://habr.com/ru/post/1016529/

