Neural Network Training Shows Zero Accuracy in Keras

I am training a neural network for a regression problem in Keras. Why, when the output is only one dimension, does the accuracy in every epoch always show acc: 0.0000e+00?

It looks like this:

Epoch 50/50
  1000/199873 [..............................] - ETA: 5s - loss: 0.0057 - acc: 0.0000e+00
  2000/199873 [..............................] - ETA: 4s - loss: 0.0058 - acc: 0.0000e+00
  3000/199873 [..............................] - ETA: 3s - loss: 0.0057 - acc: 0.0000e+00
  4000/199873 [..............................] - ETA: 3s - loss: 0.0060 - acc: 0.0000e+00
  ...
198000/199873 [============================>.] - ETA: 0s - loss: 0.0055 - acc: 0.0000e+00
199000/199873 [============================>.] - ETA: 0s - loss: 0.0055 - acc: 0.0000e+00
199873/199873 [==============================] - 4s - loss: 0.0055 - acc: 0.0000e+00 - val_loss: 0.0180 - val_acc: 0.0000e+00

But if the output has two or more dimensions, there is no problem with the accuracy.

My model is below:

    import numpy as np
    from keras.models import Sequential, load_model
    from keras.layers import Dense, Activation
    from keras.layers.advanced_activations import LeakyReLU

    input_dim = 14
    batch_size = 1000
    nb_epoch = 50

    lrelu = LeakyReLU(alpha=0.1)

    model = Sequential()
    model.add(Dense(126, input_dim=input_dim))  # Dense(output_dim (also hidden width), input_dim=input_dim)
    model.add(lrelu)                            # Activation
    model.add(Dense(252))
    model.add(lrelu)
    model.add(Dense(1))
    model.add(Activation('linear'))

    model.compile(loss='mean_squared_error', optimizer='Adam', metrics=['accuracy'])
    model.summary()

    history = model.fit(X_train_1, y_train_1[:, 0:1],
                        batch_size=batch_size, nb_epoch=nb_epoch,
                        verbose=1, validation_split=0.2)

    loss = history.history.get('loss')
    acc = history.history.get('acc')
    val_loss = history.history.get('val_loss')
    val_acc = history.history.get('val_acc')

    # saving the model
    model.save('XXXXX')
    del model

    # loading the model
    model = load_model('XXXXX')

    # prediction
    pred = model.predict(X_train_1, batch_size, verbose=1)
    ans = [np.argmax(r) for r in y_train_1[:, 0:1]]
3 answers

The problem is that your final model output has a linear activation, which makes the model a regression, not a classification problem. "Accuracy" is defined when the model classifies data correctly according to class, but it is not meaningfully defined for a regression problem, because the target is continuous.
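If I remember Keras's behavior correctly, with a single continuous output the 'accuracy' string falls back to an exact-match style check (round the prediction, compare it to the target), which practically never succeeds on real-valued targets; with two or more outputs it falls back to an argmax-based categorical accuracy, which is why non-zero values appear there. A rough numpy-only illustration with made-up values:

    import numpy as np

    # Continuous targets and typical close-but-not-exact regression predictions
    y_true = np.array([0.37, 1.52, 2.81, 0.94])
    y_pred = np.array([0.35, 1.49, 2.90, 1.01])

    # Exact-match "accuracy" in the spirit of Keras's binary accuracy:
    # round the prediction and test for equality with the target.
    acc = np.mean(np.round(y_pred) == y_true)
    print(acc)  # 0.0 -- rounded predictions essentially never equal continuous targets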

Either get rid of accuracy as a metric and go for full regression, or turn your problem into a classification problem, using loss='categorical_crossentropy' and activation='softmax'.
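For example, a minimal sketch of the two options (the 'mae' metric and the 10-class softmax output are illustrative choices, not something from the original post):

    # Option 1: stay with regression -- keep Dense(1) + linear output, drop 'accuracy'
    # and track a regression metric such as mean absolute error instead.
    model.compile(loss='mean_squared_error', optimizer='Adam', metrics=['mae'])

    # Option 2: recast as classification -- bin y into classes (one-hot encoded) and
    # replace the final Dense(1) + linear layers with, say, Dense(10) + softmax:
    # model.add(Dense(10))
    # model.add(Activation('softmax'))
    # model.compile(loss='categorical_crossentropy', optimizer='Adam', metrics=['accuracy'])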

Here is a problem similar to yours: Link

For more information, see StackExchange.


I'm not sure what the problem is, but your model looks a little strange to me.

This is your model:

    lrelu = LeakyReLU(alpha=0.1)

    model = Sequential()
    model.add(Dense(126, input_dim=15))  # Dense(output_dim (also hidden width), input_dim=input_dim)
    model.add(lrelu)                     # Activation
    model.add(Dense(252))
    model.add(lrelu)
    model.add(Dense(1))
    model.add(Activation('linear'))

and a visualization of your model is shown below:

[Figure: visualization of the model as built]
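If you want to produce this kind of plot yourself, here is a minimal sketch (assuming Keras 2's keras.utils.plot_model plus the pydot and graphviz dependencies; older Keras versions expose a similar utility under keras.utils.visualize_util):

    from keras.utils import plot_model  # needs pydot + graphviz installed

    # Draws the model graph to a PNG; a layer object reused in several places
    # shows up as a single shared node with multiple connections.
    plot_model(model, to_file='model.png', show_shapes=True)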

There are two layers that could be the output layer of your model, and you have not decided which one is your actual output. I assume that is the reason you cannot make a correct prediction.

If you want to implement your model as follows,

[Figure: visualization of the intended model]

you must add a new activation layer each time, rather than reusing the same layer object.

For instance,

    model = Sequential()
    model.add(Dense(126, input_dim=15))  # Dense(output_dim (also hidden width), input_dim=input_dim)
    model.add(LeakyReLU(alpha=0.1))      # Activation
    model.add(Dense(252))
    model.add(LeakyReLU(alpha=0.1))
    model.add(Dense(1))
    model.add(Activation('linear'))

Just a quick complement to the excellent answers already posted.

The following snippet is a custom metric that reports the average percentage difference between your network's predictions and the actual values.

    from keras import backend as K

    def percentage_difference(y_true, y_pred):
        return K.mean(abs(y_pred / y_true - 1) * 100)

To embed it in your metrics, just add it to the "metrics" option when compiling the model, i.e.

    model.compile(loss='mean_squared_error', optimizer='Adam', metrics=['accuracy', percentage_difference])
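If I recall Keras's behavior correctly, the custom metric then shows up in the training history under the function's name (with a val_ prefix for the validation split), so you can retrieve it the same way as the built-in metrics:

    history = model.fit(X_train_1, y_train_1[:, 0:1],
                        batch_size=batch_size, nb_epoch=nb_epoch,
                        verbose=1, validation_split=0.2)

    pd_train = history.history.get('percentage_difference')
    pd_val = history.history.get('val_percentage_difference')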

Source: https://habr.com/ru/post/1263230/

