What does the "relu" mean in tf.nn.relu?

The API documentation says: "Computes rectified linear."

So it's Re(ctified) L(inear)... but what is the U then?

2 answers

Re(ctified) L(inear) (U)nit

Usually a layer in a neural network has some input, say a vector, and multiplies it by a weight matrix, resulting again in a vector.

Each value in the result (usually a float) is then treated as an output. Nowadays, however, most layers also apply a nonlinearity to each of these values, which is what gives the network its expressive power. For a long time these nonlinearities were sigmoids and tanhs.

More recently, people use a function that returns 0 when the value is less than 0, and the raw value itself when it is greater than 0. This particular function (commonly called a "rectified linear unit") is relu.
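The two steps described above (a matrix multiply, then relu applied element-wise) can be sketched in a few lines of NumPy; the weight and input values below are made up purely for illustration:

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: element-wise max(0, x)."""
    return np.maximum(0.0, x)

# A tiny dense layer: multiply the input vector by a weight matrix,
# then apply the relu nonlinearity to each resulting value.
W = np.array([[1.0, -2.0],
              [0.5,  3.0]])
x = np.array([1.0, -1.0])

pre_activation = W @ x          # [3.0, -2.5]
output = relu(pre_activation)   # negative values are clipped to 0 -> [3.0, 0.0]
print(output)
```

`tf.nn.relu` computes the same element-wise `max(features, 0)`, just as a TensorFlow op.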


Friesel adds a few points about relu:

The relu graph: zero for negative inputs, the identity for positive inputs.

[Plot of the relu function]

f(x) = max(0, x)

  1. ReLU does not saturate! Compare the sigmoid, whose derivative x(1 - x) (where x is the sigmoid's output) vanishes for large inputs. The derivative of relu is simply:

    1, if x > 0

    0, otherwise

This is trivial to compute, which makes it fast for backpropagation!
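A quick sketch of that derivative in NumPy, assuming the usual convention of defining it as 0 at x == 0 (where the true derivative is undefined):

```python
import numpy as np

def relu_grad(x):
    """Derivative of relu: 1 where x > 0, 0 elsewhere.
    (At x == 0 the derivative is undefined; 0 is used by convention.)"""
    return (x > 0).astype(float)

x = np.array([-2.0, 0.0, 3.0])
print(relu_grad(x))  # -> [0. 0. 1.]
```

Because the gradient is a constant 1 on the active side, backpropagation through a relu is just a cheap mask, with no exponentials to evaluate.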


Source: https://habr.com/ru/post/1675206/

