Delta rule versus gradient descent?

What is the difference between gradient descent and the delta rule?


Without math: the delta rule uses gradient descent to adjust the weights of a perceptron network so as to minimize its output error.
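
As a rough illustration (not part of the original answer), here is a minimal sketch of the delta rule training a single linear unit; the toy data, learning rate, and variable names are all made up for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))           # 100 samples, 3 input features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w                          # targets from a known linear rule

w = np.zeros(3)                         # weights to be learned
lr = 0.1                                # learning rate (step size)

for epoch in range(50):
    for x_i, t_i in zip(X, y):
        o_i = w @ x_i                   # unit's output for this sample
        # Delta rule: nudge each weight by the error times the input,
        # i.e. step down the gradient of the squared error (t - o)^2 / 2
        # with respect to w.
        w += lr * (t_i - o_i) * x_i

print(w)  # approaches [2.0, -1.0, 0.5]
```

Each update moves every weight in proportion to the error times the corresponding input, which is exactly one gradient descent step on the squared error for that sample.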

Gradient descent is a general algorithm that gradually changes a parameter vector to minimize an objective function. It does this by moving in the direction of steepest descent, i.e. the direction of the largest negative gradient.

You find this direction by taking the derivative of the objective function. It is like dropping a marble onto a smooth, hilly landscape: it rolls downhill, but this guarantees only a local minimum. So the short answer is that the delta rule is a specific algorithm that applies the general method of gradient descent.
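
To make the marble analogy concrete, here is a hedged sketch of plain gradient descent on a one-dimensional objective with two valleys; the function, learning rate, and starting points are invented for illustration:

```python
def f(x):
    return x**4 - 3 * x**2 + x      # objective with two local minima

def df(x):
    return 4 * x**3 - 6 * x + 1     # its derivative (the gradient in 1-D)

def gradient_descent(x, lr=0.01, steps=1000):
    for _ in range(steps):
        x -= lr * df(x)             # step against the gradient
    return x

print(gradient_descent(-2.0))  # rolls into the left valley (~ -1.30)
print(gradient_descent(2.0))   # rolls into the right valley (~ 1.13)
```

Dropping the marble on different sides of the hill leaves it in different valleys, which is why gradient descent guarantees only a local minimum.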


Source: https://habr.com/ru/post/1338381/
