Why is a bias neuron necessary in a backpropagating neural network that recognizes the XOR operator?

I posted a question yesterday about the problems I had with my backpropagating neural network for the XOR operator. I did a little more work and realized that the issue could be due to the lack of a bias neuron.

My question is, what is the role of the bias neuron in general, and what is its role in a backpropagating neural network that recognizes the XOR operator? Is it possible to create one without a bias neuron?

1 answer

You can create a neural network without a bias neuron, and it will work fine, but for more information I would recommend that you see the answers to this question:

The role of bias in neural networks

Update: The role of a bias neuron in a neural network that is trying to model XOR is to minimize the size of the network. Usually, for "primitive" (not sure if that is the right term) logic functions such as AND, OR, NAND, etc., you would try to create a neural network with 2 input neurons, 2 hidden neurons and 1 output neuron. This is not possible for XOR, because the simplest way to model an XOR is with two NANDs:

[Image: XOR circuit built from NAND gates]

You can consider A and B as your input neurons, the gate in the middle is your "bias" neuron, the next two gates are your "hidden" neurons, and finally you have an output neuron. You can solve XOR without a bias neuron, but to do so you would need to increase the number of hidden neurons to a minimum of 3. In that case, the 3rd neuron essentially acts as the bias neuron. Here is another question that discusses the bias neuron in relation to XOR: XOR problem, solvable with a 2x2x1 neural network without bias?
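To make the point about network size concrete, here is a minimal sketch (my own illustration, not code from the linked questions; it assumes NumPy, sigmoid activations, mean-squared error, and full-batch gradient descent) of a 2-2-1 network with bias terms learning XOR. Depending on the random initialization it can occasionally get stuck in a local minimum, in which case a different seed usually helps:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR truth table: inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weights and biases for the hidden layer (2 -> 2) and output layer (2 -> 1).
W1 = rng.normal(size=(2, 2))
b1 = np.zeros((1, 2))          # "bias neuron" contribution to the hidden layer
W2 = rng.normal(size=(2, 1))
b2 = np.zeros((1, 1))          # bias for the output neuron

lr = 0.5
for epoch in range(20000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)       # hidden activations
    out = sigmoid(h @ W2 + b2)     # network output

    # Backward pass (mean squared error, sigmoid derivative = s * (1 - s)).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient descent updates; bias gradients are just column sums.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out, 3))   # should end up close to [[0], [1], [1], [0]]
```

If you drop b1 and b2 from the same sketch, the 2-2-1 configuration typically fails to fit XOR, which is the size trade-off described above: without a bias neuron you need at least a third hidden neuron to take over its role.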


Source: https://habr.com/ru/post/900951/
