How are neurons in the deeper layers capable of making more complex decisions than neurons in the shallower / earlier layers?

I am new to ML and am reading an online book at http://neuralnetworksanddeeplearning.com .

In the first chapter, the author describes a single perceptron using the cheese festival example: a perceptron trying to decide whether to go to the cheese festival this weekend. The perceptron has 3 inputs (a weather factor, a social factor and a transport factor). Although I fully understand the author's single-perceptron cheese festival example, I do not understand the following (seemingly unmotivated) conclusion he draws when he extends the example to several perceptrons organized in 2 layers:

"What about the perceptrons in the second layer? Each of these perceptrons makes a decision by weighing the results of the first decision level. Thus, the perceptron in the second layer can make decisions at a more complex and more abstract level than the perceptrons in the first layer ... Thus “A multilayer network of perceptrons can be involved in complex decision making.”

I do not understand: why are the perceptrons in the second layer capable of "making decisions at a more complex and more abstract level" than their counterparts in the first layer?

Can someone provide an example, perhaps building on the author's cheese festival example? What is an example of a more complex / abstract decision that a perceptron in layer 2 could make?


Example

[Diagram: a single cheese-festival perceptron at the top; below a horizontal line, three layer-1 perceptrons feeding a single layer-2 node.]

The upper part of the diagram (above) shows the decision-making perceptron from the author's cheese festival example. A single perceptron takes 3 inputs / factors — weather, social and transport — and from these computes whether or not to go to the cheese festival.
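To make sure I understand the single-perceptron part, here is a minimal sketch of how I picture that computation. The weights and the threshold are made up for illustration; they are not from the book:

```python
# Minimal sketch of the single cheese-festival perceptron.
# Weights and threshold are illustrative assumptions, not values from the book.

def perceptron(inputs, weights, threshold):
    """Return 1 ("go") if the weighted sum of binary inputs exceeds the threshold, else 0."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total > threshold else 0

# Inputs: 1 = favourable, 0 = unfavourable
good_weather = 1
friends_coming = 0
easy_transport = 1

decision = perceptron(
    inputs=[good_weather, friends_coming, easy_transport],
    weights=[6, 2, 2],   # weather matters most in this made-up weighting
    threshold=5,
)
print("Go to the cheese festival?", bool(decision))
```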

The bottom part of the diagram (below the horizontal line) is exactly where my mental block is. Here we have 3 perceptrons in the first layer feeding into a single layer-2 node:

  • A "I have to go to the cheese festival" node: this is the same node as in the upper half of the diagram; and
  • A "I have to go to the milk festival" node: omitted from the diagram for simplicity / brevity
  • A "I have to go to the Bacon Festival" node: also omitted for simplicity / brevity

The outputs (yes / no decisions) from these 3 perceptrons are fed as inputs to a single layer-2 node. Given this example, what is an example of a layer-2 decision that illustrates this notion of an "object hierarchy" and / or the author's claim that deeper layers can make more complex / abstract decisions?

+5
2 answers

The basic idea of multilayer networks is that each layer adds some abstraction. The 2nd layer by itself has the same structure as the 1st layer, so it cannot make more complex decisions on its own, but it can build on the abstractions (outputs) produced by the 1st layer.

I would add that this statement is quite idealized. In practice, it is often difficult to tell what exactly the inner layers are doing.

In your example, the decision made by the node in the 2nd layer might be "Should I go to a festival at all?"
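To make that concrete, here is a minimal sketch; all weights and thresholds are made up for illustration. The three layer-1 perceptrons each make a narrow, festival-specific decision from the raw factors, and the layer-2 perceptron only weighs those three yes / no answers:

```python
# Minimal two-layer sketch; all weights and thresholds are illustrative assumptions.

def perceptron(inputs, weights, threshold):
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total > threshold else 0

# Raw factors for this weekend: 1 = favourable, 0 = unfavourable
weather, social, transport = 1, 0, 1

# Layer 1: three narrow, festival-specific decisions
go_cheese = perceptron([weather, social, transport], [6, 2, 2], 5)
go_milk   = perceptron([weather, social, transport], [3, 4, 3], 6)
go_bacon  = perceptron([weather, social, transport], [2, 2, 6], 4)

# Layer 2: a broader decision built only from the layer-1 outputs,
# e.g. "go out this weekend if at least one festival looks worthwhile"
go_out_at_all = perceptron([go_cheese, go_milk, go_bacon], [1, 1, 1], 0)

print("Cheese:", go_cheese, "Milk:", go_milk, "Bacon:", go_bacon)
print("Go out to a festival at all?", bool(go_out_at_all))
```

The layer-2 node never sees the raw weather / social / transport factors; it only reasons over the layer-1 verdicts, which is the sense in which its decision is more abstract.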

+3

There is a visualization of the intermediate layers of a deep network here: http://blog.keras.io/how-convolutional-neural-networks-see-the-world.html

In a nutshell, each new layer can recognize more complex patterns by looking for combinations of the patterns found by the previous layers, and the last layers respond to shapes closest to the desired classes (ants, flowers, cars, etc.). For example (a small code sketch follows the list):

  • the first layers detect edges and textures — the kind of thing you could get with Photoshop filters;
  • the second layer can detect simple shapes built from those edges (squares, circles, lines, etc.) and more complex textures;
  • later layers detect more complex shapes built from the shapes in the previous step;
  • the last layers detect forms closest to the target classes.
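As a rough sketch of what that stacking looks like in code — this is a generic toy architecture of my own, not the network from the linked post, and the layer sizes and class count are arbitrary — each convolutional block works on the feature maps produced by the previous one:

```python
# Toy stacked CNN; layer sizes and the 10-class output are arbitrary placeholders.
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(64, 64, 3)),
    # early layers: small local patterns (edges, simple textures)
    layers.Conv2D(16, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    # middle layers: combinations of those patterns (simple shapes)
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    # deeper layers: larger, more class-specific structures
    layers.Conv2D(64, 3, activation="relu", padding="same"),
    layers.GlobalAveragePooling2D(),
    layers.Dense(10, activation="softmax"),  # e.g. 10 target classes
])
model.summary()
```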

This also explains why taking existing well-trained models (VGG16/19, AlexNet, etc.) and swapping out the top layers makes sense, instead of spending weeks training the full network from scratch.
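A rough sketch of that "swap the top" idea in Keras — the Dense size and the 5-class output are placeholders, not something from the post — reuses the pretrained VGG16 convolutional base as a fixed feature extractor and trains only a small new classifier on top:

```python
# Sketch of replacing the top of a pretrained network; class count and Dense size are placeholders.
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

# Pretrained convolutional base (edges -> textures -> shapes -> object parts), without its classifier
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # keep the already-learned feature hierarchy fixed

# New top that maps those generic features to our own classes
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dense(5, activation="softmax"),  # e.g. 5 custom classes
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
```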

0

Source: https://habr.com/ru/post/1246125/

