This is my annotated list of the best introductory (yet complete) resources on artificial neural networks (ANNs), i.e., multi-layer perceptrons (MLPs).
Course Notebook, Willamette University. This resource runs about 25-30 pages (navigate between sections using the sidebar in the upper-right corner or the next/previous buttons at the bottom of each page). I appreciate it for two reasons: (i) it contains a large number of diagrams and integrates them seamlessly with the prose; and (ii) it is comprehensive, covering the network architecture, the calculus behind backpropagation, the selection and tuning of the learning parameters (momentum, learning rate), and so on. The section on backpropagation (the computational and programmatic heart of an ANN/MLP) is especially good: combining prose, network diagrams, and the actual equations, the author carefully walks through each step of a single epoch, including the weight-update (learning) phase.
developerWorks neural network tutorial. This one, by David Mertz and Andrew Blais, contains working Python code and a non-trivial dataset. Apart from the opening paragraphs, the entire document closely follows the code and data. That matters a great deal to me, because if I can't code it, I don't believe I've understood it (a personal threshold that may not work for everyone). In addition, the focus on the code, and on how that code interacts with the supplied dataset, keeps the discussion practically grounded. Finally, the authors clearly have a strong command of the subject.
Generation5. This is the lightest of the three, and perhaps the one to start with. The author is clearly familiar with the finer points of multi-layer perceptrons but does not demand that knowledge from the reader; that is, backpropagation is explained as a common-sense solution to the problem, without deferring to numerical-solution methods (as many MLP references do: "backprop is solved using gradient descent"). And, like the first resource I mentioned, it relies heavily on diagrams. Rather than working from raw data (0s and 1s), the author discusses the structure and function of an MLP in the context of a simple predictive-analytics scenario, in prose, without reducing the problem to numerical data.