Online Artificial Neural Net
Author: Peter Bradley

Overview:

MODULE DESCRIPTION:

In this module we construct neural networks to model simple binary logical functions. We begin with feed-forward networks, encounter the X-OR problem, and solve it by introducing the concept of backpropagation. The module includes a working backpropagating neural net capable of solving any binary logical function.

MODULE COMPONENTS:

Neural Networks

  • Introduction to Neural Networks

    Neural network models are inspired by the fact that the only known realizer of cognition - the human brain - is composed of massive numbers of small units connected together in interesting ways. In this module, we explore the basic structure of neural networks and how these simple networks are able to realize basic computational processes.

  • Simple Neural Nets for Logical Functions

    Simple feed-forward neural nets can be arranged to model a number of simple logical functions, such as 'and', 'or', and 'majority'. In this module, you'll have a chance to create such networks to compute functions of your choice.

  • The XOR Problem and Solution

    Simple networks have two drawbacks: they depend on an architecture fixed in advance by a programmer, and they cannot compute functions that are not linearly separable, such as 'XOR'. In this module, we introduce networks that adjust their own connection weights by backpropagating error signals.
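The simple logical functions above can each be realized by a single threshold unit that fires when the weighted sum of its inputs meets a threshold. The following is a minimal sketch; the particular weights and thresholds are illustrative choices, not values taken from the module.

```python
# A minimal threshold unit in the McCulloch-Pitts style. The weights and
# thresholds below are illustrative, hand-chosen values.

def threshold_unit(inputs, weights, threshold):
    """Fire (return 1) when the weighted sum of inputs meets the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# 'and': both inputs must be on, so unit weights with a threshold of 2 work.
def AND(x1, x2):
    return threshold_unit([x1, x2], [1, 1], 2)

# 'or': a single active input suffices, so the threshold drops to 1.
def OR(x1, x2):
    return threshold_unit([x1, x2], [1, 1], 1)

# 'majority' over three inputs: at least two of the three must be on.
def MAJORITY(x1, x2, x3):
    return threshold_unit([x1, x2, x3], [1, 1, 1], 2)
```

Varying only the threshold turns the same unit from 'and' into 'or', which is why these functions are easy for a single-layer net, while 'XOR' is not.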
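The backpropagation idea can be sketched as a small network trained on the XOR truth table. This is a minimal illustration, not the module's own implementation: the layer sizes, learning rate, and epoch count are arbitrary choices, and the update rule is the standard delta rule using the sigmoid derivative.

```python
import math
import random

random.seed(0)  # fixed seed so the run is repeatable

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# A 2-2-1 network: two inputs, one hidden layer of two units, one output unit.
# Each weight row is [weight_x1, weight_x2, bias], initialized at random.
W1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
W2 = [random.uniform(-1, 1) for _ in range(3)]

def forward(x1, x2):
    h = [sigmoid(w[0] * x1 + w[1] * x2 + w[2]) for w in W1]
    y = sigmoid(W2[0] * h[0] + W2[1] * h[1] + W2[2])
    return h, y

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # XOR truth table

def total_error():
    return sum((forward(x1, x2)[1] - t) ** 2 for (x1, x2), t in data)

err_before = total_error()

lr = 0.5
for epoch in range(20000):
    for (x1, x2), t in data:
        h, y = forward(x1, x2)
        # Error signal at the output, using the sigmoid derivative y * (1 - y).
        d_out = (y - t) * y * (1 - y)
        # Backpropagate: each hidden unit's error signal is the output error
        # weighted by its connection, times its own sigmoid derivative.
        d_hid = [d_out * W2[i] * h[i] * (1 - h[i]) for i in range(2)]
        # Descend the error gradient for both layers.
        for i in range(2):
            W2[i] -= lr * d_out * h[i]
        W2[2] -= lr * d_out
        for i, dh in enumerate(d_hid):
            W1[i][0] -= lr * dh * x1
            W1[i][1] -= lr * dh * x2
            W1[i][2] -= lr * dh

err_after = total_error()
for (x1, x2), t in data:
    print(x1, "XOR", x2, "->", forward(x1, x2)[1])
```

After training, the total error is typically low enough that rounding the output recovers the XOR truth table, even though no programmer specified which hidden unit should detect which input pattern.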

Inquiry Project

This module was produced by the Inquiry project, which was begun at Washington University by a team led by Bill Bechtel and is now hosted at McDaniel College by Peter Bradley. The module is available on the project's website.