# Simple Neural Nets for Logical Functions

One of the first challenges for any computational architecture is to show that it can simulate the simple logical functions of 'And', 'Or' and the others. These functions are defined by 'truth tables': tables that relate possible inputs to outputs. Take, for example, the sentence:

• Folder A is blue and Folder B is blue.

When is this sentence true? The sentence is true if and only if both Folder A and Folder B are blue. The sentence is false otherwise. Those conditions can be shown in the following table:

| Folder A is blue | Folder B is blue | Truth Value of Sentence |
|---|---|---|
| True | True | True |
| False | True | False |
| True | False | False |
| False | False | False |

To simplify and generalize, we will make two changes. First, we will use '1' to symbolize 'True' and '0' to symbolize 'False'. Second, note that the logical relation shown in this truth table holds not just for sentences about blue folders, but for any sentence that is a conjunction, that is, any sentence of the form 'a AND b'. We can rewrite our truth table thus:

| a | b | a & b |
|---|---|---|
| 1 | 1 | 1 |
| 0 | 1 | 0 |
| 1 | 0 | 0 |
| 0 | 0 | 0 |

The challenge, then, is to create a neural network that will produce a '1' when the inputs are both '1', and a '0' otherwise. The following neural network does just that:

'And' Gate

The network produces an active node at the end if and only if both of the input nodes are active. Remember that if the output node turns red, it is producing an output of '1'. If it stays black, its output is '0'. The number on the output node is its threshold, not its level of activation.
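This behavior can be sketched in code. The following is a minimal, illustrative model of a threshold unit (the function names are ours, not part of any particular library): the unit fires, producing a 1, exactly when the weighted sum of its inputs meets or exceeds its threshold.

```python
# A minimal sketch of a threshold unit: it outputs 1 when the weighted
# sum of its inputs reaches the threshold, and 0 otherwise.
def threshold_unit(inputs, weights, threshold):
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# The 'And' gate: two connections of weight 1 feeding a node with
# threshold 2, so the unit fires only when both inputs are 1.
def and_gate(a, b):
    return threshold_unit([a, b], [1, 1], threshold=2)
```

Calling `and_gate(1, 1)` yields 1, while every other input pair yields 0, matching the truth table above.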

What about the sentence 'a OR b'? Its truth table is given here:

| a | b | a \| b |
|---|---|---|
| 1 | 1 | 1 |
| 0 | 1 | 1 |
| 1 | 0 | 1 |
| 0 | 0 | 0 |

And it can be simulated by the following neural network:

'Or' Gate

The network produces an active node at the end if at least one of the input nodes is active.
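Under the same threshold-unit sketch as before (names are illustrative), the 'Or' gate keeps both weights at 1 but lowers the threshold to 1, so a single active input suffices to make the output fire.

```python
# A sketch of the 'Or' gate: both connections carry weight 1, and the
# output threshold is 1, so one active input is enough to fire it.
def or_gate(a, b):
    total = 1 * a + 1 * b
    return 1 if total >= 1 else 0
```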

There are other logical relations of interest. For example, we might want a network that produces an output if and only if a majority of the input nodes are active. This network does exactly that:

'Majority' Gate

The network produces an active node at the end if and only if a majority of the input nodes are active.
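A majority gate can be sketched the same way (again, an illustrative model, not a prescribed implementation): every input connects with weight 1, and the unit fires only when the active inputs outnumber the inactive ones, i.e. when their sum exceeds half the number of inputs.

```python
# A sketch of the 'Majority' gate over any number of 0/1 inputs: the
# unit fires only when a strict majority of the inputs are active.
def majority_gate(inputs):
    return 1 if sum(inputs) > len(inputs) / 2 else 0
```

Note that on an even number of inputs a tie does not count as a majority, so `majority_gate([1, 1, 0, 0])` yields 0.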

Now let us try to build a network that will simulate the following truth table:

| a | b | a * b |
|---|---|---|
| 1 | 1 | 0 |
| 0 | 1 | 0 |
| 1 | 0 | 1 |
| 0 | 0 | 0 |

In order to build such a network, we would need the output node to turn off whenever b is active, yet turn on when a is active and b is not. With our current set of tools, this is impossible. We need to introduce a new kind of connection: an inhibitory connection. An inhibitory connection has a weight of -1. We represent inhibitory connections by removing the arrowhead and placing a circle on the end of the connection:

Inverse Conditional

In this network, the output node becomes active in only one case: when the top input node is active and the bottom one is not. All other possible combinations of input keep the output node inactive. And this is exactly the truth table that we sought to model. Use the following simulator to create and test your own simple neural networks:
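Before experimenting in the simulator, the inverse-conditional network can be sketched with the same illustrative threshold-unit model: an excitatory connection of weight 1 from the top input, an inhibitory connection of weight -1 from the bottom input, and an output threshold of 1.

```python
# A sketch of the inverse conditional: the top input excites the output
# (weight 1), the bottom input inhibits it (weight -1), and the output
# threshold is 1, so the unit fires only when a = 1 and b = 0.
def a_and_not_b(a, b):
    total = 1 * a + (-1) * b
    return 1 if total >= 1 else 0
```

When b is active, the inhibitory -1 cancels any excitation from a, so the sum can never reach the threshold; only the input pair (1, 0) produces an output of 1.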