Re: Minsky's Critique of the Perceptron

From: HARNAD Stevan (harnad@cogsci.soton.ac.uk)
Date: Thu Jun 06 1996 - 22:01:40 BST


> Date: Sun, 26 May 1996 14:46:40 +0100 (BST)
> From: "Tattan, Dave" <det195@soton.ac.uk>
>
> The idea of Perceptrons came about after efforts to explain the mind
> using computation were still going on.

Before, actually. It was Minsky's critique of Perceptrons that ushered
in the heyday of Artificial Intelligence, and with it, computationalism:
the mind as a symbol system.

> Perceptrons are a specific model
> of the Neural Net theory,

Perceptrons are a particular, simple kind of neural net.

> computers use symbol manipulation, while this
> new theory uses interconnecting units which pass activity between
> themselves. There are a number of inputs and one output,

There can be many inputs and many outputs; what's special about the
perceptron is that it only has the input and output "layers," hence two
layers in all. No "hidden" layers in between.
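
For the kid sib who wants to see it concretely, here is a minimal
sketch in Python (weights and threshold invented for illustration, not
taken from any chapter) of what "two layers, no hidden layer" amounts
to: the inputs feed straight into the output unit through a single set
of weighted connections.

    # A bare-bones perceptron: inputs connect directly to the output unit.
    # The weights and threshold below are illustrative, hand-picked values.
    def perceptron_output(inputs, weights, threshold):
        # Weighted sum of the inputs -- no hidden layer in between.
        activation = sum(w * x for w, x in zip(weights, inputs))
        # The output unit fires (1) or stays quiet (0) at the threshold.
        return 1 if activation > threshold else 0

    # Two inputs, one output, weights hand-picked so the unit computes AND.
    print(perceptron_output([1, 1], weights=[0.6, 0.6], threshold=1.0))  # 1
    print(perceptron_output([1, 0], weights=[0.6, 0.6], threshold=1.0))  # 0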

> closely
> replicating the manner in which the brain presents outputs. The form
> that the output takes has a direct correlation to the combinations of
> inputs and it was soon found that through Supervised Learning this node
> system could learn. This means that by the process of trial and error,
> a correct (incorrect) output strengthens (weakens) the perceptron
> connection. So if a response is the "right" one then the chances of
> this output being chosen again are relatively higher.

Correct, though, as you will see if you read the Best chapter carefully,
the perceptron learning rule is not the same as that of the prototypical
supervised net: Backprop.
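
To make the contrast concrete, here is a sketch (not the Best chapter's
code, just an illustration in Python) of the classical perceptron
learning rule: each weight is nudged up or down by the error on the
current trial, and nothing is propagated back through hidden layers,
because there are none.

    # Perceptron learning rule: weight change = rate * (target - output) * input.
    # Learning rate, epoch count and starting weights are arbitrary choices.
    def train_perceptron(examples, n_inputs, rate=0.1, epochs=20):
        weights = [0.0] * n_inputs
        threshold = 0.0
        for _ in range(epochs):
            for inputs, target in examples:
                activation = sum(w * x for w, x in zip(weights, inputs))
                output = 1 if activation > threshold else 0
                error = target - output  # +1, 0 or -1
                # Strengthen or weaken each connection according to the error.
                weights = [w + rate * error * x for w, x in zip(weights, inputs)]
                threshold -= rate * error
        return weights, threshold

    # It learns AND without any trouble:
    and_examples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
    print(train_perceptron(and_examples, n_inputs=2))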

> However Minsky examined this carefully and noted that the system has no
> difficulty in learning some outputs from certain inputs but others it
> constantly gets wrong. The problem is greatly simplified if it is
> stated that there are only two inputs, A and B. It will learn to choose
> the correct output if A is a certain number(n), if B is a certain
> number(n) or if this number(n) is A and B. In maths terms it can learn
> the "AND" Rule and the "INCLUSIVE OR" Rule. Now if the system was asked
> to come up with the correct answer if any one (as opposed to two) has
> the number(n) it seems to come to a standstill. It cannot learn the
> correct response when one input is different from the other. The
> mathematics student would call this the "EXCLUSIVE OR" Rule. This was
> Minsky's Critique of Perceptrons: the brain CAN do "EXCLUSIVE OR",
> therefore this is a flawed theory.

Well, kid sib would have a bit of trouble figuring out what you meant by
that "number(n)" business, and XOR, but I've already explained that a
couple of times in this archive (look them up!). The question here is:
Was Minsky's critique correct? Was it the last word on the subject?
(What about multilayered nets?)
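
As a hint toward the answer, here is a hand-wired sketch (weights
invented for the example, not learned) of why XOR defeats the two-layer
perceptron but not a net with one hidden layer: no single weighted sum
of A and B can separate the "same" cases from the "different" cases,
but two hidden units can detect OR and AND, and the output unit need
only fire when OR is on and AND is off.

    # XOR with one hidden layer. All weights and thresholds are hand-picked
    # for illustration; a learning rule such as backprop could find similar ones.
    def unit(inputs, weights, threshold):
        return 1 if sum(w * x for w, x in zip(weights, inputs)) > threshold else 0

    def xor_net(a, b):
        h_or  = unit([a, b], [1.0, 1.0], 0.5)          # hidden unit: inclusive OR
        h_and = unit([a, b], [1.0, 1.0], 1.5)          # hidden unit: AND
        return unit([h_or, h_and], [1.0, -1.0], 0.5)   # OR but not AND = XOR

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, xor_net(a, b))                 # 0, 1, 1, 0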




This archive was generated by hypermail 2b30 : Tue Feb 13 2001 - 16:23:45 GMT