A simple mathematical model for a neuron, invented by Rosenblatt in the late 1950s.

A perceptron is a function f: **R**^{n} -> **R** defined by two functions g: **R**^{n} -> **R** and t: **R** -> **R** via f(x) = t(g(x)), where **R** is the set of real numbers.

g is called the summation function and t is called the activation/transfer function.

Usually g(x) is the inner product of x with a weight vector w minus a real threshold value theta: g(x) = <x, w> - theta.

t can be any monotonic function, but boundedness is usually demanded. Very often the sign function is used.
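The definitions above can be sketched in a few lines of Python. This is a minimal illustration, not a library implementation; the names `perceptron` and `sign`, the weights, and the threshold are chosen here for the example.

```python
def sign(s):
    """Activation/transfer function t: maps a real number to +1 or -1."""
    return 1.0 if s >= 0 else -1.0

def perceptron(x, w, theta):
    """f(x) = t(g(x)) with summation function g(x) = <x, w> - theta."""
    g = sum(xi * wi for xi, wi in zip(x, w)) - theta  # inner product minus threshold
    return sign(g)

# Example: logical AND on {0, 1}^2 with w = (1, 1) and theta = 1.5.
print(perceptron((1, 1), (1, 1), 1.5))  # +1.0 (both inputs on)
print(perceptron((0, 1), (1, 1), 1.5))  # -1.0 (only one input on)
```

With these weights, g(x) is positive only when both inputs are 1, so the sign of g realizes the AND function.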

Perceptrons were believed to be very powerful in the 1960s, until Minsky and Papert proved in their 1969 book "Perceptrons" that a single perceptron (with the sign function) can't solve the XOR problem.
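The XOR limitation can be checked numerically. The brute-force scan below (illustrative, not a proof; the grid bounds are an assumption of this sketch) tries many weight/threshold combinations and finds none that reproduces XOR on all four inputs, since XOR is not linearly separable.

```python
def sign(s):
    """Sign activation: +1 for s >= 0, else -1."""
    return 1 if s >= 0 else -1

# XOR truth table with +1/-1 labels: output +1 iff exactly one input is 1.
xor_cases = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), -1)]

def solves_xor(w1, w2, theta):
    """True if sign(<x, w> - theta) matches XOR on all four inputs."""
    return all(sign(w1 * x1 + w2 * x2 - theta) == y
               for (x1, x2), y in xor_cases)

grid = [i / 4 for i in range(-12, 13)]  # values -3.0, -2.75, ..., 3.0
found = any(solves_xor(w1, w2, t)
            for w1 in grid for w2 in grid for t in grid)
print(found)  # False: no weights/threshold on this grid solve XOR
```

No grid, however fine, would change the outcome: the four XOR points cannot be separated by any single line in the plane.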

Note that authors often use the above example for g and the sign function for t, and regard functions other than the sign function for t as non-perceptrons.