Neural Networks
- Neural Networks (NNs), also known as Artificial Neural Networks (ANNs),
  Connectionist Models, and Parallel Distributed Processing (PDP) Models
- "'Artificial Neural Networks' are massively parallel interconnected
networks of simple (usually adaptive) elements and their hierarchical
organizations which are intended to interact with the objects of the
real world in the same way as biological nervous systems do." -- T. Kohonen
- Fine-grained, parallel, distributed computing model characterized by
- A large number of very simple, neuron-like processing elements
called units, PEs, or nodes
- A large number of weighted, directed connections between pairs of units
- Weights may be positive or negative real values
- Local processing in that each unit computes a function based on
the outputs of a limited number of other units in the network
- Each unit computes a simple function of its input values,
which are the weighted outputs from other units. If there are
      n inputs to a unit, then the unit's output, or activation,
      is defined by
a = g((w1 * x1) + (w2 * x2) + ... + (wn * xn))
Thus each unit computes a (simple) function
g of the linear combination of its inputs.
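The unit computation above can be sketched in a few lines of Python. This is a minimal illustration, not part of the original notes; the choice of the sigmoid as the activation function g, and the names `sigmoid` and `unit_output`, are assumptions for the example (any simple function g would do).

```python
import math

def sigmoid(z):
    # One common choice for the activation function g (an assumption
    # here; the notes leave g unspecified)
    return 1.0 / (1.0 + math.exp(-z))

def unit_output(weights, inputs, g=sigmoid):
    # a = g((w1 * x1) + (w2 * x2) + ... + (wn * xn))
    z = sum(w * x for w, x in zip(weights, inputs))
    return g(z)

# A unit with two weighted inputs; weights may be positive or negative
a = unit_output([0.5, -0.25], [1.0, 2.0])
# z = 0.5*1.0 + (-0.25)*2.0 = 0.0, so a = sigmoid(0) = 0.5
```

Each unit only needs its own weights and the outputs of the units feeding into it, which is the "local processing" property noted above.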