DIMACS TR: 95-35
Constructive Training Methods for Feedforward Neural Networks
with Binary Weights
Authors: Eddy Mayoraz, Frederic Aviolat
ABSTRACT
Quantization of the parameters of a Perceptron is a central problem in the
hardware implementation of neural networks using digital technology. A
neural model in which each weight is limited to a small integer range requires
little silicon area. Moreover, according to Ockham's razor principle, better
generalization abilities can be expected from a simpler computational model.
The price to pay for these benefits is the difficulty of training such
networks. This paper proposes two new ideas for constructive training
algorithms and demonstrates their efficiency for the generation of feedforward
networks composed of Boolean threshold gates with discrete weights. A proof of
the convergence of these algorithms is given. Numerical experiments have been
carried out, and the results are presented in terms of the size of the
generated networks and their generalization abilities.
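The building block named in the abstract, a Boolean threshold gate with small-integer weights, can be sketched as follows; the weight values in {-1, +1} and the integer threshold used here are illustrative assumptions, not the paper's training algorithm.

```python
# Minimal sketch of a Boolean threshold gate with binary weights.
# Weights in {-1, +1} and an integer threshold are assumed for
# illustration; the gate fires when the weighted sum of its Boolean
# inputs reaches the threshold.

def threshold_gate(weights, threshold, inputs):
    """Return 1 if the weighted sum of inputs reaches the threshold, else 0."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= threshold else 0

# Example: a 2-input AND realized with weights (+1, +1) and threshold 2.
print(threshold_gate([1, 1], 2, [1, 1]))  # -> 1
print(threshold_gate([1, 1], 2, [1, 0]))  # -> 0
```

Constructive algorithms of the kind the paper describes build a feedforward network by adding such gates one at a time until the training set is handled.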
Paper available at:
ftp://dimacs.rutgers.edu/pub/dimacs/TechnicalReports/TechReports/1995/95-35.ps.gz