URL: http://www.cs.hmc.edu/~keller/courses/cs152/f97
Neural Networks
Trailer
Can a computer be taught to read words aloud,
recognize faces, perform a medical diagnosis,
drive a car, play a game, balance a pole, or predict physical phenomena?
The answer to all of these is yes. These applications, among others,
have been demonstrated using
varieties of the computational model known as "neural networks," the subject of this
course.
The course will develop the theory of a number of neural network models.
Participants will exercise the theory through both predeveloped computer
programs and ones of their own design.
Catalog Description
Modeling, simulation, and analysis of artificial neural networks.
Relationship to biological neural networks. Design and optimization of
discrete and continuous neural networks. Backpropagation and other gradient
descent methods. Hopfield and Boltzmann networks. Unsupervised learning.
Self-organizing feature maps. Applications chosen from function approximation,
signal processing, control, computer graphics, pattern recognition, and time-series
analysis. Relationship to fuzzy logic, genetic algorithms, and artificial
life.
Prerequisites: CS 60 and Mathematics 73 and 82, or permission of
the instructor. 3 credit hours.
Instructor
 Robert Keller
242 Olin (4-5 p.m. MTuW or by appt.), keller@turing, x18483
Texts (listed at the end of this page)
Course Requirements
There will be some homework and programming
assignments, but no exams. These assignments will constitute about
50% of the grade. The other 50% of the grade is from a substantial
final project involving either a working neural network application or
a research paper. The grade on the project will be determined by its
comprehensiveness and the degree to which you explore competing
approaches. The projects will be presented orally.
Optional oral presentations on textbook material
can also be made during the term. These
can act to cushion your grade. They are very much encouraged, as
they really help you learn the material at a higher level than you
would otherwise.
Please see me if you are interested in
making a presentation.
CS 152 Topic Outline
 Week 1 (read MMR chapter 1)
Introduction
 Contexts for and Motivation of Neural Networks:
Artificial Intelligence 
Biological 
Physics
 Artificial Neural Network overview
 Week 2 (read MMR chapter 2)
Supervised Learning: Single-Layer Networks
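As a preview of the single-layer material, the perceptron learning rule can be sketched in a few lines of Python. This is a minimal illustration, not code from the course; the AND-gate data set, learning rate, and epoch count are arbitrary choices.

```python
# Perceptron learning rule: on each misclassified sample, nudge the
# weights toward (or away from) the input, and adjust the bias likewise.

def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of (inputs, target) pairs with target in {0, 1}."""
    n = len(samples[0][0])
    w = [0.0] * n          # weight vector
    b = 0.0                # bias
    for _ in range(epochs):
        for x, t in samples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = t - y    # +1, 0, or -1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Learn the linearly separable AND function.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
predict = lambda x: 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop finds a separating weight vector; the same rule fails on XOR, which motivates the multi-layer networks of Weeks 3-6.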
 Weeks 3-6 (read MMR chapters 3 and 4)
Supervised Learning: Multi-Layer Networks
 Multi-Layer Perceptrons (MLPs)
 Backpropagation
 Conjugate Gradient method
 Levenberg-Marquardt (LM) method
 Madalines
 Radial-Basis Networks
 Cascade-Correlation Networks
 Polynomial Networks
 Recurrent Networks
 Time series
 Backpropagation through time
 Finite Impulse Response (FIR) MLP
 Temporal Differences method (TD)
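The core of backpropagation, the workhorse training method for MLPs, can be sketched as follows. This is an illustrative 2-2-1 sigmoid network trained by plain gradient descent on XOR, not an implementation prescribed by the course; the learning rate, seed, and epoch count are arbitrary.

```python
import math
import random

random.seed(0)
sig = lambda z: 1.0 / (1.0 + math.exp(-z))

# XOR: the classic problem a single-layer perceptron cannot solve.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
T = [0, 1, 1, 0]

# Weights: W1[j][i] connects input i to hidden unit j; W2[j] connects
# hidden unit j to the single output.
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [random.uniform(-1, 1) for _ in range(2)]
W2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = random.uniform(-1, 1)

def forward(x):
    h = [sig(sum(W1[j][i] * x[i] for i in range(2)) + b1[j]) for j in range(2)]
    y = sig(sum(W2[j] * h[j] for j in range(2)) + b2)
    return h, y

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in zip(X, T)) / len(X)

lr = 0.5
loss_before = mse()
for _ in range(5000):
    for x, t in zip(X, T):
        h, y = forward(x)
        # Deltas: gradient of 0.5*(y - t)^2 through the sigmoid derivative.
        dy = (y - t) * y * (1 - y)
        dh = [dy * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
        for j in range(2):
            W2[j] -= lr * dy * h[j]
            for i in range(2):
                W1[j][i] -= lr * dh[j] * x[i]
            b1[j] -= lr * dh[j]
        b2 -= lr * dy
loss_after = mse()
```

The hidden-layer deltas are the chain rule at work: each hidden unit receives the output error scaled by its outgoing weight and its own sigmoid slope, which is exactly the "propagating the error backward" step the method is named for.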
 Weeks 7-8 (read MMR chapter 5)
Unsupervised Learning
 Simple Competitive Networks: Winner-take-all, Hamming network
 Learning Vector Quantization (LVQ)
 Counterpropagation Networks (CPN)
 Adaptive Resonance Theory (ART)
 Kohonen Self-Organizing Maps (SOMs)
 Principal Component Analysis networks (PCA)
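Winner-take-all competitive learning, the idea underlying several of the models above, amounts to moving only the closest prototype toward each input. A minimal one-dimensional sketch with made-up data in two clusters (all numbers here are illustrative, not from the course):

```python
import random

random.seed(1)

# Synthetic data: two clusters centered near 0.0 and 1.0.
data = [random.gauss(0.0, 0.1) for _ in range(50)] + \
       [random.gauss(1.0, 0.1) for _ in range(50)]

protos = [0.4, 0.6]   # two competing prototype "weight vectors"
lr = 0.1
for _ in range(20):
    random.shuffle(data)
    for x in data:
        # Competition: only the nearest prototype (the winner) learns.
        w = min(range(len(protos)), key=lambda k: abs(x - protos[k]))
        protos[w] += lr * (x - protos[w])   # move winner toward the input
```

After training, each prototype has drifted toward the mean of the cluster it wins, which is the unsupervised clustering behavior that LVQ, CPN, and SOMs build on (a SOM additionally updates the winner's neighbors).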
 Weeks 9-10 (read MMR chapter 6)
Associative Models
 Linear Associative Memory (LAM)
 Hopfield Networks
 Brain-State-in-a-Box (BSB)
 Boltzmann Machines and Simulated Annealing
 Bidirectional Associative Memory (BAM)
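The associative-recall idea behind Hopfield networks can be sketched directly: store a pattern with a Hebbian outer product, corrupt a copy of it, and let asynchronous threshold updates settle back to the stored memory. The eight-bit pattern below is an arbitrary illustration, not an example from the course.

```python
# Hopfield network storing a single bipolar pattern.

pattern = [1, -1, 1, -1, 1, -1, 1, -1]
n = len(pattern)

# Hebbian outer-product weights with a zero diagonal (no self-connections).
W = [[0 if i == j else pattern[i] * pattern[j] for j in range(n)]
     for i in range(n)]

# Corrupt the memory by flipping two bits.
state = pattern[:]
state[0] = -state[0]
state[3] = -state[3]

# Asynchronous updates: each unit takes the sign of its weighted input.
for _ in range(5):                     # a few full sweeps suffice here
    for i in range(n):
        s = sum(W[i][j] * state[j] for j in range(n))
        state[i] = 1 if s >= 0 else -1
```

Each update can only lower the network's energy, so the state settles into a fixed point; since fewer than half the bits were flipped, that fixed point is the stored pattern rather than its complement.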
 Week 11 (read MMR chapter 7)
Optimization Problems
 Neural Network Approaches
 Evolutionary Programming
 Week 12
Fuzzy logic and its connection to NNs

Martin T. Hagan,
Howard B. Demuth,
and Mark Beale,
Neural Network Design,
PWS Publishing Company, Boston, 1996, ISBN 0534943322.
This book was used in the 1996 offering of the course. It has excellent
tutorial content and is strongly keyed to the use of MATLAB. Matrix
formulations are used throughout.

Simon Haykin,
Neural networks: a comprehensive foundation,
Macmillan, 1994.
This book was used in the 1995 offering of the course.
It has a strong mathematical and signalprocessing orientation.
 Mohamad H. Hassoun,
Fundamentals of artificial neural networks,
MIT Press, 1995.
This is another fairly thorough introduction.
 James A. Anderson,
An introduction to neural networks,
MIT Press, 1995.
This is a more gentle introduction to the topic, by one of the pioneers in the field.

Irwin B. Levitan and Leonard K. Kaczmarek,
The Neuron,
Oxford University Press, 1991.
This book focuses on the biology and physics of neurons, for those who wish to
know more about that aspect; it will not be emphasized in the course.

Marvin L. Minsky and Seymour Papert,
Perceptrons (expanded edition),
MIT Press, 1988.
The historical importance of this book will be discussed in the course.

Richard O. Duda and Peter E. Hart,
Pattern classification and scene analysis,
Wiley, 1972.
This book gives a broad look at pattern classification problems, but is not
on neural nets as such.

Teuvo Kohonen,
Self-organizing maps,
SpringerVerlag, 1995.
This is a comprehensive reference by the originator of this concept.

Bart Kosko,
Neural networks and fuzzy systems : a dynamical systems approach to machine intelligence,
Prentice Hall, 1992.
This book compares fuzzy and neural approaches to control problems.

Zbigniew Michalewicz,
Genetic Algorithms + Data Structures = Evolution Programs,
Third Edition,
Springer Verlag, 1996.
This book describes the evolutionary approach, which in some cases can achieve
results similar to neural approaches.

John R. Koza,
Genetic Programming,
MIT Press, 1994.
This book focuses on the evolutionary approach to producing programs.

Christopher G. Langton (ed.),
Artificial Life, an Overview,
AddisonWesley, 1989.
This is an early collection of articles on the topic.
