URL http://www.cs.hmc.edu/courses/2003/fall/cs152/index.html

Harvey Mudd College Fall 2003

Computer Science 152

Neural Networks

Trailer

Can a computer read words aloud, recognize faces, perform a medical diagnosis, drive a car, play a game, balance a pole, predict physical phenomena? Can it be taught to do these things?

The answer to all of these is yes. All of these applications, and others, have been demonstrated using varieties of the computational model known as "neural networks", the subject of this course.

The course will develop the theory of a number of neural network models. Participants will exercise the theory through both pre-developed computer programs and ones of their own design.

Catalog Description

Modeling, simulation, and analysis of artificial neural networks. Relationship to biological neural networks. Design and optimization of discrete and continuous neural networks. Backpropagation, and other gradient descent methods. Hopfield and Boltzmann networks. Unsupervised learning. Self-organizing feature maps. Applications chosen from function approximation, signal processing, control, computer graphics, pattern recognition, time-series analysis. Relationship to fuzzy logic, genetic algorithms, and artificial life.

Prerequisites: CS 60 (Principles of Computer Science) and Mathematics 12 (Calculus and Linear Algebra), or permission of the instructor. 3 credit hours.

Instructor

Texts

Course Requirements

There will be some homework and programming assignments, but probably no exams. The assignments will constitute about 40% of your grade. Another 40% comes from a substantial final project involving either the creation of a working neural network application or a research paper. The project grade will be determined by its comprehensiveness and by the degree to which you explored competing approaches. The projects will be presented orally. 10% of your grade will be based on a preliminary presentation you make, ideally on material related to your project. Finally, 10% will be based on general participation, which includes attendance.

CS 152 Topic Outline

This outline gives the sequence of topics as they relate to NND. However, I am going to try to compress things a bit to allow space for a few more topics at the end. Also, some topics listed are not covered in NND. So all of this is subject to change as we move forward.


First Assignment, Due Tues. 9/9/03

For the first two problems, please work the indicated exercises in NND. Turn in your solutions neatly printed, written, or, preferably, typed. The third problem is a simple programming problem, but start early anyway.

  1. NND: E4.1

  2. NND: E4.5

  3. Create a running prototype program demonstrating the perceptron learning algorithm, using any language you choose. The idea is to learn from using, as well as constructing, the program.

    Your program should:

    • Accept a set of data points, each point consisting of
      • an input vector
      • an associated scalar desired response value.

    • Accept a learning rate LR.

    • Accept a limit on the number of iterations, in case the data points are not linearly separable, or in case convergence takes a long time.

    • Output a set of weights for the perceptron, if such a set can be found within the limit. Otherwise, output that the limit has been exceeded.

    • For each data point, show the input vector, the desired response, the perceptron's actual response, and the error value (desired-actual).

    • Optionally output the items in the previous two bullets before each iteration of the algorithm. (This can be helpful for debugging as well as for understanding the algorithm.)

    To keep things simple, "accept" above does not require external file input for the data; you may instead code each example case directly in the program. Also, start with all weights initialized at 0.

    Show your program running on each of the following test cases.

    1. limit = 10, LR = 1
      Input Vector    Output Value
      (0 0)           0
      (0 1)           1
      (1 0)           1
      (1 1)           1

    2. limit = 50, LR = 0.5
      Input Vector    Output Value
      (-1 1)          1
      (0 0)           1
      (1 -1)          1
      (1 0)           0
      (0 1)           0

    3. limit = 50, LR = 0.5
      Input Vector    Output Value
      (0 0)           0
      (0 1)           1
      (1 0)           1
      (1 1)           0

    4. limit = 200, LR = 0.1
      Input Vector                                      Output Value
      (0.2  0.1  0.1  0.1  0.2  0.1  0.2  0.1  0.1)    1
      (0.2  0.1  0.1  0.1  0.2  0.1  0.3  0.1  0.1)    1
      (0.5  0.1  0.1  0.1  0.2  0.1  0.2  0.1  0.1)    1
      (0.5  0.4  0.6  0.8  0.4  0.1  0.8  1.0  0.1)    0
      (0.5  0.3  0.3  0.1  0.2  0.1  0.2  0.1  0.1)    1
      (0.2  0.3  0.1  0.1  0.3  0.1  0.1  0.1  0.1)    1
      (0.3  0.5  0.7  0.8  0.8  0.9  0.7  1.0  0.7)    0
      (1.0  0.5  0.6  1.0  0.6  1.0  0.7  0.7  1.0)    0
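    To illustrate the shape of a solution, here is a minimal sketch of the trainer in Python. It assumes a hard-threshold unit (output 1 when the weighted sum plus bias is at least 0, else 0) and the standard perceptron update rule; the function name and the choice to fold the bias in as a separate weight are this sketch's own conventions, not requirements of the assignment.

```python
def perceptron_train(data, lr, limit):
    """data: list of (input_vector, desired_output) pairs.
    Returns (weights, bias, converged) after at most `limit` passes."""
    n = len(data[0][0])
    w = [0.0] * n          # all weights start at 0, as the assignment asks
    b = 0.0                # bias weight, also initialized to 0
    for _ in range(limit):
        any_error = False
        for x, desired in data:
            s = sum(wi * xi for wi, xi in zip(w, x)) + b
            actual = 1 if s >= 0 else 0   # hard-threshold response
            error = desired - actual       # the (desired - actual) error value
            if error != 0:
                any_error = True
                for i in range(n):         # perceptron rule: w <- w + LR*error*x
                    w[i] += lr * error * x[i]
                b += lr * error
        if not any_error:                  # a full pass with no errors: done
            return w, b, True
    return w, b, False                     # iteration limit exceeded

# Test case 1 above: limit = 10, LR = 1 (linearly separable, so it converges)
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b, converged = perceptron_train(data, lr=1.0, limit=10)
print(w, b, converged)
```

    A full submission would also print, per data point, the input vector, desired response, actual response, and error, as the bullets above require; that reporting is omitted here for brevity. Note that test case 3 (XOR) is not linearly separable, so the limit should be exceeded there.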