We used freely available code written by Salakhutdinov and Hinton, two of the leaders in deep belief networks, for their own research with Deep Boltzmann Machines. The code is available at http://www.cs.toronto.edu/~hinton/MatlabForSciencePaper.html. When running the code, however, we found that it would not run to completion, even on the exact data used by Salakhutdinov: many of the matrices did not have the correct dimensions. Furthermore, we needed to alter the code to work with our pattern dataset. We therefore spent considerable time debugging the existing code and changing variables to reflect the data we were using.
We also made the code more user-friendly. In his paper, Salakhutdinov runs the digit data (MNIST) on both a two-layer and a three-layer network, but in the code this switch is not easy to make. We altered the code so that the user can specify the number of layers in the network. The code can be run on our pattern data using the command patternclassify(threeLayers), where threeLayers is true for a three-layer network and false for a two-layer network.
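As an illustration, the layer-selection logic inside patternclassify might look like the following sketch. The variable names and hidden-layer sizes here are placeholders chosen for the example, not the actual values in the modified code:

```matlab
function patternclassify(threeLayers)
% Sketch of how the threeLayers flag could select the network depth.
% Hidden-layer sizes below are assumed values, not those from our runs.
if threeLayers
    hiddenSizes = [500 500 2000];   % three hidden layers
else
    hiddenSizes = [500 2000];       % two hidden layers
end
fprintf('Training a network with %d hidden layers\n', numel(hiddenSizes));
% Pretraining and backpropagation would then proceed using hiddenSizes.
end
```

Under this interface, patternclassify(true) trains the three-layer network and patternclassify(false) the two-layer one, with no further editing of the source required.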
There are some areas that could still use revision. We will continue to add comments to make the code clearer to an uninformed user. Also, the matrices in backpropclassify.m may not be handled correctly when the number of epochs exceeds 6, which we infer from the drastic change in misclassification error between epochs 5 and 6.