Monday 3 December 2007

ANN Backpropagation Solved


As I had previously posted, I was having trouble with my bipolar neural network. A bipolar neural network deals with inputs and outputs between -1 and 1; this allows for faster training, as negative values can be passed through the network.
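
For concreteness, the usual bipolar activation is the hyperbolic tangent, which squashes a weighted sum into (-1, 1) rather than the binary sigmoid's (0, 1). The post doesn't show the actual activation function used, so tanh here is an assumption; a minimal sketch in Python:

import math

def bipolar_activation(x):
    # Bipolar sigmoid (tanh): squashes the weighted sum into (-1, 1),
    # so negative values can flow through the network.
    return math.tanh(x)

def bipolar_derivative(y):
    # Derivative written in terms of the activation value y = tanh(x),
    # which is what backpropagation multiplies the error signal by.
    return 1.0 - y * y

print(bipolar_activation(2.0))   # ~0.964
print(bipolar_activation(-2.0))  # ~-0.964
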
During backpropagation the network's weights are adjusted until the output matches the desired output; however, while this is happening a neuron's output can sometimes become 0. When that output is then used to calculate other values, such as the error term for a weight update, those values also become 0, since we are multiplying by an output of 0. Hence everything eventually gets set to 0 and no more learning occurs.
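
To see why, here is a minimal sketch of the standard weight update, dw = eta * delta * output (my notation and values, not the post's actual code): because the change is proportional to the upstream output, an output of 0 freezes that weight.

eta = 0.5      # learning rate (assumed value)
delta = 0.8    # error signal arriving at this weight (assumed value)

for output in (0.7, 0.0):
    dw = eta * delta * output
    print(f"upstream output = {output}: weight change = {dw}")
# upstream output = 0.7: weight change = 0.28
# upstream output = 0.0: weight change = 0.0  <- this weight stops learning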

This problem has been solved using an input bias, which can be seen in the above diagram. Each layer in the network has its own bias, and the bias input is added to the weighted sum of the rest of the inputs; this makes sure that the output will not become 0.
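
A minimal sketch of how such a bias might be wired in, assuming the common trick of an extra input permanently clamped to 1 with its own trainable weight (the function name and the numbers are hypothetical, not the post's code):

import math

def neuron_output(inputs, weights, bias_weight):
    # The bias term is added to the weighted sum of the real inputs,
    # so the sum (and hence the activation) need not sit at 0 even
    # when every real input is 0.
    total = sum(i * w for i, w in zip(inputs, weights)) + 1.0 * bias_weight
    return math.tanh(total)

print(neuron_output([0.0, 0.0], [0.4, -0.2], bias_weight=0.3))  # non-zero
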
So after "jigging" (yes, that is a technical term) about with the code, I got it to work with bipolar input, and the result is a faster-training network.
Now all I need to do is turn all the aspects of the code into proper objects so that I can more easily build a multilayered network with "n" neurons.
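
A rough, hypothetical sketch of what those objects might look like; the class names and layout here are my own, not the post's actual code:

import math
import random

class Neuron:
    # One bipolar neuron with its own trainable bias weight.
    def __init__(self, n_inputs):
        self.weights = [random.uniform(-1, 1) for _ in range(n_inputs)]
        self.bias = random.uniform(-1, 1)

    def fire(self, inputs):
        total = sum(i * w for i, w in zip(inputs, self.weights)) + self.bias
        return math.tanh(total)

class Layer:
    # A layer of n neurons; layers can be chained into a multilayer net.
    def __init__(self, n_neurons, n_inputs):
        self.neurons = [Neuron(n_inputs) for _ in range(n_neurons)]

    def fire(self, inputs):
        return [n.fire(inputs) for n in self.neurons]

hidden = Layer(n_neurons=3, n_inputs=2)
output = Layer(n_neurons=1, n_inputs=3)
print(output.fire(hidden.fire([0.5, -0.5])))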
