Memtropy: Signals from the Noise

November 18, 2008

Random Learning Rates in Perceptrons

Filed under: Neural Networks — Tomek @ 1:55 am

Using a random learning factor instead of a fixed one for the weight updates of the connections in a neural network, such as a simple perceptron, greatly improves the achievable precision.

Python's random() function returns a pseudo-random float x with 0 <= x < 1, which averages 0.5. Taking one tenth of that value gives a learning factor that fluctuates around the intended learning rate of 0.05. Down to a precision of 0.05 it is, on average, as fast as a constant learning rate of 0.05; beyond that, the accuracy can in theory be made arbitrarily high. Here a precision of five significant figures was chosen.
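In code, that factor is a one-liner (assuming the standard library's random module):

    import random

    eta = random.random() / 10  # fluctuates in [0, 0.1), averaging 0.05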

See below for the source of a simple single-layer feed-forward backpropagation neural network that finds the function y = 5x + 1, written in Python.
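A minimal sketch, assuming a single linear unit with weight w and bias b trained by the delta rule on a grid of x values (the seed and the 1e-5 tolerance, standing in for five significant figures, are illustrative choices):

    import random

    random.seed(0)                            # illustrative seed for reproducibility
    w, b = random.random(), random.random()   # random initial weight and bias

    epoch = 0
    while True:
        max_err = 0.0
        for x in [i / 10.0 for i in range(-10, 11)]:  # training inputs in [-1, 1]
            y = w * x + b                  # forward pass of the single neuron
            err = (5 * x + 1) - y          # target minus output
            eta = random.random() / 10     # random learning factor, mean 0.05
            w += eta * err * x             # delta-rule weight update
            b += eta * err                 # delta-rule bias update
            max_err = max(max_err, abs(err))
        epoch += 1
        if max_err < 1e-5:                 # stop once every sample fits to 1e-5
            break

    print(epoch, w, b)

When the loop exits, w and b agree with 5 and 1 to roughly five figures, and the printed epoch count gives a rough measure of how quickly the random factor converges.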

