Memtropy: Signals from the Noise

November 18, 2008

Random Learning Rates in Perceptrons

Filed under: Neural Networks — Tomek @ 1:55 am

Using a random learning rate instead of a fixed one to update the connection weights of a neural network, such as a simple perceptron, greatly improves the achievable precision.

Python's random.random() returns a pseudo-random float x with 0 <= x < 1, whose mean is 0.5. Dividing by ten gives a learning rate that fluctuates uniformly in [0, 0.1) around the intended value of 0.05. Down to a precision of about 0.05 this is, on average, as fast as a constant learning rate of 0.05; beyond that, where a fixed rate would stall, the accuracy can in theory be made arbitrarily high. Here a precision of five significant figures was chosen.
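
To see this concretely, here is a minimal check (my addition, not from the original post) that the mean of random.random()/10 is indeed close to 0.05:

import random

# Draw many random learning rates and confirm their empirical mean
# is near the intended rate of 0.05.
samples = [random.random() / 10 for _ in range(100000)]
print(sum(samples) / len(samples))  # typically prints ~0.0500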

See below for the source code and output of a simple single-layer feed-forward network, trained with the perceptron learning rule (with a single layer there is no backpropagation proper), that learns the separating line y=5x+1, done in Python.

import random

def learn():
    # Training points straddling the line y = 5x + 1: the first two lie
    # just below it (class 0), the last two lie on it (class 1).
    inputs = [[0, 0.99999], [1, 5.99999], [0, 1], [1, 6]]
    targets = [0, 0, 1, 1]
    bias = 1
    epsilon = 1e-6
    # Keep the last ten weight values and work with their running average.
    w1 = [random.random() for _ in range(10)]
    w2 = [random.random() for _ in range(10)]
    iterations = 0
    errorAbsSum = 1
    while errorAbsSum > epsilon:
        iterations += 1
        errorAbsSum = 0
        tmp1 = sum(w1) / len(w1)
        tmp2 = sum(w2) / len(w2)
        for i, x in enumerate(inputs):
            weighted = x[0] * tmp1 + x[1] * tmp2 - bias
            activation = sgn(weighted)
            error = targets[i] - activation
            # The random learning rate: uniform on [0, 0.1), mean 0.05.
            learningRate = random.random() / 10
            tmp1 += error * learningRate * x[0]
            tmp2 += error * learningRate * x[1]
            errorAbsSum += abs(error)
        # Replace the oldest stored weight with the freshly updated one.
        del w1[0], w2[0]
        w1.append(tmp1)
        w2.append(tmp2)
    return 'Iterations: %d\nw1=%f w2=%f' % (iterations, tmp1, tmp2)

def sgn(value):
    # Step activation: 1 if the weighted sum is positive, else 0.
    return 1 if value > 0 else 0

if __name__ == '__main__':
    print(learn())

Output:
Iterations: 10254084
w1=-5.000043 w2=1.000008
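
As a sanity check (my addition), the learned weights describe the separating line w1*x + w2*y - bias = 0, which with the values above is essentially y = 5x + 1:

# Classify a few test points against the learned boundary y = 5x + 1,
# using the weight values reported in the output above.
w1, w2, bias = -5.000043, 1.000008, 1

def classify(x, y):
    # Returns 1 for points above the line y = 5x + 1, 0 otherwise.
    return 1 if w1 * x + w2 * y - bias > 0 else 0

print(classify(0, 1.1))   # above the line -> 1
print(classify(1, 5.9))   # below the line -> 0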
