
python - Debugging a backpropagation algorithm


I am trying to implement the backpropagation algorithm in Python using numpy. I have been using this site to implement the matrix form of backpropagation. When testing this code on XOR, my network does not converge, even after many runs of thousands of iterations. I think there is some kind of logic error. I would be grateful if anyone were willing to look it over. Fully runnable code can be found on github.

import numpy as np

def backpropagate(network, tests, iterations=50):

    #convert tests into numpy matrices
    tests = [(np.matrix(inputs, dtype=np.float64).reshape(len(inputs), 1),
              np.matrix(expected, dtype=np.float64).reshape(len(expected), 1))
             for inputs, expected in tests]

    for _ in range(iterations):

        #accumulate the weight and bias deltas
        weight_delta = [np.zeros(matrix.shape) for matrix in network.weights]
        bias_delta = [np.zeros(matrix.shape) for matrix in network.bias]

        #iterate over the tests
        for potentials, expected in tests:

            #input the potentials into the network
            #calling the network with trace == True returns a list of matrices,
            #representing the potentials of each layer
            trace = network(potentials, trace=True)
            errors = [expected - trace[-1]]

            #iterate over the layers backwards
            for weight_matrix, layer in reversed(list(zip(network.weights, trace))):
                #compute the error vector for a layer
                errors.append(np.multiply(weight_matrix.transpose()*errors[-1],
                                          network.sigmoid.derivative(layer)))

            #remove the input layer
            errors.pop()
            errors.reverse()

            #compute the deltas for bias and weight
            for index, error in enumerate(errors):
                bias_delta[index] += error
                weight_delta[index] += error * trace[index].transpose()

        #apply the deltas
        for index, delta in enumerate(weight_delta):
            network.weights[index] += delta
        for index, delta in enumerate(bias_delta):
            network.bias[index] += delta
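
For reference, here is a minimal sketch of how I drive this routine on XOR. The Network constructor and layer sizes below are placeholders, not part of the code above; the actual class is in the linked github code, and only backpropagate and the call signature shown later are taken from it.

#hypothetical usage sketch -- Network(layers=[2, 2, 1]) is an assumed constructor
xor_tests = [([0, 0], [0]),
             ([0, 1], [1]),
             ([1, 0], [1]),
             ([1, 1], [0])]

network = Network(layers=[2, 2, 1])
backpropagate(network, xor_tests, iterations=5000)

#the last matrix in the trace is the network's output
for inputs, expected in xor_tests:
    print(inputs, network(inputs, trace=True)[-1], expected)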

In addition, here is the code that computes the output, along with my sigmoid function. The error is unlikely to be in here; I was able to train a network to model XOR using simulated annealing.

# the call function of the neural network
def __call__(self, potentials, trace=True):

    #ensure the input is properly formatted
    potentials = np.matrix(potentials, dtype=np.float64).reshape(len(potentials), 1)

    #accumulate the trace
    trace = [potentials]

    #iterate over the weights
    for index, weight_matrix in enumerate(self.weights):
        potentials = weight_matrix * potentials + self.bias[index]
        potentials = self.sigmoid(potentials)
        trace.append(potentials)

    return trace

#The sigmoid function that is stored in the network
def sigmoid(x):
    return np.tanh(x)
sigmoid.derivative = lambda x: (1 - np.square(x))
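
One detail worth noting: sigmoid.derivative is written in terms of the activation's output rather than its input, which matches the identity d/dx tanh(x) = 1 - tanh(x)^2 and explains why backpropagate passes the already-activated trace values into it. A standalone sketch (not part of the code above) that checks this numerically:

import numpy as np

def sigmoid(x):
    return np.tanh(x)
#derivative expressed through the output y = tanh(x), as in the code above
sigmoid.derivative = lambda y: 1 - np.square(y)

x = np.linspace(-2.0, 2.0, 9)
y = sigmoid(x)
analytic = sigmoid.derivative(y)                          # 1 - tanh(x)^2
numeric = (sigmoid(x + 1e-5) - sigmoid(x - 1e-5)) / 2e-5  # central difference
print(np.allclose(analytic, numeric, atol=1e-8))          # expected: True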

Best answer

The problem is a missing step size parameter. The gradient should be scaled by an additional factor rather than taking an entire step through weight space at once. So instead of:

network.weights[index] += delta
network.bias[index] += delta

it should be:

def backpropagate(network, tests, stepSize = 0.01, iterations=50):

    #...

    network.weights[index] += stepSize * delta

    #...

    network.bias[index] += stepSize * delta
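
With that change, the apply-the-deltas loops at the end of the training iteration become:

#apply the deltas, scaled by the step size (learning rate)
for index, delta in enumerate(weight_delta):
    network.weights[index] += stepSize * delta
for index, delta in enumerate(bias_delta):
    network.bias[index] += stepSize * delta

Because the error is computed as expected - trace[-1], the accumulated deltas already point in the direction that reduces the error, so adding stepSize * delta is an ordinary gradient-descent update. The value of stepSize (the learning rate) usually needs some tuning; for a small XOR network, values somewhere in the range 0.01 to 0.5 are commonly tried.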

Regarding "python - Debugging a backpropagation algorithm", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/19991431/
