
python - Derivative of the softmax function in Python


Below is the softmax activation function for a neural network. What is the derivative of this function?

import numpy as np

def softmax(z):
    e = np.exp(z)
    return e / np.sum(e, axis=1)
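As an aside, np.exp can overflow for large logits, and with a 2-D batch the row sum needs keepdims=True to broadcast back across each row. A minimal sketch of a numerically stable variant (the name softmax_stable is just for illustration):

import numpy as np

def softmax_stable(z):
    # Shift each row by its max; the shift cancels in the ratio
    # but keeps np.exp from overflowing.
    z = z - np.max(z, axis=1, keepdims=True)
    e = np.exp(z)
    # keepdims=True keeps the sum as a column vector so the
    # division broadcasts across each row.
    return e / np.sum(e, axis=1, keepdims=True)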

Best answer

Iterative version of the softmax derivative

For softmax outputs s, the Jacobian is ds_i/dz_j = s_i * (1 - s_i) when i == j and -s_i * s_j otherwise; the nested loop below fills the matrix entry by entry from that formula.

import numpy as np

def softmax_grad(s):
    # Derivative of each softmax element w.r.t. each logit z_j (usually w_j · x).
    # Input s is the softmax value of the original input x.
    # s.shape = (n,)
    # e.g. s = np.array([0.3, 0.7]) for x = np.array([0, 1])

    # Initialize the 2-D Jacobian matrix (every entry is overwritten below).
    jacobian_m = np.diag(s)

    for i in range(len(jacobian_m)):
        for j in range(len(jacobian_m)):
            if i == j:
                jacobian_m[i][j] = s[i] * (1 - s[i])
            else:
                jacobian_m[i][j] = -s[i] * s[j]
    return jacobian_m
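A quick check with the example values from the comments above:

s = np.array([0.3, 0.7])
print(softmax_grad(s))
# [[ 0.21 -0.21]
#  [-0.21  0.21]]

The diagonal carries s_i * (1 - s_i) and the off-diagonal entries carry -s_i * s_j, so each row sums to zero, as it must for a function whose outputs always sum to one.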

Vectorized version

def softmax_grad(softmax):
    # Reshape the 1-D softmax to 2-D so that np.dot will do the matrix multiplication
    s = softmax.reshape(-1, 1)
    return np.diagflat(s) - np.dot(s, s.T)
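Here np.diagflat(s) builds the diagonal matrix holding the s_i terms and np.dot(s, s.T) is the outer product holding the s_i * s_j terms, so the difference is the same Jacobian as above. A sanity check that the two versions agree (renaming the vectorized one softmax_grad_vec, since both snippets use the same name):

import numpy as np

def softmax_grad_vec(softmax):
    s = softmax.reshape(-1, 1)
    return np.diagflat(s) - np.dot(s, s.T)

s = np.array([0.3, 0.7])
# Assumes softmax_grad is still bound to the iterative version above.
print(np.allclose(softmax_grad(s), softmax_grad_vec(s)))  # True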

Reference: https://medium.com/@aerinykim/how-to-implement-the-softmax-derivative-independently-from-any-loss-function-ae6d44363a9d

Regarding python - Derivative of the softmax function in Python, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/54976533/
