
neural-network - How can I add orthogonal regularization in Keras?

Repost. Author: 行者123. Updated: 2023-12-03 14:18:55

I would like to regularize the weights with a penalty of the form

||W^T W - I||

(the norm of W^T W minus the identity). How can I do this in Keras?

Best Answer

From the documentation:

Any function that takes in a weight matrix and returns a loss contribution tensor can be used as a regularizer



Here is an example implementation:
from keras import backend as K

def l1_reg(weight_matrix):
    return 0.01 * K.sum(K.abs(weight_matrix))

model.add(Dense(64, input_dim=64,
                kernel_regularizer=l1_reg))

The loss from your post would then be:
from keras import backend as K
def fro_norm(w):
    # Frobenius norm: the square root of the sum of squared entries
    return K.sqrt(K.sum(K.square(K.abs(w))))

def cust_reg(w):
    # W^T W - I; the identity size matches the number of columns of w
    m = K.dot(K.transpose(w), w) - K.eye(K.int_shape(w)[1])
    return fro_norm(m)
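As a quick sanity check (a NumPy-only sketch, independent of Keras; the name `orth_penalty` is illustrative), the penalty ||W^T W - I|| is zero exactly when the columns of W are orthonormal, and positive otherwise:

```python
import numpy as np

def orth_penalty(w):
    """Frobenius norm of W^T W - I, computed with NumPy."""
    m = w.T @ w - np.eye(w.shape[1])
    return np.sqrt(np.sum(np.square(m)))

# the Q factor of a QR decomposition has orthonormal columns -> penalty ~ 0
q, _ = np.linalg.qr(np.random.randn(5, 3))
print(orth_penalty(q))

# a generic random matrix incurs a positive penalty
print(orth_penalty(np.random.randn(5, 3)))
```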

Here is a minimal example:
import numpy as np
from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense, Activation

X = np.random.randn(100, 100)
y = np.random.randint(2, size=(100, 1))

model = Sequential()

# apply regularization here; an activity_regularizer penalizes the
# output (activations) of the layer, a kernel_regularizer its weights
model.add(Dense(32, input_shape=(100,),
                activity_regularizer=fro_norm))
model.add(Dense(1))
model.add(Activation('sigmoid'))

model.compile(loss="binary_crossentropy",
              optimizer='sgd',
              metrics=['accuracy'])

model.fit(X, y, epochs=1, batch_size=32)
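For intuition, the effect of this penalty on a training objective can be sketched outside of Keras with plain NumPy (the 0.01 weighting factor is an arbitrary choice, mirroring the l1_reg example above):

```python
import numpy as np

def fro(m):
    # Frobenius norm of a matrix
    return np.sqrt(np.sum(np.square(m)))

def total_loss(base_loss, w, lam=0.01):
    """Base loss plus lam * ||W^T W - I||, the orthogonality penalty."""
    return base_loss + lam * fro(w.T @ w - np.eye(w.shape[1]))

w = np.random.randn(100, 32)
print(total_loss(0.5, w))  # strictly larger than 0.5 for non-orthonormal w
```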

The following will NOT work, as @Marcin's comment suggests: LA.norm fails here because a regularizer must return a tensor, and LA.norm() does not.
from keras import backend as K
from numpy import linalg as LA
import numpy as np

def orth_norm(w):
    m = K.dot(K.transpose(w), w) - np.eye(K.int_shape(w)[1])
    return LA.norm(m, 'fro')  # returns a NumPy float, not a tensor
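Numerically, LA.norm(m, 'fro') computes the same value as the fro_norm formula above; the issue is only the return type. A NumPy sketch:

```python
import numpy as np
from numpy import linalg as LA

m = np.random.randn(4, 4)

# same value as K.sqrt(K.sum(K.square(K.abs(w)))), evaluated in NumPy
manual = np.sqrt(np.sum(np.square(np.abs(m))))
assert np.isclose(LA.norm(m, 'fro'), manual)

# ...but the result is a plain NumPy scalar, not a backend tensor,
# so Keras cannot build a gradient through it
print(type(LA.norm(m, 'fro')))
```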

Keras regularizers

Frobenius norm

Regarding "neural-network - How can I add orthogonal regularization in Keras?", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/42911671/
