python - Adding a regularizer to skflow

I recently switched from TensorFlow to skflow. In TensorFlow we would add lambda * tf.nn.l2_loss(weights) to our loss (a short sketch of that pattern appears after the code below). Now I have the following code in skflow:

```
def deep_psi(X, y):
    layers = skflow.ops.dnn(X, [5, 10, 20, 10, 5], keep_prob=0.5)
    preds, loss = skflow.models.logistic_regression(layers, y)
    return preds, loss

def exp_decay(global_step):
    return tf.train.exponential_decay(learning_rate=0.01,
                                      global_step=global_step,
                                      decay_steps=1000,
                                      decay_rate=0.005)

deep_cd = skflow.TensorFlowEstimator(model_fn=deep_psi,
                                     n_classes=2,
                                     steps=10000,
                                     batch_size=10,
                                     learning_rate=exp_decay,
                                     verbose=True)
```
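For reference, the plain-TensorFlow pattern mentioned above looks roughly like the sketch below; `base_loss`, `weight_list`, and `reg_lambda` are placeholder names chosen for illustration, not taken from the question:

```
import tensorflow as tf

reg_lambda = 0.01                                   # regularization strength
weight_list = [tf.Variable(tf.random_normal([5, 10])),
               tf.Variable(tf.random_normal([10, 2]))]
base_loss = tf.constant(0.0)                        # stands in for the model's data loss

# L2-regularized loss: data loss plus lambda * l2_loss(W) for every weight matrix.
loss = base_loss + reg_lambda * tf.add_n([tf.nn.l2_loss(w) for w in weight_list])
```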

How and where do I add the regularization term? Illia hinted at something here, but I can't figure it out.

Best answer

You can still add extra components to the loss; you just need to retrieve the weights from dnn / logistic_regression and add the penalty terms to the loss:

```
def regularize_loss(loss, weights, reg_lambda):
    # `lambda` is a reserved keyword in Python, so the strength is named reg_lambda here.
    for weight in weights:
        loss = loss + reg_lambda * tf.nn.l2_loss(weight)
    return loss


def deep_psi(X, y):
    layers = skflow.ops.dnn(X, [5, 10, 20, 10, 5], keep_prob=0.5)
    preds, loss = skflow.models.logistic_regression(layers, y)

    reg_lambda = 0.01  # regularization strength; tune as needed
    weights = []
    for layer in range(5):  # the number of layers you passed to dnn
        weights.append(tf.get_variable("dnn/layer%d/linear/Matrix" % layer))
        # biases are also available at dnn/layer%d/linear/Bias
    weights.append(tf.get_variable("logistic_regression/weights"))

    return preds, regularize_loss(loss, weights, reg_lambda)
```

Note that the variable paths can be found here.
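If the exact scope names are unclear for a given setup, one way to discover them (a small sketch of my own, not part of the original answer) is to print every trainable variable's name from inside the model function, where the graph is being built:

```
import skflow
import tensorflow as tf

def deep_psi(X, y):
    layers = skflow.ops.dnn(X, [5, 10, 20, 10, 5], keep_prob=0.5)
    preds, loss = skflow.models.logistic_regression(layers, y)
    # Each name includes its full scope path, e.g. "dnn/layer0/linear/Matrix:0"
    # or "logistic_regression/weights:0".
    for var in tf.trainable_variables():
        print(var.name, var.get_shape())
    return preds, loss
```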

Also, we want to add regularizer support for all layers that have variables (such as dnn, conv2d, fully_connected), so next week's nightly build of TensorFlow will probably have something like dnn(.., regularize=tf.contrib.layers.l2_regularizer(lambda)). I'll update this answer when that happens.
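Until that lands, tf.contrib.layers.l2_regularizer can already be used by hand (assuming a TensorFlow build that ships tf.contrib.layers): calling it with a scale returns a function that maps a weight tensor to a scalar penalty, which is presumably what the planned regularize argument would apply internally. A minimal sketch with an illustrative variable:

```
import tensorflow as tf

# l2_regularizer(scale) returns a callable: weight tensor -> scale * L2 penalty.
reg = tf.contrib.layers.l2_regularizer(0.01)

w = tf.Variable(tf.random_normal([5, 10]), name="example_weight")
penalty = reg(w)   # scalar tensor that would be added to the model loss
```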

Regarding "python - Adding a regularizer to skflow", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/36597519/
