
python - Confusion with custom loss in tensorflow keras


I have been trying to follow the many variations on creating a custom loss function for tensorflow.keras.

I have successfully created a custom metric that seems to work, and now I would like to use that same metric when computing the loss.

Here is the custom metric, which computes the Spearman rank correlation between y_true and y_pred...

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
import tensorflow_probability as tfp

class Correlation(keras.metrics.Metric):
    def __init__(self, name="Correlation", **kwargs):
        super(Correlation, self).__init__(name=name, **kwargs)
        self.metric = self.add_weight(name='correlation_01', initializer='zeros')

    def update_state(self, y_true, y_pred, sample_weight=None):
        y_true_flat = layers.Flatten()(y_true)
        y_pred_flat = layers.Flatten()(y_pred)
        y_true_rank = tf.cast(tf.argsort(y_true_flat, axis=0, direction="ASCENDING"), 'float32')
        y_pred_rank = tf.cast(tf.argsort(y_pred_flat, axis=0, direction="ASCENDING"), 'float32')
        cov = tfp.stats.covariance(y_true_rank, y_pred_rank, sample_axis=0, event_axis=None)
        std_y_trueR = tfp.stats.stddev(y_true_rank, sample_axis=0, keepdims=False, name="STD_TRUE")
        std_y_predR = tfp.stats.stddev(y_pred_rank, sample_axis=0, keepdims=False, name="STD_PRED")
        #corr = cov/(std_y_trueR * std_y_predR)
        corr = tf.math.divide(cov, tf.math.multiply(std_y_trueR, std_y_predR))

        self.metric.assign(corr[0])

    def result(self):
        return self.metric

    def reset_states(self):
        # reset state of metric at the start of each epoch
        self.metric.assign(0.0)
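
For a quick sanity check of the metric outside a model (reusing the imports and the Correlation class above; the toy tensors are my own, and any pair of (batch, 1) float tensors works the same way):

# Hypothetical toy data: 4 samples with a single target each
y_true = tf.constant([[1.0], [2.0], [3.0], [4.0]])
y_pred = tf.constant([[1.1], [1.9], [3.2], [3.8]])

metric = Correlation()
metric.update_state(y_true, y_pred)
print(metric.result().numpy())   # 1.0 here, since both orderings agree
metric.reset_states()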

I didn't know how to tie this directly into the loss, so I figured I would start by following the examples in the documentation, which suggest creating a function and passing it to compile. So I copied the code above and turned it into a loss by subtracting the correlation from 1.0:

def correlation_loss(y_true, y_pred):
    y_true_flat = layers.Flatten()(y_true)
    y_pred_flat = layers.Flatten()(y_pred)
    y_true_rank = tf.cast(tf.argsort(y_true_flat, axis=0, direction="ASCENDING"), 'float32')
    y_pred_rank = tf.cast(tf.argsort(y_pred_flat, axis=0, direction="ASCENDING"), 'float32')
    cov = tfp.stats.covariance(y_true_rank, y_pred_rank, sample_axis=0, event_axis=None)
    std_y_trueR = tfp.stats.stddev(y_true_rank, sample_axis=0, keepdims=False, name="LOSS_STD_TRUE")
    std_y_predR = tfp.stats.stddev(y_pred_rank, sample_axis=0, keepdims=False, name="LOSS_STD_PRED")
    corr = tf.math.divide(cov, tf.math.multiply(std_y_trueR, std_y_predR))
    loss = tf.math.subtract(1.0, corr[0])
    return loss

I compile the model with:

model.compile(optimizer='Adam', loss=correlation_loss, metrics=[Correlation()])
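
For context, a minimal setup along these lines (the tiny model and random data are my own stand-ins, since the original model is not shown) is enough to trigger the problem during fit:

import numpy as np

# Hypothetical toy regression model, just to exercise compile/fit with the custom loss
inputs = keras.Input(shape=(8,))
outputs = layers.Dense(1)(inputs)
model = keras.Model(inputs, outputs)

model.compile(optimizer='Adam', loss=correlation_loss, metrics=[Correlation()])

x = np.random.rand(64, 8).astype('float32')
y = np.random.rand(64, 1).astype('float32')
model.fit(x, y, batch_size=16, epochs=1)   # fit is where the ValueError appears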

However, when I try to train the model I get a ValueError about "No gradients provided for any variable".

I would love to know what I am doing wrong. More specifically, is there a recommended way to use a metric in the loss calculation that does not involve recomputing it a second time?
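
As an aside on that last question, one possible pattern (a sketch of my own, reusing the imports from the top of the question, and not necessarily what the answer below does) is to factor the correlation into a single helper that both the loss function and the metric call, so the logic lives in one place even though it still runs once for the loss and once for the metric:

def batch_correlation(y_true, y_pred):
    # Hypothetical shared helper: Pearson correlation across the batch axis
    y_true_flat = layers.Flatten()(y_true)
    y_pred_flat = layers.Flatten()(y_pred)
    cov = tfp.stats.covariance(y_true_flat, y_pred_flat, sample_axis=0, event_axis=None)
    std_true = tfp.stats.stddev(y_true_flat, sample_axis=0)
    std_pred = tfp.stats.stddev(y_pred_flat, sample_axis=0)
    return cov / (std_true * std_pred)

def correlation_loss(y_true, y_pred):
    # The loss is simply 1 - correlation
    return 1.0 - batch_correlation(y_true, y_pred)[0]

Correlation.update_state could then call the same batch_correlation helper instead of repeating the computation inline.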

Best Answer

Thanks for pointing me to the other post. Based on a careful reading of it, I adjusted my function to make sure every layer is named, and that seems to have fixed the problem.

My function still has one problem: it does not compute the Spearman rank correlation correctly. I am posting my code as the solution here with the ranking part commented out, so it returns a standard Pearson correlation instead, but at least this seems to work in the training loop.

def correlation_loss(y_true, y_pred):
    y_true_flat = layers.Flatten(name="Y_TRUE_FLAT")(y_true)
    y_pred_flat = layers.Flatten(name="Y_PRED_FLAT")(y_pred)
    ##-## I can't seem to get the proper ranking for Spearman Correlation
    ##-## Just to have something that functions, I've commented these out for now
    #y_true_rank = tf.cast(tf.argsort(y_true_flat, axis=0, direction="ASCENDING"), 'float32')
    #y_pred_rank = tf.cast(tf.argsort(y_pred_flat, axis=0, direction="ASCENDING"), 'float32')
    cov = tfp.stats.covariance(y_true_flat, y_pred_flat, sample_axis=0, event_axis=None, name="COVARIANCE")
    std_y_trueR = tfp.stats.stddev(y_true_flat, sample_axis=0, keepdims=False, name="LOSS_STD_TRUE")
    std_y_predR = tfp.stats.stddev(y_pred_flat, sample_axis=0, keepdims=False, name="LOSS_STD_PRED")
    corr = tf.math.divide(cov, tf.math.multiply(std_y_trueR, std_y_predR, name="MULT_STDs"), name="CORRELATION")
    loss = tf.math.subtract(1.0, corr[0], name="CORR_LOSS")
    return loss
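
On the commented-out ranking: tf.argsort(x) returns the sorting permutation (the indices that would sort x), not the rank of each element; applying argsort twice yields the ranks. A small sketch (the function name is my own), noting that ranks are integer-valued and piecewise constant, so this helps the metric but still provides no gradient for a loss:

def spearman_ranks(values):
    # argsort of argsort turns the sorting permutation into per-element ranks
    return tf.cast(tf.argsort(tf.argsort(values, axis=0), axis=0), 'float32')

# e.g. spearman_ranks(tf.constant([[3.0], [1.0], [2.0]])) -> [[2.], [0.], [1.]]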

About python - Confusion with custom loss in tensorflow keras, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/65445326/
