
tensorflow - Custom macro-averaged recall metric in Keras


I am trying to create a custom macro for recall = (recall of class 1 + recall of class 2) / 2. I came up with the following code, but I don't know how to compute the true positives of class 0.

def unweightedRecall():
    def recall(y_true, y_pred):
        # recall of class 1
        true_positives1 = K.sum(K.round(K.clip(y_pred * y_true, 0, 1)))
        possible_positives1 = K.sum(K.round(K.clip(y_true, 0, 1)))
        recall1 = true_positives1 / (possible_positives1 + K.epsilon())

        # --- get true positive of class 0 in true_positives0 here ---
        # Also, is there a cleaner way to get possible_positives0?
        possible_positives0 = K.int_shape(y_true)[0] - possible_positives1
        recall0 = true_positives0 / (possible_positives0 + K.epsilon())
        return (recall0 + recall1) / 2
    return recall

It seems I would have to use keras.backend.equal(x, y), but then how do I create a tensor of shape K.int_shape(y_true)[0] with all values set to, say, x?
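As it turns out, no full tensor of constants is needed: the class-0 counts fall out of flipping both tensors with 1 - x. A NumPy sketch of that arithmetic, on hypothetical toy values, mirroring the backend calls above:

```python
import numpy as np

# Hypothetical labels and sigmoid outputs, for illustration only.
y_true = np.array([1., 0., 0., 1., 0.])
y_pred = np.array([0.9, 0.2, 0.6, 0.8, 0.1])

# Class-1 counts, mirroring K.sum(K.round(K.clip(...))).
true_positives1 = np.sum(np.round(np.clip(y_pred * y_true, 0, 1)))
possible_positives1 = np.sum(np.round(np.clip(y_true, 0, 1)))

# Class 0: flip both tensors with 1 - x. In backend terms this would be
# K.sum(K.round(K.clip((1 - y_pred) * (1 - y_true), 0, 1))).
true_positives0 = np.sum(np.round(np.clip((1 - y_pred) * (1 - y_true), 0, 1)))
possible_positives0 = np.sum(np.round(np.clip(1 - y_true, 0, 1)))

eps = np.finfo(float).eps
recall1 = true_positives1 / (possible_positives1 + eps)  # 2 of 2 positives found
recall0 = true_positives0 / (possible_positives0 + eps)  # 2 of 3 negatives found
print((recall0 + recall1) / 2)
```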


Edit 1

Based on Marcin's comment, I want to create a custom metric based on a Keras callback. While browsing issues in Keras, I came across the following code for an f1 metric:

class Metrics(keras.callbacks.Callback):
    def on_epoch_end(self, batch, logs={}):
        predict = np.asarray(self.model.predict(self.validation_data[0]))
        targ = self.validation_data[1]
        self.f1s = f1(targ, predict)  # f1 is a user-supplied scoring function
        return

metrics = Metrics()
model.fit(X_train, y_train, epochs=epochs, batch_size=batch_size,
          validation_data=[X_test, y_test], verbose=1, callbacks=[metrics])

But how does the callback return the accuracy? I want to implement unweighted recall = (recall of class 1 + recall of class 2) / 2. I can think of the following code, but would appreciate help completing it:

from sklearn.metrics import recall_score

class Metrics(keras.callbacks.Callback):
    def on_epoch_end(self, batch, logs={}):
        predict = np.asarray(self.model.predict(self.validation_data[0]))
        targ = self.validation_data[1]
        # --- what to store the result in?? ---
        self.XXXX = recall_score(targ, predict, average='macro')
        # we really don't need to return anything??
        return

metrics = Metrics()
model.fit(X_train, y_train, epochs=epochs, batch_size=batch_size,
          validation_data=[X_test, y_test], verbose=1, callbacks=[metrics])
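For context, recall_score(targ, predict, average='macro') is just the unweighted mean of the per-class recalls, computed after thresholding the sigmoid output. A pure-NumPy sketch of the same computation on hypothetical data (it yields the identical number sklearn would return):

```python
import numpy as np

targ = np.array([1, 0, 0, 1, 0])
# Hypothetical sigmoid outputs, thresholded at 0.5 before scoring.
predict = (np.array([0.9, 0.2, 0.6, 0.8, 0.1]) > 0.5).astype(int)

# Per-class recalls, then the unweighted (macro) average --
# the same value recall_score(targ, predict, average='macro') computes.
r1 = np.sum((predict == 1) & (targ == 1)) / np.sum(targ == 1)
r0 = np.sum((predict == 0) & (targ == 0)) / np.sum(targ == 0)
macro = (r0 + r1) / 2
print(macro)
```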

Edit 2: the model:

def createModelHelper(numNeurons=40, optimizer='adam'):
    inputLayer = Input(shape=(data.shape[1],))
    denseLayer1 = Dense(numNeurons)(inputLayer)
    outputLayer = Dense(1, activation='sigmoid')(denseLayer1)
    # note: newer Keras versions spell these arguments inputs=/outputs=
    model = Model(input=inputLayer, output=outputLayer)
    model.compile(loss=unweightedRecall, optimizer=optimizer)
    return model

Best answer

Keras version (with the averaging issue).

Do your two classes actually have only a one-dimensional output (0 or 1)?

If so:

def recall(y_true, y_pred):
    # recall of class 1

    # do not use "round" here if you're going to use this as a loss function
    true_positives = K.sum(K.round(y_pred) * y_true)
    possible_positives = K.sum(y_true)
    return true_positives / (possible_positives + K.epsilon())


def unweightedRecall(y_true, y_pred):
    return (recall(y_true, y_pred) + recall(1 - y_true, 1 - y_pred)) / 2.
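The 1 - y trick works because flipping both labels and predictions turns class-0 recall into class-1 recall of the flipped problem. A quick NumPy check of this on hypothetical data:

```python
import numpy as np

def recall_np(y_true, y_pred):
    # NumPy mirror of the Keras recall above (threshold via round).
    tp = np.sum(np.round(y_pred) * y_true)
    return tp / (np.sum(y_true) + np.finfo(float).eps)

y_true = np.array([1., 0., 0., 1., 0.])
y_pred = np.array([0.9, 0.2, 0.6, 0.8, 0.1])

# class-1 recall on the original arrays + class-1 recall on the flipped
# arrays (= class-0 recall of the original problem), averaged
unweighted = (recall_np(y_true, y_pred) + recall_np(1 - y_true, 1 - y_pred)) / 2.
print(unweighted)
```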

Now, if your two classes are actually a 2-element output:

def unweightedRecall(y_true, y_pred):
    return (recall(y_true[:, 0], y_pred[:, 0]) + recall(y_true[:, 1], y_pred[:, 1])) / 2.
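In the two-column case each class gets its own column of the one-hot targets and the model output, and the per-column recalls are averaged. A NumPy sketch with hypothetical one-hot data:

```python
import numpy as np

def recall_np(y_true, y_pred):
    # NumPy mirror of the Keras recall above (threshold via round).
    tp = np.sum(np.round(y_pred) * y_true)
    return tp / (np.sum(y_true) + np.finfo(float).eps)

# Hypothetical one-hot targets and per-class scores, one column per class.
y_true = np.array([[0., 1.], [1., 0.], [1., 0.], [0., 1.]])
y_pred = np.array([[0.2, 0.8], [0.7, 0.3], [0.4, 0.6], [0.1, 0.9]])

# column 0 = class-0 recall, column 1 = class-1 recall, then average
unweighted = (recall_np(y_true[:, 0], y_pred[:, 0]) +
              recall_np(y_true[:, 1], y_pred[:, 1])) / 2.
print(unweighted)
```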

Callback version:

For a callback, you can use a LambdaCallback and then manually print or store the results:

stored_metrics = []

def unweightedRecall(epoch, logs):
    # a plain function has no `self`; use the validation arrays directly
    predict = model.predict(X_test)
    targ = y_test

    result = (recall(targ, predict) + recall(1 - targ, 1 - predict)) / 2.
    print("recall for epoch " + str(epoch) + ": " + str(result))
    stored_metrics.append(result)

# define the function before wiring it into the callback
myCallBack = LambdaCallback(on_epoch_end=unweightedRecall)

where recall is a function using np instead of K, with epsilon = np.finfo(float).eps (or epsilon = np.finfo(np.float32).eps).
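Such a NumPy recall might look like the sketch below (the helper name and toy data are assumptions, matching the backend formula term for term):

```python
import numpy as np

epsilon = np.finfo(np.float32).eps

def recall(y_true, y_pred):
    # NumPy mirror of the backend formula: round, multiply, sum, divide.
    true_positives = np.sum(np.round(y_pred) * y_true)
    possible_positives = np.sum(y_true)
    return true_positives / (possible_positives + epsilon)

# Hypothetical validation targets and sigmoid predictions.
targ = np.array([1., 0., 1., 1.])
predict = np.array([0.8, 0.3, 0.2, 0.9])

result = (recall(targ, predict) + recall(1 - targ, 1 - predict)) / 2.
print(result)
```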

The original question and answer on this topic can be found on Stack Overflow: https://stackoverflow.com/questions/48757091/
