
deep-learning - How does Keras handle multiple losses?


If I have something like this:

# input, y1 and y2 are the Keras input/output tensors; loss1 and loss2 are loss functions
model = Model(inputs=input, outputs=[y1, y2])

l1 = 0.5
l2 = 0.3
model.compile(loss=[loss1, loss2], loss_weights=[l1, l2], ...)

How does Keras combine these losses into the final loss?
Is it something like this:

final_loss = l1*loss1 + l2*loss2

Also, what does this mean during training? Is loss2 only used to update the weights of the layers that y2 comes from, or is it used for all of the model's layers?

Best Answer

From the model documentation:

loss: String (name of objective function) or objective function. See losses. If the model has multiple outputs, you can use a different loss on each output by passing a dictionary or a list of losses. The loss value that will be minimized by the model will then be the sum of all individual losses.

...

loss_weights: Optional list or dictionary specifying scalar coefficients (Python floats) to weight the loss contributions of different model outputs. The loss value that will be minimized by the model will then be the weighted sum of all individual losses, weighted by the loss_weights coefficients. If a list, it is expected to have a 1:1 mapping to the model's outputs. If a dict, it is expected to map output names (strings) to scalar coefficients.



So yes, the final loss will be the "weighted sum of all individual losses, weighted by the loss_weights coefficients".
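As a quick sanity check, here is a minimal sketch using tensorflow.keras (the layer sizes, the mse/mae losses standing in for loss1/loss2, and the output names "y1"/"y2" are all made up for illustration) that builds a two-output model with the 0.5/0.3 weights from the question and compares the total loss reported by evaluate against the weighted sum of the per-output losses:

import numpy as np
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.models import Model

inp = Input(shape=(8,))
shared = Dense(16, activation="relu")(inp)   # shared trunk
y1 = Dense(1, name="y1")(shared)             # first output head
y2 = Dense(1, name="y2")(shared)             # second output head

model = Model(inputs=inp, outputs=[y1, y2])

# List form; the dict form would be loss={"y1": "mse", "y2": "mae"},
# loss_weights={"y1": 0.5, "y2": 0.3}, keyed by the output layer names.
model.compile(optimizer="adam",
              loss=["mse", "mae"],           # loss1, loss2
              loss_weights=[0.5, 0.3])       # l1, l2

x = np.random.rand(32, 8)
t1 = np.random.rand(32, 1)
t2 = np.random.rand(32, 1)

# For a two-output model, evaluate returns [total_loss, y1_loss, y2_loss];
# the total is the weighted sum 0.5 * y1_loss + 0.3 * y2_loss.
total, loss_y1, loss_y2 = model.evaluate(x, [t1, t2], verbose=0)
print(total, 0.5 * loss_y1 + 0.3 * loss_y2)   # the two numbers agree

During training, that weighted total is the value Keras minimizes, exactly as the documentation quoted above states.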

You can take a look at the code where the loss is calculated.

Also, what does it mean during training? Is the loss2 only used to update the weights on layers where y2 comes from? Or is it used for all the model's layers?



The weights are updated via backpropagation, so each loss only affects the layers that connect the input to that loss; see the example and the runnable sketch below.

For example:
                      +----+
                   /->| C  |-->loss1
                  /   +----+
   +----+    +----+
-->| A  |--->| B  |
   +----+    +----+
                  \   +----+
                   \->| D  |-->loss2
                      +----+
  • loss1 will affect A, B, and C.
  • loss2 will affect A, B, and D.
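To make that concrete, here is a minimal sketch of the A -> B -> {C, D} topology above (the layer sizes are arbitrary, and tf.GradientTape is used here only to inspect which weights a loss reaches): the gradient of loss2 with respect to C's kernel comes back as None, while D and the shared trunk B and A do receive gradients.

import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.models import Model

inp = Input(shape=(4,))
a = Dense(8, name="A")(inp)
b = Dense(8, name="B")(a)
c = Dense(1, name="C")(b)   # output that feeds loss1
d = Dense(1, name="D")(b)   # output that feeds loss2

model = Model(inputs=inp, outputs=[c, d])

x = tf.constant(np.random.rand(2, 4), dtype=tf.float32)
target2 = tf.constant(np.random.rand(2, 1), dtype=tf.float32)

with tf.GradientTape() as tape:
    out_c, out_d = model(x)
    loss2 = tf.reduce_mean(tf.square(out_d - target2))  # a loss on D's output only

kernel_c = model.get_layer("C").kernel
kernel_d = model.get_layer("D").kernel
grad_c, grad_d = tape.gradient(loss2, [kernel_c, kernel_d])
print(grad_c)  # None -> loss2 produces no gradient for C's weights
print(grad_d)  # a tensor -> loss2 updates D (and, through B and A, the shared trunk)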
For this question (deep-learning - How does Keras handle multiple losses?), a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/49404309/
