
The correct way to use CTCLoss in PyTorch?




I have been training a conv-LSTM network. The conv net takes an input of shape (batch, 1, 75, 46, 146) and outputs a tensor of shape (batch, 10), which is then fed into the LSTM network.


The model, however, doesn't seem to learn anything. I think I've given the wrong inputs to the CTC loss I am using (it seems to work slightly better with cross-entropy loss).


Here sentence is the output of the model, of shape (batch, 35, 40), where 35 is the length of every sentence and 40 is the number of classes.


sentence = torch.reshape(sentence, (35, sentence.shape[0], 40))
input_lengths = torch.full(size=(x.shape[0],), fill_value=35, dtype=torch.int)
loss = criterion(sentence, y, input_lengths, input_lengths)

Here criterion is defined as nn.CTCLoss(), but the model doesn't seem to learn anything and gives the same gibberish prediction every epoch.

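For reference, here is a minimal sketch of how nn.CTCLoss typically expects its inputs, using the dimensions from the question (35 timesteps, 40 classes); the batch size, target length, and blank index below are assumptions for illustration only:

import torch
import torch.nn as nn

batch, T, C = 4, 35, 40          # batch size is an assumed example value
target_len = 20                  # assumed label length per sample (not given in the question)

criterion = nn.CTCLoss(blank=0)  # blank index 0 is an assumption

sentence = torch.randn(batch, T, C)                    # model output, (batch, T, C) as in the question
log_probs = sentence.log_softmax(2).permute(1, 0, 2)   # CTCLoss wants log-probabilities of shape (T, batch, C)

y = torch.randint(1, C, (batch, target_len), dtype=torch.long)       # padded targets, excluding the blank index
input_lengths = torch.full((batch,), T, dtype=torch.long)            # lengths of the output sequences
target_lengths = torch.full((batch,), target_len, dtype=torch.long)  # lengths of the label sequences

loss = criterion(log_probs, y, input_lengths, target_lengths)

Note that permute reorders the existing axes, whereas torch.reshape to (35, batch, 40) only reinterprets the memory layout and mixes samples across timesteps, and that CTCLoss takes separate input_lengths and target_lengths arguments.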

What is wrong here? Am I not using CTCLoss correctly?

