
tokenize - AttributeError: 'GPT2TokenizerFast' object has no attribute 'max_len'

Reposted. Author: 行者123. Updated: 2023-12-05 08:47:41

I am using the Hugging Face transformers library, and when running run_lm_finetuning.py I get the following message: AttributeError: 'GPT2TokenizerFast' object has no attribute 'max_len'. Has anyone else run into this, or does anyone know how to fix it? Thanks!

My full experiment run:

    mkdir experiments

    for epoch in 5
    do
    python run_lm_finetuning.py \
        --model_name_or_path distilgpt2 \
        --model_type gpt2 \
        --train_data_file small_dataset_train_preprocessed.txt \
        --output_dir experiments/epochs_$epoch \
        --do_train \
        --overwrite_output_dir \
        --per_device_train_batch_size 4 \
        --num_train_epochs $epoch
    done

Best Answer

"AttributeError: 'BertTokenizerFast' object has no attribute 'max_len'" Github issue包含修复:

The run_language_modeling.py script is deprecated in favor of language-modeling/run_{clm, plm, mlm}.py.

If not, the fix is to change max_len to model_max_length.
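The rename mentioned in that fix can also be handled defensively in a script. The sketch below is illustrative only: the two tokenizer classes are hypothetical stand-ins for old and new transformers tokenizers (not the real library classes), and the helper prefers the new model_max_length attribute while falling back to the legacy max_len.

```python
# Stand-in mimicking a pre-v4 tokenizer that still exposes max_len.
class LegacyTokenizer:
    max_len = 1024

# Stand-in mimicking a transformers v4+ tokenizer, where the
# attribute was renamed to model_max_length.
class ModernTokenizer:
    model_max_length = 1024

def get_max_length(tokenizer):
    # Prefer the new attribute name; fall back to the old one,
    # and return None if neither is present.
    return getattr(tokenizer, "model_max_length",
                   getattr(tokenizer, "max_len", None))

print(get_max_length(ModernTokenizer()))  # 1024
print(get_max_length(LegacyTokenizer()))  # 1024
```

In run_lm_finetuning.py itself, the one-line fix is simply replacing every tokenizer.max_len reference with tokenizer.model_max_length.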

Regarding tokenize - AttributeError: 'GPT2TokenizerFast' object has no attribute 'max_len', we found a similar question on Stack Overflow: https://stackoverflow.com/questions/67089849/
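Alternatively, following the deprecation note in the answer, the experiment can be migrated to the maintained causal-LM script. The command below is a rough sketch, not a verified invocation: it assumes run_clm.py from the transformers examples is on the current path, and note that run_clm.py takes --train_file rather than --train_data_file (check your installed version's --help for the exact flags).

```shell
# Sketch: the same distilgpt2 fine-tuning run via the maintained run_clm.py.
python run_clm.py \
    --model_name_or_path distilgpt2 \
    --train_file small_dataset_train_preprocessed.txt \
    --do_train \
    --overwrite_output_dir \
    --per_device_train_batch_size 4 \
    --num_train_epochs 5 \
    --output_dir experiments/epochs_5
```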
