
python - Saving and loading a HuggingFace model (Colab) to make predictions


I am training a Transformer model with HuggingFace to predict a target variable (e.g., movie ratings). I'm new to Python and this may be a simple question, but I can't figure out how to save the trained classifier model (in Colab) and then reload it to predict the target variable on new data. For example, I trained a model to predict imdb ratings using the example from the HuggingFace resources, shown below. I have tried several approaches (save_model, save_pretrained) and either could not save the model at all, or, after loading it, did not know what to call to get predictions. I would appreciate guidance on the steps involved in saving, loading, and then producing new predicted scores on test data from the model.

#example mainly from here: https://huggingface.co/transformers/training.html
!pip install transformers
!pip install datasets

from datasets import load_dataset
raw_datasets = load_dataset("imdb")

from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

def tokenize_function(examples):
    return tokenizer(examples["text"], max_length=128, padding="max_length", truncation=True)

tokenized_datasets = raw_datasets.map(tokenize_function, batched=True)

#choosing small datasets for example#
small_train_dataset = tokenized_datasets["train"].shuffle(seed=42).select(range(1000))
small_eval_dataset = tokenized_datasets["test"].shuffle(seed=42).select(range(500))

### TRAINING classification ###
from transformers import AutoModelForSequenceClassification
model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased", num_labels=2)

from transformers import TrainingArguments
from transformers import Trainer

training_args = TrainingArguments("test_trainer", evaluation_strategy="epoch", num_train_epochs=2, weight_decay=.0001, learning_rate=0.00001, per_device_train_batch_size=32)

trainer = Trainer(model=model, args=training_args, train_dataset=small_train_dataset, eval_dataset=small_eval_dataset)
trainer.train()

y_test_predicted_original = model_loaded.predict(small_eval_dataset) # does not work: model_loaded is not defined yet, and a bare model has no .predict() method

#### Saving ###
from google.colab import drive
drive.mount('/content/gdrive')
%cd /content/gdrive/My\ Drive/FOLDER

trainer.save_pretrained("Trained model") # assumed this would save, but Trainer has no save_pretrained() method
model.save_pretrained("Trained model") # this did save

### Loading Model and Creating Predicted Scores ###

#perhaps this....#
from transformers import BertConfig, BertModel
conf = BertConfig.from_pretrained("Trained model", num_labels=2)
model_loaded = AutoModelForSequenceClassification.from_pretrained("Trained model", config=conf)

#or...#
model_loaded = AutoModelForSequenceClassification.from_pretrained("Trained model", local_files_only=True)
model_loaded

#with ultimate goal of getting predicted scores (not sure what to call here)...
y_test_predicted_loaded = model_loaded.predict(small_eval_dataset)

Best Answer

Save the model

trainer.save_model("Trained model")

Load the model

model_loaded = AutoModelForSequenceClassification.from_pretrained("Trained model")
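If the tokenizer was saved into the same directory, it can be reloaded from there as well; a sketch assuming the layout from the saving step above:

from transformers import AutoTokenizer
tokenizer_loaded = AutoTokenizer.from_pretrained("Trained model")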

Predict

trainer = Trainer(model=model_loaded)
test_results = trainer.predict(small_eval_dataset) # or any other tokenized test dataset
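trainer.predict returns a PredictionOutput whose predictions field holds the raw logits for each example. A short sketch of turning those logits into class probabilities and predicted labels (variable names are illustrative):

import numpy as np
from scipy.special import softmax

logits = test_results.predictions      # shape: (num_examples, num_labels)
probs = softmax(logits, axis=1)        # per-class probabilities
y_pred = np.argmax(logits, axis=1)     # predicted class per example (0 or 1)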

Regarding python - Saving and loading a HuggingFace model (Colab) to make predictions, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/67949960/
