
python-3.x - How to store a machine learning model pickle to Azure Blob Storage and retrieve it

Reposted. Author: 行者123. Updated: 2023-12-02 07:23:32

I have created a pickle from my machine learning model, and it is saved locally. I want to push it to Azure Blob Storage and retrieve it later. How can I do this with Python 3? Please help.

'''

# train a simple regression model
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
import pickle

X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.3, random_state=100)
regressor = LinearRegression()
regressor.fit(X_train, Y_train)

# creating the pickle file finalized_model.sav on my local machine
mypickle = 'finalized_model.sav'
pickle.dump(regressor, open(mypickle, 'wb'))

'''

I tried the following for importing a .csv from Azure and pushing one back, but I don't know how to do this with pickle files.

'''

with BytesIO() as input_blob:
    block_blob_service = BlockBlobService(account_name='*****',
                                          account_key='*********************************************************************')
    # download the blob into the in-memory stream and read it as a DataFrame
    block_blob_service.get_blob_to_stream('blobcontainer', 'claims.csv', input_blob)
    input_blob.seek(0)
    dataframe_blobdata = pd.read_csv(input_blob)

    # transforming the data in between

    output = dataframe_blobdata.to_csv(index_label="idx", encoding="utf-8")

    block_blob_service.create_blob_from_text('secondforblobcontainer', 'OutFilePy.csv', output)

'''
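The same in-memory-stream pattern from the CSV attempt above should carry over to pickle files with the legacy `BlockBlobService` (azure-storage 2.x SDK): `pickle.dump` into a `BytesIO`, rewind, upload with `create_blob_from_stream`; downloading reverses it with `get_blob_to_stream` plus `pickle.load`. A minimal sketch, assuming an already-constructed `block_blob_service` and placeholder container/blob names:

```python
import pickle
from io import BytesIO

def upload_pickle_legacy(block_blob_service, container, blob_name, model):
    """Pickle a model into an in-memory stream and upload it as a blob."""
    stream = BytesIO()
    pickle.dump(model, stream)
    stream.seek(0)  # rewind so the upload reads from the beginning
    block_blob_service.create_blob_from_stream(container, blob_name, stream)

def download_pickle_legacy(block_blob_service, container, blob_name):
    """Download a pickled blob into memory and return the unpickled object."""
    stream = BytesIO()
    block_blob_service.get_blob_to_stream(container, blob_name, stream)
    stream.seek(0)  # rewind before unpickling
    return pickle.load(stream)
```

This avoids writing a temporary file to disk; the model goes straight from memory to the blob and back.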

Best Answer

As I understand it, you just want to upload a pickle file named finalized_model.sav to Azure Storage.

In that case, I suggest you use the azure-storage-blob SDK to upload the blob. Here is the official sample: Code examples

Specifically, first get your storage account's connection string from the portal, then use it to create a BlobServiceClient. After that, you can create a container with blob_service_client.create_container(container_name). Finally, create a blob client and upload the local file by its path.
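The steps above can be sketched with the v12 azure-storage-blob SDK (`pip install azure-storage-blob`). The connection string, container name, and blob name below are placeholders you must fill in from your own storage account:

```python
import pickle

def serialize_model(model) -> bytes:
    """Serialize a fitted model to bytes with pickle."""
    return pickle.dumps(model)

def upload_model(connection_string, container_name, blob_name, model):
    """Upload a pickled model to Azure Blob Storage (placeholder names)."""
    # requires: pip install azure-storage-blob
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(connection_string)
    container = service.get_container_client(container_name)
    if not container.exists():
        container.create_container()  # create_container raises if it already exists
    container.upload_blob(name=blob_name,
                          data=serialize_model(model),
                          overwrite=True)
```

For example, `upload_model(conn_str, "modelcontainer", "finalized_model.sav", regressor)` would push the fitted regressor from the question straight from memory, with no local .sav file needed.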

Downloading it back from Azure Storage is also easy. The sample is downloading blobs:

with open(download_file_path, "wb") as download_file:
    download_file.write(blob_client.download_blob().readall())
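The two lines above save the blob to a local file. If you want the model object back directly, you can unpickle the downloaded bytes in memory; `blob_client` here is assumed to be an already-constructed `azure.storage.blob.BlobClient` pointing at the model blob (e.g. finalized_model.sav):

```python
import pickle

def load_model(blob_client):
    """Download a pickled model blob and return the live Python object."""
    data = blob_client.download_blob().readall()  # bytes of the pickle
    return pickle.loads(data)
```

After `regressor = load_model(blob_client)`, the fitted model can be used immediately, e.g. `regressor.predict(X_test)`. Note that unpickling runs arbitrary code, so only load blobs you trust.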

Regarding "python-3.x - How to store a machine learning model pickle to Azure Blob Storage and retrieve it", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/60287568/
