
python - Unit testing a function that uploads a file to S3 using boto3

Reposted · Author: 太空宇宙 · Updated: 2023-11-04 00:41:55

I have this function, which uploads an archive file to an S3 bucket:

def upload_file_to_s3_bucket(self, bucket, file, key, log):
    if not os.path.exists(file):
        log.error("File '%s' does not exist." % file)
        tools.exit_gracefully(log)
    log.info("Uploading file '%s' to bucket '%s' ..." % (file, bucket))
    try:
        self._s3.upload_file(file, bucket, key)
    except botocore.exceptions.ClientError as e:
        log.error("Unexpected uploading error : %s" % e)
        tools.exit_gracefully(log)
    log.info("Uploading finished.")

I want to write a unit test for it; this is what I have so far:

class TestUploadFilesToS3(unittest.TestCase):
    """Unit tests for upload_file_to_s3_bucket."""

    def setUp(self):
        conf.LOG_FILE = "/tmp/test.log"
        conf.BUCKET_OUTPUT = "name.of.the.bucket"
        conf.Conf.get_level_log()
        self.log = logger(conf.LOG_FILE, conf.LEVEL_LOG).logger
        tools.create_workdir(self.log)
        conf.WORKDIR = os.path.join(conf.LOCAL_DIR, "files/output")
        # These must be instance attributes (self.*); as plain locals they
        # would be discarded when setUp returns.
        self.archive = "file_archive.tar.gz"
        self.archivePath = "/tmp/clients/file_archive.tar.gz"
        self._aws = None

    def tearDown(self):
        tools.delete_workdir(self.log)
        os.remove(conf.LOG_FILE)

    def test_upload_file_to_s3_bucket_success(self):
        self._aws.upload_file_to_s3_bucket(conf.BUCKET_OUTPUT, self.archivePath, self.archive, self.log)

For the unit test, I don't know which assert function to use in test_upload_file_to_s3_bucket_success, or what exactly to compare. For example, could I test whether the file's URL exists...? Any ideas? Thanks.
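One way to decide what to assert, using only the standard library: replace the S3 client with a unittest.mock.Mock and assert that upload_file was called with the expected bucket and key. This is only a sketch; the Uploader class below is a hypothetical stand-in for the poster's class, with the logging argument omitted for brevity:

```python
import os
import tempfile
import unittest
from unittest import mock

class Uploader:
    """Hypothetical stand-in for the class that owns upload_file_to_s3_bucket."""
    def __init__(self, s3):
        self._s3 = s3

    def upload_file_to_s3_bucket(self, bucket, file, key):
        if not os.path.exists(file):
            raise FileNotFoundError(file)
        self._s3.upload_file(file, bucket, key)

class TestUploadFilesToS3(unittest.TestCase):
    def test_upload_file_to_s3_bucket_success(self):
        fake_s3 = mock.Mock()  # stands in for the boto3 S3 client
        with tempfile.NamedTemporaryFile(suffix=".tar.gz") as f:
            Uploader(fake_s3).upload_file_to_s3_bucket(
                "name.of.the.bucket", f.name, "file_archive.tar.gz"
            )
            # The assertion: the client was asked to upload exactly this
            # file, to this bucket, under this key.
            fake_s3.upload_file.assert_called_once_with(
                f.name, "name.of.the.bucket", "file_archive.tar.gz"
            )
```

The point of this style of test is that it verifies your function's behavior (the call it makes) without ever touching the network.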

Best Answer

You can mock the interaction with S3 using this library:

https://github.com/spulec/moto

Regarding "python - Unit testing a function that uploads a file to S3 using boto3", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/41565766/
