
amazon-s3 - Upload an entire Bitbucket repository to S3 using Bitbucket Pipelines


I'm using Bitbucket Pipelines. I want it to push the entire contents of my repo (it's very small) to S3. I don't want to zip it up, push it to S3, and then unzip things there. I just want it to take the existing file/folder structure in my Bitbucket repo and push that to S3.

What should the yaml file and .py file look like to accomplish this?

Here is the current yaml file:

image: python:3.5.1

pipelines:
  branches:
    master:
      - step:
          script:
            # - apt-get update # required to install zip
            # - apt-get install -y zip # required if you want to zip repository objects
            - pip install boto3==1.3.0 # required for s3_upload.py
            # the first argument is the name of the existing S3 bucket to upload the artefact to
            # the second argument is the artefact to be uploaded
            # the third argument is the bucket key
            # html files
            - python s3_upload.py my-bucket-name html/index_template.html html/index_template.html # run the deployment script
            # Example command line parameters. Replace with your values
            #- python s3_upload.py bb-s3-upload SampleApp_Linux.zip SampleApp_Linux # run the deployment script

And here is my current python:
from __future__ import print_function
import os
import sys
import argparse
import boto3
from botocore.exceptions import ClientError

def upload_to_s3(bucket, artefact, bucket_key):
    """
    Uploads an artefact to Amazon S3
    """
    try:
        client = boto3.client('s3')
    except ClientError as err:
        print("Failed to create boto3 client.\n" + str(err))
        return False
    try:
        # Open inside a with-block so the file handle is always closed
        with open(artefact, 'rb') as body:
            client.put_object(
                Body=body,
                Bucket=bucket,
                Key=bucket_key
            )
    except ClientError as err:
        print("Failed to upload artefact to S3.\n" + str(err))
        return False
    except IOError as err:
        print("Failed to access artefact in this directory.\n" + str(err))
        return False
    return True


def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("bucket", help="Name of the existing S3 bucket")
    parser.add_argument("artefact", help="Name of the artefact to be uploaded to S3")
    parser.add_argument("bucket_key", help="Name of the S3 bucket key")
    args = parser.parse_args()

    if not upload_to_s3(args.bucket, args.artefact, args.bucket_key):
        sys.exit(1)

if __name__ == "__main__":
    main()

This requires me to list every file in the repo as a separate command in the yaml file. I just want it to grab everything and upload it to S3.
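
One way to do that with the same boto3 approach is to walk the working directory and upload every file, keying each object by its path relative to the repo root. The sketch below is my own, not a tested solution from the thread: the upload_directory name and the exclusion of the .git directory are assumptions.

from __future__ import print_function
import os
import sys
import boto3
from botocore.exceptions import ClientError

def upload_directory(bucket, root='.'):
    """Walk root and upload every file, keyed by its path relative to root."""
    client = boto3.client('s3')
    for dirpath, dirnames, filenames in os.walk(root):
        # Prune repository metadata (assumption: .git should not be deployed)
        dirnames[:] = [d for d in dirnames if d != '.git']
        for filename in filenames:
            path = os.path.join(dirpath, filename)
            # Use forward slashes so keys mirror the repo layout on S3
            key = os.path.relpath(path, root).replace(os.sep, '/')
            try:
                with open(path, 'rb') as body:
                    client.put_object(Body=body, Bucket=bucket, Key=key)
            except (ClientError, IOError) as err:
                print("Failed to upload " + path + "\n" + str(err))
                return False
    return True

if __name__ == "__main__":
    if not upload_directory(sys.argv[1]):
        sys.exit(1)

The pipeline step would then shrink to a single command such as python s3_upload_all.py my-bucket-name (the script name is hypothetical).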

Best answer

You can switch to using this Docker image: https://hub.docker.com/r/abesiyo/s3/

It runs quite well.

bitbucket-pipelines.yml

image: abesiyo/s3

pipelines:
  default:
    - step:
        script:
          - s3 --region "us-east-1" rm s3://<bucket name>
          - s3 --region "us-east-1" sync . s3://<bucket name>

Also set these environment variables in Bitbucket Pipelines:
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
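
If you would rather avoid a third-party image, an equivalent pipeline can be built with the official AWS CLI. This is only a sketch under my own assumptions (the python base image, the pip install of awscli, and the .git exclusion are not part of the answer); the CLI reads the same two environment variables:

image: python:3.5.1

pipelines:
  default:
    - step:
        script:
          - pip install awscli
          # --delete removes bucket objects that no longer exist in the repo,
          # mirroring the rm + sync pair above
          - aws s3 sync . s3://<bucket name> --region "us-east-1" --delete --exclude ".git/*"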

Regarding amazon-s3 - uploading an entire Bitbucket repository to S3 using Bitbucket Pipelines, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/39538894/
