
amazon-s3 - How to configure `Terraform` to upload zip files to an `s3` bucket and then deploy them to Lambda

Reposted · Author: 行者123 · Updated: 2023-12-04 12:10:54

I use Terraform as the infrastructure framework for my application. Below is the configuration I use to deploy Python code to Lambda. It consists of three steps: 1. zip all the dependencies and source code into a single zip file; 2. upload the zip file to an s3 bucket; 3. deploy it to a lambda function.

What happens, though, is that the deploy command terraform apply fails with the following error:

Error: Error modifying Lambda Function Code quote-crawler: InvalidParameterValueException: Error occurred while GetObject. S3 Error Code: NoSuchKey. S3 Error Message: The specified key does not exist.
status code: 400, request id: 2db6cb29-8988-474c-8166-f4332d7309de

on config.tf line 48, in resource "aws_lambda_function" "test_lambda":
48: resource "aws_lambda_function" "test_lambda" {



Error: Error modifying Lambda Function Code praw_crawler: InvalidParameterValueException: Error occurred while GetObject. S3 Error Code: NoSuchKey. S3 Error Message: The specified key does not exist.
status code: 400, request id: e01c83cf-40ee-4919-b322-fab84f87d594

on config.tf line 67, in resource "aws_lambda_function" "praw_crawler":
67: resource "aws_lambda_function" "praw_crawler" {


This means the deployment file does not exist in the s3 bucket. But when I run the command a second time, it succeeds. It seems to be a timing issue: right after the zip file is uploaded, it is not yet visible in the s3 bucket, which is why the first deployment fails. A few seconds later, the second run completes quickly and successfully. Is there something wrong with my configuration file?

The Terraform configuration file can be found here: https://github.com/zhaoyi0113/quote-datalake/blob/master/config.tf

Best Answer

You need to add the dependencies properly to achieve this; otherwise, it will fail the way you describe.

First, zip the files:

# Zip the Lambda function on the fly
data "archive_file" "source" {
  type        = "zip"
  source_dir  = "../lambda-functions/loadbalancer-to-es"
  output_path = "../lambda-functions/loadbalancer-to-es.zip"
}

Then upload it to s3, specifying which zip it depends on via source = "${data.archive_file.source.output_path}" — this reference makes the upload depend on the archive:

# Upload the zip to s3; the Lambda function is then updated from s3
resource "aws_s3_bucket_object" "file_upload" {
  bucket = "${aws_s3_bucket.bucket.id}"
  key    = "lambda-functions/loadbalancer-to-es.zip"
  source = "${data.archive_file.source.output_path}" # makes this resource depend on the zip
}
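
As a side note, on AWS provider 4.x and later, aws_s3_bucket_object is deprecated in favor of aws_s3_object. A minimal sketch of the same upload in the newer syntax (assuming the same bucket and archive names as above), with an etag added so that a changed zip forces a re-upload:

# Sketch for AWS provider >= 4.x; resource names assumed from the snippets above
resource "aws_s3_object" "file_upload" {
  bucket = aws_s3_bucket.bucket.id
  key    = "lambda-functions/loadbalancer-to-es.zip"
  source = data.archive_file.source.output_path

  # etag changes whenever the zip's contents change, forcing a re-upload
  etag = filemd5(data.archive_file.source.output_path)
}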

Then you can deploy the Lambda. To make it depend on the upload, this line does the magic: s3_key = "${aws_s3_bucket_object.file_upload.key}"
resource "aws_lambda_function" "elb_logs_to_elasticsearch" {
  function_name = "alb-logs-to-elk"
  description   = "elb-logs-to-elasticsearch"
  s3_bucket     = "${var.env_prefix_name}${var.s3_suffix}"
  s3_key        = "${aws_s3_bucket_object.file_upload.key}" # makes this depend on the uploaded key
  memory_size   = 1024
  timeout       = 900
  timeouts {
    create = "30m"
  }
  runtime = "nodejs8.10"
  role    = "${aws_iam_role.role.arn}"
  # hash of the archive contents; base64sha256(output_path) would hash the path string, not the file
  source_code_hash = "${data.archive_file.source.output_base64sha256}"
  handler          = "index.handler"
}
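
If you prefer the ordering to be explicit rather than implied by the attribute reference, Terraform also supports depends_on. A sketch in modern (0.12+) syntax, with resource names assumed from the snippets above:

resource "aws_lambda_function" "elb_logs_to_elasticsearch" {
  function_name = "alb-logs-to-elk"
  s3_bucket     = aws_s3_bucket.bucket.id
  s3_key        = aws_s3_bucket_object.file_upload.key
  runtime       = "nodejs8.10"
  role          = aws_iam_role.role.arn
  handler       = "index.handler"

  # hash of the archive contents, so Lambda updates when the code changes
  source_code_hash = data.archive_file.source.output_base64sha256

  # redundant here (the s3_key reference already creates the dependency edge),
  # but states the upload-before-deploy ordering explicitly
  depends_on = [aws_s3_bucket_object.file_upload]
}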

Regarding "amazon-s3 - How to configure `Terraform` to upload zip files to an `s3` bucket and then deploy them to Lambda", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/57145037/
