
amazon-web-services - AWS DataPipeline raises 'type is not defined in fields' error via CloudFormation

Reposted · Author: 行者123 · Updated: 2023-12-03 07:39:26

I am trying to deploy the Export DynamoDB Table to S3 template via CloudFormation, but CloudFormation returns a `type is not defined in fields` error. All of my PipelineObjects have a Key whose value is `type` except the Default PipelineObject, so I am not sure what the error is referring to. Does anyone have any idea what might be going on here? Thanks!

DataPipeline:
  Type: AWS::DataPipeline::Pipeline
  Properties:
    Name: ddb-export
    ParameterObjects:
      - Attributes:
          - Key: type
            StringValue: String
          - Key: description
            StringValue: Region of the DynamoDB table
          - Key: default
            StringValue: us-west-2
        Id: myDDBRegion
      - Attributes:
          - Key: type
            StringValue: String
          - Key: description
            StringValue: Source DynamoDB table name
        Id: myDDBTableName
      - Attributes:
          - Key: type
            StringValue: Double
          - Key: description
            StringValue: DynamoDB read throughput ratio
          - Key: default
            StringValue: "0.25"
        Id: myDDBReadThroughputRatio
      - Attributes:
          - Key: type
            StringValue: AWS::S3::ObjectKey
          - Key: description
            StringValue: Output S3 folder
        Id: myOutputS3Loc
    Activate: false
    PipelineObjects:
      - Fields:
          - Key: scheduleType
            StringValue: ondemand
          - Key: failureAndRerunMode
            StringValue: CASCADE
          - Key: role
            StringValue: datapipeline-ddb-export
          - Key: resourceRole
            StringValue: datapipeline-ddb-export-resource
        Id: Default
        Name: Default
      - Fields:
          - Key: tableName
            RefValue: "#{myDDBTableName}"
          - Key: type
            StringValue: DynamoDBDataNode
          - Key: readThroughputPercent
            RefValue: "#{myDDBReadThroughputRatio}"
        Id: DDBSourceTable
        Name: DDBSourceTable
      - Fields:
          - Key: type
            StringValue: S3DataNode
          - Key: directoryPath
            StringValue: "#{myOutputS3Loc}/#{format(@scheduledStartTime, 'YYYY-MM-dd-HH-mm-ss')}"
        Id: S3BackupLocation
        Name: S3BackupLocation
      - Fields:
          - Key: type
            StringValue: EmrCluster
          - Key: releaseLabel
            StringValue: emr-5.23.0
          - Key: masterInstanceType
            StringValue: m3.xlarge
          - Key: coreInstanceType
            StringValue: m3.xlarge
          - Key: coreInstanceCount
            StringValue: "1"
          - Key: region
            StringValue: "#{myDDBRegion}"
        Id: EmrClusterForBackup
        Name: EmrClusterForBackup
      - Fields:
          - Key: type
            StringValue: EmrActivity
          - Key: input
            RefValue: DDBSourceTable
          - Key: output
            RefValue: S3BackupLocation
          - Key: runsOn
            RefValue: EmrClusterForBackup
          - Key: resizeClusterBeforeRunning
            StringValue: "true"
          - Key: maximumRetries
            StringValue: "2"
          - Key: step
            StringValue: s3://dynamodb-dpl-#{myDDBRegion}/emr-ddb-storage-handler/4.11.0/emr-dynamodb-tools-4.11.0-SNAPSHOT-jar-with-dependencies.jar,org.apache.hadoop.dynamodb.tools.DynamoDBExport,#{output.directoryPath},#{input.tableName},#{input.readThroughputPercent}
        Id: TableBackupActivity
        Name: TableBackupActivity
  Metadata:
    aws:cdk:path: ddb-export-CELL-dev/DataPipeline

Best Answer

It turns out that the Default pipeline object also needs a `type` field, with the value `Default`.
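Applied to the template above, the Default pipeline object's Fields would then look like this (a sketch of the fix; only the `type` entry is new, the rest is unchanged from the question):

```yaml
- Fields:
    - Key: type
      StringValue: Default   # the missing field that caused 'type is not defined in fields'
    - Key: scheduleType
      StringValue: ondemand
    - Key: failureAndRerunMode
      StringValue: CASCADE
    - Key: role
      StringValue: datapipeline-ddb-export
    - Key: resourceRole
      StringValue: datapipeline-ddb-export-resource
  Id: Default
  Name: Default
```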

Regarding amazon-web-services - AWS DataPipeline raises 'type is not defined in fields' error via CloudFormation, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/73365847/
