
azure - Apache Flink Operator - enable azure-fs-hadoop


I am trying to run a Flink job on k8s with the Flink Operator (https://github.com/apache/flink-kubernetes-operator). The job uses a connection to Azure Blob Storage as described here: https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/deployment/filesystems/azure/

Following that guide, I need to copy the jar file flink-azure-fs-hadoop-1.15.0.jar from one directory to another.
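Concretely, the documented step boils down to copying flink-azure-fs-hadoop-1.15.0.jar from Flink's opt/ directory into a plugins/azure-fs-hadoop/ directory. One way this could be attempted on Kubernetes without building a custom image is an init container that stages the jar into a volume mounted at the plugin path. The following is only a rough sketch of that idea (untested; the source path assumes the stock flink:1.15 image layout, and the init container and volume names are purely illustrative):

# Sketch only: stage the bundled Azure filesystem jar into the plugins directory
# via an init container and a shared emptyDir (names and paths are illustrative).
podTemplate:
  apiVersion: v1
  kind: Pod
  spec:
    initContainers:
      - name: copy-azure-fs-plugin         # hypothetical init container name
        image: flink:1.15
        command:
          - "cp"
          - "/opt/flink/opt/flink-azure-fs-hadoop-1.15.0.jar"
          - "/plugins-staging/"
        volumeMounts:
          - name: azure-fs-plugin           # hypothetical volume name
            mountPath: /plugins-staging
    containers:
      - name: flink-main-container
        volumeMounts:
          - name: azure-fs-plugin
            mountPath: /opt/flink/plugins/azure-fs-hadoop
    volumes:
      - name: azure-fs-plugin
        emptyDir: { }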

I have tried to do this via the podTemplate and the command feature, but unfortunately it does not work, and the file does not appear in the target directory.

Could you guide me on how to do this correctly? You can find my FlinkDeployment file below.

apiVersion: flink.apache.org/v1beta1
kind: FlinkDeployment
metadata:
  namespace: flink
  name: basic-example
spec:
  image: flink:1.15
  flinkVersion: v1_15
  flinkConfiguration:
    taskmanager.numberOfTaskSlots: "2"
  serviceAccount: flink
  podTemplate:
    apiVersion: v1
    kind: Pod
    metadata:
      name: pod-template
    spec:
      serviceAccount: flink
      containers:
        - name: flink-main-container
          volumeMounts:
            - mountPath: /opt/flink/data
              name: flink-data
          # command:
          #   - "touch"
          #   - "/tmp/test.txt"
      volumes:
        - name: flink-data
          emptyDir: { }
  jobManager:
    resource:
      memory: "2048m"
      cpu: 1
    podTemplate:
      apiVersion: v1
      kind: Pod
      metadata:
        name: job-manager-pod-template
      spec:
        initContainers:
          - name: fetch-jar
            image: cirrusci/wget
            volumeMounts:
              - mountPath: /opt/flink/data
                name: flink-data
            command:
              - "wget"
              - "LINK_TO_CUSTOM_JAR_FILE_ON_AZURE_BLOB_STORAGE"
              - "-O"
              - "/opt/flink/data/test.jar"
        containers:
          - name: flink-main-container
            command:
              - "touch"
              - "/tmp/test.txt"
  taskManager:
    resource:
      memory: "2048m"
      cpu: 1
  job:
    jarURI: local:///opt/flink/data/test.jar
    parallelism: 2
    upgradeMode: stateless
    state: running
  ingress:
    template: "CUSTOM_LINK_TO_AZURE"
    annotations:
      cert-manager.io/cluster-issuer: letsencrypt
      kubernetes.io/ingress.allow-http: 'false'
      traefik.ingress.kubernetes.io/router.entrypoints: websecure
      traefik.ingress.kubernetes.io/router.tls: 'true'
      traefik.ingress.kubernetes.io/router.tls.options: default

Best answer

Since you are using the stock Flink 1.15 image, the Azure filesystem plugin is already bundled with it. You can enable it by setting the ENABLE_BUILT_IN_PLUGINS environment variable:

spec:
  podTemplate:
    spec:
      containers:
        # Do not change the main container name
        - name: flink-main-container
          env:
            - name: ENABLE_BUILT_IN_PLUGINS
              value: flink-azure-fs-hadoop-1.15.0.jar
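Applied to the deployment from the question, this would mean adding the env entry to the existing flink-main-container in the top-level podTemplate, roughly as in the sketch below (merged with the volume mount already defined there):

podTemplate:
  apiVersion: v1
  kind: Pod
  metadata:
    name: pod-template
  spec:
    serviceAccount: flink
    containers:
      - name: flink-main-container
        env:
          # Ask the stock image's entrypoint to activate the bundled plugin
          - name: ENABLE_BUILT_IN_PLUGINS
            value: flink-azure-fs-hadoop-1.15.0.jar
        volumeMounts:
          - mountPath: /opt/flink/data
            name: flink-data
    volumes:
      - name: flink-data
        emptyDir: { }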

https://nightlies.apache.org/flink/flink-docs-master/docs/deployment/resource-providers/standalone/docker/#using-filesystem-plugins
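Note that enabling the plugin only loads the filesystem; the Azure credentials still need to be supplied, as described in the Azure filesystem documentation linked in the question. A minimal sketch of what that could look like in flinkConfiguration, with the storage account name and key as placeholders:

flinkConfiguration:
  taskmanager.numberOfTaskSlots: "2"
  # Access key for the storage account; STORAGE_ACCOUNT_NAME and AZURE_STORAGE_KEY are placeholders.
  fs.azure.account.key.STORAGE_ACCOUNT_NAME.blob.core.windows.net: AZURE_STORAGE_KEY

With that in place, paths of the form wasbs://CONTAINER@STORAGE_ACCOUNT_NAME.blob.core.windows.net/... should be resolvable by the azure-fs-hadoop plugin.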

Regarding "azure - Apache Flink Operator - enable azure-fs-hadoop", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/72826712/
