
amazon-web-services - Error connecting to Redshift from Spark on Databricks


I am trying to connect to Redshift from Spark (running on Databricks):

from pyspark.sql import SQLContext
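# Note: in a Databricks notebook, `sc` and `sqlContext` are pre-defined by the
# runtime, so this import is only needed if you create a SQLContext manually.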

sc._jsc.hadoopConfiguration().set("fs.s3n.awsAccessKeyId", ACCESS_KEY)
sc._jsc.hadoopConfiguration().set("fs.s3n.awsSecretAccessKey", SECRET_KEY)

# IP addresses from Redshift Security Group panel
IP_ADDRESSES_TO_ADD = ["1.2.3.4/32", "5.6.7.8/32"]
PORTS_TO_ADD = ["80", "443"]
PROTOCOLS_TO_ADD = ["tcp"]

# Read data from a query
df = sqlContext.read \
.format("com.databricks.spark.redshift") \
.option("url", "jdbc:redshift://XXX.XXX.eu-west-1.redshift.amazonaws.com:5439/REDSHIFT_DB?user=REDSHIFT_USER&password=REDSHIFT_PW&ssl=true&sslfactory=com.amazon.redshift.ssl.NonValidatingFactory") \
.option("query", "select * FROM REDSHIFT_TABLE LIMIT 10") \
.option("tempdir", "s3n://path/to/temp/") \
.load()

But I get the following error:

java.sql.SQLException: [Amazon](500150) Error setting/closing connection: Connection timed out.

Am I missing something?

Best Answer

This looks like a connectivity error. Please verify that you are an authorized user, i.e. that your client is allowed to reach the Redshift cluster.

To verify this, run the following command:

telnet XXX.XXX.eu-west-1.redshift.amazonaws.com 5439

You should get something like this (if you are an authorized user):

Trying <IP address>...
Connected to <Host name>.
Escape character is '^]'.

But if you get a connection timeout instead, it means you are not an authorized user, i.e. the Redshift security group does not allow connections from your client's IP.
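If telnet is not available where you are testing (for example, from within a Databricks notebook), here is a minimal sketch of the same reachability check in Python; the hostname and port are placeholders taken from the question and should be replaced with your own cluster endpoint:

import socket

# Placeholder endpoint from the question; replace with your cluster's endpoint.
REDSHIFT_HOST = "XXX.XXX.eu-west-1.redshift.amazonaws.com"
REDSHIFT_PORT = 5439  # default Redshift port

try:
    # Open a plain TCP connection. A timeout here usually means the Redshift
    # security group / VPC rules do not allow this client's IP address.
    with socket.create_connection((REDSHIFT_HOST, REDSHIFT_PORT), timeout=10):
        print("TCP connection succeeded - the host and port are reachable")
except socket.timeout:
    print("Connection timed out - check the Redshift security group / VPC settings")
except OSError as e:
    print("Connection failed:", e)

This performs the same test as the telnet command above, only from the machine the Spark driver actually runs on, which is what matters for the security group rules.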

Regarding amazon-web-services - Error connecting to Redshift from Spark on Databricks, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/36547066/
