apache-spark - How do I establish a region/datacenter (DC)-aware connection from Spark to Cassandra?


I am using spark-sql 2.4.1, spark-cassandra-connector_2.11-2.4.1.jar, and Java 8. When I try to fetch data from a table, I run into:

java.io.IOException: Failed to write statements to keyspace1.model_vals. The
latest exception was
An unexpected error occurred server side on cassandra-node1: com.google.common.util.concurrent.UncheckedExecutionException: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: org.apache.cassandra.exceptions.ReadTimeoutException: Operation timed out - received only 0 responses.

So how do I establish a region/datacenter (DC)-aware connection to the Cassandra database from my Spark code?

YAML

Existing configuration:

spring:
  data:
    cassandra:
      keyspace-name: raproduct
      contact-points:
        - cassandra-node1
        - cassandra-node2
      port: 9042

Changed to:

spring:
  data:
    cassandra:
      connection:
        local_dc: southeast-1
      keyspace-name: raproduct
      contact-points:
        - cassandra-node1
        - cassandra-node2
      port: 9042

Question

But the changed local_dc value is not reflected/applied. How can this be done in spring-data?
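(As an aside on the spring-data part: the connection.local_dc key above is not a property Spring Boot binds, which would explain why the change has no effect. A minimal sketch, assuming a newer stack than the one in the question — Spring Boot 2.3+ with the 4.x DataStax driver — where the local datacenter is exposed as spring.data.cassandra.local-datacenter:

spring:
  data:
    cassandra:
      local-datacenter: southeast-1   # must match the DC name reported by nodetool status
      keyspace-name: raproduct
      contact-points:
        - cassandra-node1
        - cassandra-node2
      port: 9042

On older Spring Boot versions built on the 3.x driver, which is what spark-cassandra-connector 2.4.1 pairs with, no such property exists; configuring the load-balancing policy through a ClusterBuilderCustomizer bean would be needed instead.)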

Best Answer

Check the Spark Connector documentation, under Configuration Reference - Cassandra Connection Parameters. This appears to be done by setting the spark.cassandra.connection.local_dc property in the connection configuration:

import org.apache.spark.{SparkConf, SparkContext}

// Point the connector at the cluster and pin it to the local datacenter.
val conf = new SparkConf(true)
  .set("spark.cassandra.connection.host", "192.168.1.10")
  .set("spark.cassandra.auth.username", "flynn")
  .set("spark.cassandra.auth.password", "reindeerFlotilla82")
  // The DC name must match the datacenter as reported by nodetool status.
  .set("spark.cassandra.connection.local_dc", "encom_west1_dc")

val sc = new SparkContext("spark://192.168.1.133:7077", "test", conf)

I'm not sure what your connection configuration code looks like, but try setting the spark.cassandra.connection.local_dc property and see what happens.
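Since the question uses spark-sql, here is a minimal sketch of the same setting applied through a SparkSession, assuming the datacenter really is named southeast-1 (taken from the question's YAML) and that spark-cassandra-connector 2.4.1 is on the classpath; the host and table names are the question's own placeholders:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("dc-aware-cassandra-read")
  .config("spark.cassandra.connection.host", "cassandra-node1")
  // Pin connector traffic to the local datacenter, named as in nodetool status:
  .config("spark.cassandra.connection.local_dc", "southeast-1")
  .getOrCreate()

// Read through the connector's DataSource; with local_dc set, nodes in the
// configured local DC are preferred as coordinators.
val df = spark.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "keyspace1", "table" -> "model_vals"))
  .load()

df.show()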

Regarding "apache-spark - How do I establish a region/datacenter (DC)-aware connection from Spark to Cassandra?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/58095195/
