I want to connect to Delta over JDBC, and I'd like to run the Spark Thrift Server (STS) in local mode to kick the tires.
I start STS with the following command:
$SPARK_HOME/sbin/start-thriftserver.sh \
--conf spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension \
--conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog \
--packages 'io.delta:delta-core_2.12:1.0.0'
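For reference, an equivalent launch that makes the local master and the Thrift listening port explicit (both are standard spark-submit / start-thriftserver.sh options; the values below are just the defaults I am relying on) would look roughly like this:

$SPARK_HOME/sbin/start-thriftserver.sh \
  --master local[*] \
  --hiveconf hive.server2.thrift.port=10000 \
  --conf spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension \
  --conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog \
  --packages 'io.delta:delta-core_2.12:1.0.0'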
When I try to connect with beeline, I get the following error:
Beeline version 2.3.7 by Apache Hive
beeline> !connect jdbc:hive2://localhost:10000
Connecting to jdbc:hive2://localhost:10000
Enter username for jdbc:hive2://localhost:10000:
Enter password for jdbc:hive2://localhost:10000:
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000: Failed to open new session: org.apache.spark.SparkException: Cannot find catalog plugin class for catalog 'spark_catalog': org.apache.spark.sql.delta.catalog.DeltaCatalog (state=08S01,code=0)
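The same connection can also be made non-interactively using beeline's standard -u/-n options (nothing specific to this setup):

beeline -u jdbc:hive2://localhost:10000 -n anonymous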
Looking at the Thrift Server logs, I don't see anything obviously wrong; the Delta jar files are loaded without any errors.
Spark Command: /Users/sandbox/.sdkman/candidates/java/8.0.282.j9-adpt/bin/java -cp /Users/sandbox/projects/delta/spark-3.1.2-bin-hadoop3.2/conf/:/Users/sandbox/projects/delta/spark-3.1.2-bin-hadoop3.2/jars/* -Xmx1g org.apache.spark.deploy.SparkSubmit --conf spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension --conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 --name Thrift JDBC/ODBC Server --packages io.delta:delta-core_2.12:1.0.0 spark-internal
========================================
:: loading settings :: url = jar:file:/Users/sandbox/projects/delta/spark-3.1.2-bin-hadoop3.2/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
Ivy Default Cache set to: /Users/sandbox/.ivy2/cache
The jars for the packages stored in: /Users/sandbox/.ivy2/jars
io.delta#delta-core_2.12 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-8f65b233-668d-46ea-98d6-0da629b12d1e;1.0
confs: [default]
found io.delta#delta-core_2.12;1.0.0 in central
found org.antlr#antlr4;4.7 in central
found org.antlr#antlr4-runtime;4.7 in central
found org.antlr#antlr-runtime;3.5.2 in central
found org.antlr#ST4;4.0.8 in central
found org.abego.treelayout#org.abego.treelayout.core;1.0.3 in central
found org.glassfish#javax.json;1.0.4 in central
found com.ibm.icu#icu4j;58.2 in central
:: resolution report :: resolve 385ms :: artifacts dl 13ms
:: modules in use:
com.ibm.icu#icu4j;58.2 from central in [default]
io.delta#delta-core_2.12;1.0.0 from central in [default]
org.abego.treelayout#org.abego.treelayout.core;1.0.3 from central in [default]
org.antlr#ST4;4.0.8 from central in [default]
org.antlr#antlr-runtime;3.5.2 from central in [default]
org.antlr#antlr4;4.7 from central in [default]
org.antlr#antlr4-runtime;4.7 from central in [default]
org.glassfish#javax.json;1.0.4 from central in [default]
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 8 | 0 | 0 | 0 || 8 | 0 |
---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent-8f65b233-668d-46ea-98d6-0da629b12d1e
confs: [default]
0 artifacts copied, 8 already retrieved (0kB/13ms)
21/11/06 07:28:30 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/11/06 07:28:30 INFO HiveThriftServer2: Started daemon with process name: 75591@MacBook-Pro
21/11/06 07:28:30 INFO SignalUtils: Registering signal handler for TERM
21/11/06 07:28:30 INFO SignalUtils: Registering signal handler for HUP
21/11/06 07:28:30 INFO SignalUtils: Registering signal handler for INT
21/11/06 07:28:30 INFO HiveThriftServer2: Starting SparkContext
21/11/06 07:28:31 INFO HiveConf: Found configuration file null
21/11/06 07:28:31 INFO SparkContext: Running Spark version 3.1.2
21/11/06 07:28:31 INFO ResourceUtils: ==============================================================
21/11/06 07:28:31 INFO ResourceUtils: No custom resources configured for spark.driver.
21/11/06 07:28:31 INFO ResourceUtils: ==============================================================
21/11/06 07:28:31 INFO SparkContext: Submitted application: Thrift JDBC/ODBC Server
21/11/06 07:28:31 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
21/11/06 07:28:31 INFO ResourceProfile: Limiting resource is cpu
21/11/06 07:28:31 INFO ResourceProfileManager: Added ResourceProfile id: 0
...
21/11/06 07:28:31 INFO Utils: Successfully started service 'sparkDriver' on port 55507.
...
21/11/06 07:28:32 INFO Utils: Successfully started service 'SparkUI' on port 4040.
21/11/06 07:28:32 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://macbook-pro:4040
21/11/06 07:28:32 INFO SparkContext: Added JAR file:///Users/sandbox/.ivy2/jars/io.delta_delta-core_2.12-1.0.0.jar at spark://macbook-pro:55507/jars/io.delta_delta-core_2.12-1.0.0.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO SparkContext: Added JAR file:///Users/sandbox/.ivy2/jars/org.antlr_antlr4-4.7.jar at spark://macbook-pro:55507/jars/org.antlr_antlr4-4.7.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO SparkContext: Added JAR file:///Users/sandbox/.ivy2/jars/org.antlr_antlr4-runtime-4.7.jar at spark://macbook-pro:55507/jars/org.antlr_antlr4-runtime-4.7.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO SparkContext: Added JAR file:///Users/sandbox/.ivy2/jars/org.antlr_antlr-runtime-3.5.2.jar at spark://macbook-pro:55507/jars/org.antlr_antlr-runtime-3.5.2.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO SparkContext: Added JAR file:///Users/sandbox/.ivy2/jars/org.antlr_ST4-4.0.8.jar at spark://macbook-pro:55507/jars/org.antlr_ST4-4.0.8.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO SparkContext: Added JAR file:///Users/sandbox/.ivy2/jars/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar at spark://macbook-pro:55507/jars/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO SparkContext: Added JAR file:///Users/sandbox/.ivy2/jars/org.glassfish_javax.json-1.0.4.jar at spark://macbook-pro:55507/jars/org.glassfish_javax.json-1.0.4.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO SparkContext: Added JAR file:///Users/sandbox/.ivy2/jars/com.ibm.icu_icu4j-58.2.jar at spark://macbook-pro:55507/jars/com.ibm.icu_icu4j-58.2.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO SparkContext: Added file file:///Users/sandbox/.ivy2/jars/io.delta_delta-core_2.12-1.0.0.jar at file:///Users/sandbox/.ivy2/jars/io.delta_delta-core_2.12-1.0.0.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO Utils: Copying /Users/sandbox/.ivy2/jars/io.delta_delta-core_2.12-1.0.0.jar to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/io.delta_delta-core_2.12-1.0.0.jar
21/11/06 07:28:32 INFO SparkContext: Added file file:///Users/sandbox/.ivy2/jars/org.antlr_antlr4-4.7.jar at file:///Users/sandbox/.ivy2/jars/org.antlr_antlr4-4.7.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO Utils: Copying /Users/sandbox/.ivy2/jars/org.antlr_antlr4-4.7.jar to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.antlr_antlr4-4.7.jar
21/11/06 07:28:32 INFO SparkContext: Added file file:///Users/sandbox/.ivy2/jars/org.antlr_antlr4-runtime-4.7.jar at file:///Users/sandbox/.ivy2/jars/org.antlr_antlr4-runtime-4.7.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO Utils: Copying /Users/sandbox/.ivy2/jars/org.antlr_antlr4-runtime-4.7.jar to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.antlr_antlr4-runtime-4.7.jar
21/11/06 07:28:32 INFO SparkContext: Added file file:///Users/sandbox/.ivy2/jars/org.antlr_antlr-runtime-3.5.2.jar at file:///Users/sandbox/.ivy2/jars/org.antlr_antlr-runtime-3.5.2.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO Utils: Copying /Users/sandbox/.ivy2/jars/org.antlr_antlr-runtime-3.5.2.jar to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.antlr_antlr-runtime-3.5.2.jar
21/11/06 07:28:32 INFO SparkContext: Added file file:///Users/sandbox/.ivy2/jars/org.antlr_ST4-4.0.8.jar at file:///Users/sandbox/.ivy2/jars/org.antlr_ST4-4.0.8.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO Utils: Copying /Users/sandbox/.ivy2/jars/org.antlr_ST4-4.0.8.jar to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.antlr_ST4-4.0.8.jar
21/11/06 07:28:32 INFO SparkContext: Added file file:///Users/sandbox/.ivy2/jars/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar at file:///Users/sandbox/.ivy2/jars/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO Utils: Copying /Users/sandbox/.ivy2/jars/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar
21/11/06 07:28:32 INFO SparkContext: Added file file:///Users/sandbox/.ivy2/jars/org.glassfish_javax.json-1.0.4.jar at file:///Users/sandbox/.ivy2/jars/org.glassfish_javax.json-1.0.4.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO Utils: Copying /Users/sandbox/.ivy2/jars/org.glassfish_javax.json-1.0.4.jar to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.glassfish_javax.json-1.0.4.jar
21/11/06 07:28:32 INFO SparkContext: Added file file:///Users/sandbox/.ivy2/jars/com.ibm.icu_icu4j-58.2.jar at file:///Users/sandbox/.ivy2/jars/com.ibm.icu_icu4j-58.2.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO Utils: Copying /Users/sandbox/.ivy2/jars/com.ibm.icu_icu4j-58.2.jar to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/com.ibm.icu_icu4j-58.2.jar
21/11/06 07:28:32 INFO Executor: Starting executor ID driver on host macbook-pro
21/11/06 07:28:32 INFO Executor: Fetching file:///Users/sandbox/.ivy2/jars/org.antlr_antlr4-runtime-4.7.jar with timestamp 1636183711298
21/11/06 07:28:32 INFO Utils: /Users/sandbox/.ivy2/jars/org.antlr_antlr4-runtime-4.7.jar has been previously copied to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.antlr_antlr4-runtime-4.7.jar
21/11/06 07:28:32 INFO Executor: Fetching file:///Users/sandbox/.ivy2/jars/com.ibm.icu_icu4j-58.2.jar with timestamp 1636183711298
21/11/06 07:28:33 INFO Utils: /Users/sandbox/.ivy2/jars/com.ibm.icu_icu4j-58.2.jar has been previously copied to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/com.ibm.icu_icu4j-58.2.jar
21/11/06 07:28:33 INFO Executor: Fetching file:///Users/sandbox/.ivy2/jars/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar with timestamp 1636183711298
21/11/06 07:28:33 INFO Utils: /Users/sandbox/.ivy2/jars/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar has been previously copied to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar
21/11/06 07:28:33 INFO Executor: Fetching file:///Users/sandbox/.ivy2/jars/org.glassfish_javax.json-1.0.4.jar with timestamp 1636183711298
21/11/06 07:28:33 INFO Utils: /Users/sandbox/.ivy2/jars/org.glassfish_javax.json-1.0.4.jar has been previously copied to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.glassfish_javax.json-1.0.4.jar
21/11/06 07:28:33 INFO Executor: Fetching file:///Users/sandbox/.ivy2/jars/org.antlr_antlr-runtime-3.5.2.jar with timestamp 1636183711298
21/11/06 07:28:33 INFO Utils: /Users/sandbox/.ivy2/jars/org.antlr_antlr-runtime-3.5.2.jar has been previously copied to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.antlr_antlr-runtime-3.5.2.jar
21/11/06 07:28:33 INFO Executor: Fetching file:///Users/sandbox/.ivy2/jars/org.antlr_antlr4-4.7.jar with timestamp 1636183711298
21/11/06 07:28:33 INFO Utils: /Users/sandbox/.ivy2/jars/org.antlr_antlr4-4.7.jar has been previously copied to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.antlr_antlr4-4.7.jar
21/11/06 07:28:33 INFO Executor: Fetching file:///Users/sandbox/.ivy2/jars/org.antlr_ST4-4.0.8.jar with timestamp 1636183711298
21/11/06 07:28:33 INFO Utils: /Users/sandbox/.ivy2/jars/org.antlr_ST4-4.0.8.jar has been previously copied to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.antlr_ST4-4.0.8.jar
21/11/06 07:28:33 INFO Executor: Fetching file:///Users/sandbox/.ivy2/jars/io.delta_delta-core_2.12-1.0.0.jar with timestamp 1636183711298
21/11/06 07:28:33 INFO Utils: /Users/sandbox/.ivy2/jars/io.delta_delta-core_2.12-1.0.0.jar has been previously copied to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/io.delta_delta-core_2.12-1.0.0.jar
21/11/06 07:28:33 INFO Executor: Fetching spark://macbook-pro:55507/jars/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar with timestamp 1636183711298
21/11/06 07:28:33 INFO TransportClientFactory: Successfully created connection to MacBook-Pro/192.168.1.207:55507 after 53 ms (0 ms spent in bootstraps)
21/11/06 07:28:33 INFO Utils: Fetching spark://macbook-pro:55507/jars/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/fetchFileTemp7179925225828426556.tmp
21/11/06 07:28:33 INFO Utils: /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/fetchFileTemp7179925225828426556.tmp has been previously copied to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar
21/11/06 07:28:33 INFO Executor: Adding file:/private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar to class loader
21/11/06 07:28:33 INFO Executor: Fetching spark://macbook-pro:55507/jars/org.glassfish_javax.json-1.0.4.jar with timestamp 1636183711298
21/11/06 07:28:33 INFO Utils: Fetching spark://macbook-pro:55507/jars/org.glassfish_javax.json-1.0.4.jar to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/fetchFileTemp6953292115063484853.tmp
21/11/06 07:28:33 INFO Utils: /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/fetchFileTemp6953292115063484853.tmp has been previously copied to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.glassfish_javax.json-1.0.4.jar
21/11/06 07:28:33 INFO Executor: Adding file:/private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.glassfish_javax.json-1.0.4.jar to class loader
21/11/06 07:28:33 INFO Executor: Fetching spark://macbook-pro:55507/jars/org.antlr_ST4-4.0.8.jar with timestamp 1636183711298
21/11/06 07:28:33 INFO Utils: Fetching spark://macbook-pro:55507/jars/org.antlr_ST4-4.0.8.jar to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/fetchFileTemp7481622034847054241.tmp
21/11/06 07:28:33 INFO Utils: /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/fetchFileTemp7481622034847054241.tmp has been previously copied to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.antlr_ST4-4.0.8.jar
21/11/06 07:28:33 INFO Executor: Adding file:/private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.antlr_ST4-4.0.8.jar to class loader
21/11/06 07:28:33 INFO Executor: Fetching spark://macbook-pro:55507/jars/org.antlr_antlr4-runtime-4.7.jar with timestamp 1636183711298
21/11/06 07:28:33 INFO Utils: Fetching spark://macbook-pro:55507/jars/org.antlr_antlr4-runtime-4.7.jar to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/fetchFileTemp8137443686268894230.tmp
21/11/06 07:28:33 INFO Utils: /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/fetchFileTemp8137443686268894230.tmp has been previously copied to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.antlr_antlr4-runtime-4.7.jar
21/11/06 07:28:33 INFO Executor: Adding file:/private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.antlr_antlr4-runtime-4.7.jar to class loader
21/11/06 07:28:33 INFO Executor: Fetching spark://macbook-pro:55507/jars/org.antlr_antlr4-4.7.jar with timestamp 1636183711298
21/11/06 07:28:33 INFO Utils: Fetching spark://macbook-pro:55507/jars/org.antlr_antlr4-4.7.jar to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/fetchFileTemp7474775280611839356.tmp
21/11/06 07:28:33 INFO Utils: /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/fetchFileTemp7474775280611839356.tmp has been previously copied to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.antlr_antlr4-4.7.jar
21/11/06 07:28:33 INFO Executor: Adding file:/private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.antlr_antlr4-4.7.jar to class loader
21/11/06 07:28:33 INFO Executor: Fetching spark://macbook-pro:55507/jars/io.delta_delta-core_2.12-1.0.0.jar with timestamp 1636183711298
21/11/06 07:28:33 INFO Utils: Fetching spark://macbook-pro:55507/jars/io.delta_delta-core_2.12-1.0.0.jar to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/fetchFileTemp7899823112197045994.tmp
21/11/06 07:28:33 INFO Utils: /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/fetchFileTemp7899823112197045994.tmp has been previously copied to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/io.delta_delta-core_2.12-1.0.0.jar
21/11/06 07:28:33 INFO Executor: Adding file:/private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/io.delta_delta-core_2.12-1.0.0.jar to class loader
21/11/06 07:28:33 INFO Executor: Fetching spark://macbook-pro:55507/jars/org.antlr_antlr-runtime-3.5.2.jar with timestamp 1636183711298
21/11/06 07:28:33 INFO Utils: Fetching spark://macbook-pro:55507/jars/org.antlr_antlr-runtime-3.5.2.jar to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/fetchFileTemp1438945171356535536.tmp
21/11/06 07:28:33 INFO Executor: Adding file:/private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/org.antlr_antlr-runtime-3.5.2.jar to class loader
21/11/06 07:28:33 INFO Executor: Fetching spark://macbook-pro:55507/jars/com.ibm.icu_icu4j-58.2.jar with timestamp 1636183711298
21/11/06 07:28:33 INFO Utils: Fetching spark://macbook-pro:55507/jars/com.ibm.icu_icu4j-58.2.jar to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/fetchFileTemp836117221049857229.tmp
21/11/06 07:28:33 INFO Utils: /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/fetchFileTemp836117221049857229.tmp has been previously copied to /private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/com.ibm.icu_icu4j-58.2.jar
21/11/06 07:28:33 INFO Executor: Adding file:/private/var/folders/qp/g1z81/T/spark-1270d/userFiles-2c8e44aa/com.ibm.icu_icu4j-58.2.jar to class loader
21/11/06 07:28:33 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 55509.
21/11/06 07:28:33 INFO NettyBlockTransferService: Server created on macbook-pro:55509
...
21/11/06 07:28:33 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/Users/sandbox/projects/delta/spark-warehouse/').
21/11/06 07:28:33 INFO SharedState: Warehouse path is 'file:/Users/sandbox/projects/delta/spark-warehouse/'.
21/11/06 07:28:34 INFO HiveUtils: Initializing HiveMetastoreConnection version 2.3.7 using Spark classes.
21/11/06 07:28:35 INFO HiveConf: Found configuration file null
21/11/06 07:28:35 INFO SessionState: Created HDFS directory: /tmp/hive/sandbox/eae25990-9fbc-48c4-abe3-425df7d94eda
21/11/06 07:28:35 INFO SessionState: Created local directory: /var/folders/qp/g1z81/T/sandbox/eae25990-9fbc-48c4-abe3-425df7d94eda
21/11/06 07:28:35 INFO SessionState: Created HDFS directory: /tmp/hive/sandbox/eae25990-9fbc-48c4-abe3-425df7d94eda/_tmp_space.db
21/11/06 07:28:35 INFO HiveClientImpl: Warehouse location for Hive client (version 2.3.7) is file:/Users/sandbox/projects/delta/spark-warehouse/
21/11/06 07:28:36 WARN HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
21/11/06 07:28:36 WARN HiveConf: HiveConf of name hive.stats.retries.wait does not exist
21/11/06 07:28:36 INFO HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
21/11/06 07:28:36 INFO ObjectStore: ObjectStore, initialize called
21/11/06 07:28:36 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
21/11/06 07:28:36 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
21/11/06 07:28:37 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
21/11/06 07:28:39 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
21/11/06 07:28:39 INFO ObjectStore: Initialized ObjectStore
21/11/06 07:28:39 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 2.3.0
21/11/06 07:28:39 WARN ObjectStore: setMetaStoreSchemaVersion called but recording version is disabled: version = 2.3.0, comment = Set by MetaStore sandbox@192.168.1.207
21/11/06 07:28:39 INFO HiveMetaStore: Added admin role in metastore
21/11/06 07:28:39 INFO HiveMetaStore: Added public role in metastore
21/11/06 07:28:39 INFO HiveMetaStore: No user is added in admin role, since config is empty
21/11/06 07:28:39 INFO HiveMetaStore: 0: get_all_functions
21/11/06 07:28:39 INFO audit: ugi=sandbox ip=unknown-ip-addr cmd=get_all_functions
21/11/06 07:28:39 INFO HiveMetaStore: 0: get_database: default
21/11/06 07:28:39 INFO audit: ugi=sandbox ip=unknown-ip-addr cmd=get_database: default
21/11/06 07:28:39 INFO HiveUtils: Initializing execution hive, version 2.3.7
21/11/06 07:28:39 INFO SessionState: Created HDFS directory: /tmp/hive/sandbox/0911a389-88fd-467c-97a4-6d3f8ae8775d
21/11/06 07:28:39 INFO SessionState: Created local directory: /var/folders/qp/g1z81/T/sandbox/0911a389-88fd-467c-97a4-6d3f8ae8775d
21/11/06 07:28:39 INFO SessionState: Created HDFS directory: /tmp/hive/sandbox/0911a389-88fd-467c-97a4-6d3f8ae8775d/_tmp_space.db
21/11/06 07:28:39 INFO HiveClientImpl: Warehouse location for Hive client (version 2.3.7) is file:/Users/sandbox/projects/delta/spark-warehouse/
21/11/06 07:28:40 INFO SessionManager: Operation log root directory is created: /var/folders/qp/g1z81/T/sandbox/operation_logs
...
21/11/06 07:28:40 INFO AbstractService: Service:OperationManager is inited.
21/11/06 07:28:40 INFO AbstractService: Service:SessionManager is inited.
21/11/06 07:28:40 INFO AbstractService: Service: CLIService is inited.
21/11/06 07:28:40 INFO AbstractService: Service:ThriftBinaryCLIService is inited.
21/11/06 07:28:40 INFO AbstractService: Service: HiveServer2 is inited.
21/11/06 07:28:40 INFO AbstractService: Service:OperationManager is started.
21/11/06 07:28:40 INFO AbstractService: Service:SessionManager is started.
21/11/06 07:28:40 INFO AbstractService: Service: CLIService is started.
21/11/06 07:28:40 INFO AbstractService: Service:ThriftBinaryCLIService is started.
21/11/06 07:28:40 INFO ThriftCLIService: Starting ThriftBinaryCLIService on port 10000 with 5...500 worker threads
21/11/06 07:28:40 INFO AbstractService: Service:HiveServer2 is started.
21/11/06 07:28:40 INFO HiveThriftServer2: HiveThriftServer2 started
...
21/11/06 07:29:48 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
21/11/06 07:29:48 INFO ObjectStore: Initialized ObjectStore
21/11/06 07:29:48 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 2.3.0
21/11/06 07:29:48 WARN ObjectStore: setMetaStoreSchemaVersion called but recording version is disabled: version = 2.3.0, comment = Set by MetaStore sandbox@192.168.1.207
21/11/06 07:29:48 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
21/11/06 07:29:49 INFO HiveMetaStore: Added admin role in metastore
21/11/06 07:29:49 INFO HiveMetaStore: Added public role in metastore
21/11/06 07:29:49 INFO HiveMetaStore: No user is added in admin role, since config is empty
21/11/06 07:29:49 INFO HiveMetaStore: 0: get_all_functions
21/11/06 07:29:49 INFO audit: ugi=sandbox ip=unknown-ip-addr cmd=get_all_functions
21/11/06 07:29:49 INFO SessionState: Created HDFS directory: /tmp/hive/anonymous/a8fabb9a-68ed-42f3-b4a0-86cbac11b27d
21/11/06 07:29:49 INFO SessionState: Created local directory: /var/folders/qp/g1z81/T/sandbox/a8fabb9a-68ed-42f3-b4a0-86cbac11b27d
21/11/06 07:29:49 INFO SessionState: Created HDFS directory: /tmp/hive/anonymous/a8fabb9a-68ed-42f3-b4a0-86cbac11b27d/_tmp_space.db
21/11/06 07:29:49 INFO HiveSessionImpl: Operation log session directory is created: /var/folders/qp/g1z81/T/sandbox/operation_logs/a8fabb9a-68ed-42f3-b4a0-86cbac11b27d
ANTLR Tool version 4.7 used for code generation does not match the current runtime version 4.8
ANTLR Tool version 4.7 used for code generation does not match the current runtime version 4.8
21/11/06 07:29:50 INFO SessionState: Deleted directory: /tmp/hive/anonymous/a8fabb9a-68ed-42f3-b4a0-86cbac11b27d on fs with scheme file
21/11/06 07:29:50 INFO SessionState: Deleted directory: /var/folders/qp/g1z81/T/sandbox/a8fabb9a-68ed-42f3-b4a0-86cbac11b27d on fs with scheme file
21/11/06 07:29:50 INFO HiveMetaStore: 0: Cleaning up thread local RawStore...
21/11/06 07:29:50 INFO audit: ugi=anonymous ip=unknown-ip-addr cmd=Cleaning up thread local RawStore...
21/11/06 07:29:50 INFO HiveMetaStore: 0: Done cleaning up thread local RawStore
21/11/06 07:29:50 INFO audit: ugi=anonymous ip=unknown-ip-addr cmd=Done cleaning up thread local RawStore
21/11/06 07:29:50 WARN ThriftCLIService: Error opening session:
org.apache.hive.service.cli.HiveSQLException: Failed to open new session: org.apache.spark.SparkException: Cannot find catalog plugin class for catalog 'spark_catalog': org.apache.spark.sql.delta.catalog.DeltaCatalog
at org.apache.spark.sql.hive.thriftserver.SparkSQLSessionManager.openSession(SparkSQLSessionManager.scala:85)
at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:204)
at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:371)
at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:243)
at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1497)
at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1482)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:38)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:53)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:310)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:823)
Caused by: org.apache.spark.SparkException: Cannot find catalog plugin class for catalog 'spark_catalog': org.apache.spark.sql.delta.catalog.DeltaCatalog
at org.apache.spark.sql.connector.catalog.Catalogs$.load(Catalogs.scala:66)
at org.apache.spark.sql.connector.catalog.CatalogManager.loadV2SessionCatalog(CatalogManager.scala:66)
at org.apache.spark.sql.connector.catalog.CatalogManager.$anonfun$v2SessionCatalog$2(CatalogManager.scala:85)
at org.apache.spark.sql.connector.catalog.CatalogManager$$Lambda$1357/0x0000000000000000.apply(Unknown Source)
at scala.collection.mutable.HashMap.getOrElseUpdate(HashMap.scala:86)
at org.apache.spark.sql.connector.catalog.CatalogManager.$anonfun$v2SessionCatalog$1(CatalogManager.scala:85)
at org.apache.spark.sql.connector.catalog.CatalogManager$$Lambda$1356/0x0000000000000000.apply(Unknown Source)
at scala.Option.map(Option.scala:230)
at org.apache.spark.sql.connector.catalog.CatalogManager.v2SessionCatalog(CatalogManager.scala:84)
at org.apache.spark.sql.connector.catalog.CatalogManager.catalog(CatalogManager.scala:50)
at org.apache.spark.sql.connector.catalog.CatalogManager.currentCatalog(CatalogManager.scala:117)
at org.apache.spark.sql.connector.catalog.LookupCatalog.currentCatalog(LookupCatalog.scala:35)
at org.apache.spark.sql.connector.catalog.LookupCatalog.currentCatalog$(LookupCatalog.scala:35)
...
I'm running on Mac OS X with Java 8 and Spark 3.1.2 (pre-built for Hadoop 3.2 and later) and Delta 1.0.0.
What could be the problem?
Best Answer
This error goes away once you copy the io.delta:delta-core_2.12:1.0.0 JAR file to $SPARK_HOME/lib and restart the Thrift Server.
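A minimal sketch of that fix, assuming the jar was already resolved into the local Ivy cache as shown in the logs above (the exact target directory depends on your distribution layout; Spark 3.x binary distributions typically keep jars under $SPARK_HOME/jars rather than $SPARK_HOME/lib):

# copy the Delta jar that --packages resolved into the Ivy cache
cp ~/.ivy2/jars/io.delta_delta-core_2.12-1.0.0.jar $SPARK_HOME/jars/   # or $SPARK_HOME/lib, per your layout

# restart the Thrift Server with the same Delta settings
$SPARK_HOME/sbin/stop-thriftserver.sh
$SPARK_HOME/sbin/start-thriftserver.sh \
  --conf spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension \
  --conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog

After the restart, the beeline connection shown above should open a session without the DeltaCatalog error.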
On the topic "apache-spark - How to run Spark SQL Thrift Server in local mode and connect to Delta using JDBC", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/69862388/