
mysql - Spark: com.mysql.jdbc.Driver does not allow create table as select

Reposted. Author: 行者123. Updated: 2023-11-29 02:20:20

I get the following error when trying to save to a MySQL database via Spark:

Py4JJavaError: An error occurred while calling o41.saveAsTable.
: java.lang.RuntimeException: com.mysql.jdbc.Driver does not allow create table as select.
at scala.sys.package$.error(package.scala:27)
at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:242)
at org.apache.spark.sql.hive.execution.CreateMetastoreDataSourceAsSelect.run(commands.scala:218)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:54)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:54)
at org.apache.spark.sql.execution.ExecutedCommand.execute(commands.scala:64)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1099)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1121)
at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1091)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
at py4j.Gateway.invoke(Gateway.java:259)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:207)
at java.lang.Thread.run(Thread.java:745)

My Python code:

res.saveAsTable(tableName='test.provider_phones', source='com.mysql.jdbc.Driver',driver='com.mysql.jdbc.Driver', mode='append', url='jdbc:mysql://host.amazonaws.com:port/test?user=user&password=pass')

This happens whether or not the table already exists.

I am using Spark 1.3.1.

Best Answer

You can use the createJDBCTable(url: String, table: String, allowExisting: Boolean) and insertIntoJDBC(url: String, table: String, overwrite: Boolean) methods of DataFrame instead of saveAsTable.
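A minimal sketch of how this could look in PySpark 1.3.x, assuming `res` is the DataFrame from the question; the host, port, database name, and credentials are placeholders, and the Spark calls themselves are shown as comments since they require a running cluster:

```python
def mysql_jdbc_url(host, port, db, user, password):
    """Build a MySQL JDBC URL with inline credentials,
    matching the URL shape used in the question."""
    return ("jdbc:mysql://%s:%d/%s?user=%s&password=%s"
            % (host, port, db, user, password))

# Placeholder endpoint and credentials:
url = mysql_jdbc_url("host.amazonaws.com", 3306, "test", "user", "pass")

# With the DataFrame `res` from the question, these calls replace
# the failing saveAsTable (Spark 1.3.x DataFrame API):
#
#   # create the table from the DataFrame's schema, then write rows:
#   res.createJDBCTable(url, "provider_phones", allowExisting=True)
#
#   # or append rows into a table that already exists:
#   res.insertIntoJDBC(url, "provider_phones", overwrite=False)
```

Note that both methods take the table name alone ("provider_phones"), not the "database.table" form, because the database is already part of the JDBC URL.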

http://www.sparkexpert.com/2015/04/17/save-apache-spark-dataframe-to-database/

Regarding "mysql - Spark: com.mysql.jdbc.Driver does not allow create table as select", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/32997014/
