
apache-spark - SPARK 1.6 insert into an existing Hive table (non-partitioned)


Assume I can get the following single-row insert statements to work, as in another Stack Overflow question (thanks):

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext.sql("CREATE TABLE IF NOT EXISTS e360_models.employee(id INT, name STRING, age INT) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'")

sqlContext.sql("insert into table e360_models.employee select t.* from (select 1210, 'rahul', 55) t")
sqlContext.sql("insert into table e360_models.employee select t.* from (select 1211, 'sriram pv', 35) t")
sqlContext.sql("insert into table e360_models.employee select t.* from (select 1212, 'gowri', 59) t")

val result = sqlContext.sql("FROM e360_models.employee SELECT id, name, age")
result.show()

What if I want to insert-select from a Spark DataFrame registered as a temporary table into an already existing Hive table? I can't seem to get it to work. Is it actually possible?

Using Spark 1.6. I am not interested in creating the table via CTAS, but rather in inserting as above, just in bulk, e.g.

sqlContext.sql("INSERT INTO TABLE default.ged_555 SELECT t.* FROM mytempTable t")

Best Answer

As I understand it, you want to insert some data into e360_models.employee, then select some columns and insert them into default.ged_555, and you don't want to use CTAS. Prepare a DataFrame from e360_models.employee and then do the following:

// since you are using Hive, use a HiveContext here
import org.apache.spark.sql.SaveMode

val dataframe = hiveContext.sql("select * from e360_models.employee")

dataframe.show(10)       // verify that the DataFrame actually contains data
dataframe.printSchema()  // print the schema as well, for debugging

// overwrite the existing target table with the DataFrame's rows
dataframe.write.mode(SaveMode.Overwrite).insertInto("default.ged_555")

val sampleDataFrame = hiveContext.sql("select * from default.ged_555")

// again show a few records to verify the result, for debugging
sampleDataFrame.show()
// and print the schema of the target table
sampleDataFrame.printSchema()
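The question actually asked for an INSERT ... SELECT from a registered temporary table, which also works in Spark 1.6 as long as the temp table is registered on the same HiveContext that runs the SQL. A minimal sketch, assuming `sc` is an existing SparkContext and reusing the table names from the post (the single-row DataFrame here is just illustrative):

```scala
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)

// any DataFrame will do; here a one-row example matching the employee schema
val df = hiveContext.sql("select 1210 as id, 'rahul' as name, 55 as age")

// register the DataFrame as a temp table on the SAME HiveContext,
// otherwise the INSERT below cannot see it
df.registerTempTable("mytempTable")

// bulk insert-select into the pre-existing Hive table
hiveContext.sql("INSERT INTO TABLE default.ged_555 SELECT t.* FROM mytempTable t")
```

A common pitfall is registering the temp table on a plain SQLContext while issuing the INSERT through a HiveContext; the temp table registry is per-context, so the statement fails with a "table not found" error.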

Regarding apache-spark - SPARK 1.6 insert into an existing Hive table (non-partitioned), we found a similar question on Stack Overflow: https://stackoverflow.com/questions/40563984/
