spark-sql create table using orc

先峰老师 · 2022-03-30
Tags: hive, spark-sql

When a partitioned table is created in different ways, INSERT OVERWRITE behaves differently in Hive and in spark-sql.

1. How the problem surfaced

1) Creating the table in spark-sql

CREATE TABLE t_using (
  `user_no` STRING, `tt_slotid_req` STRING, `ks_slotid_req` STRING,
  `model` STRING, `follow_time` STRING, `bind_time` STRING,
  `reg_slotid` STRING, `geo_show` STRING, `ip_show` STRING,
  `model_show` STRING, `tt_bid_mean` STRING, `pday` STRING
)
USING orc
OPTIONS (
  `serialization.format` '1'
)
PARTITIONED BY (pday);
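
A quick way to confirm that spark-sql registered this as a data source table (a sketch; the exact metadata layout varies by Spark version):

DESCRIBE FORMATTED t_using;
-- the output should include a "Provider" row with value "orc",
-- marking this as a Spark data source table rather than a Hive SerDe table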

2) Inserting data in spark-sql

insert overwrite table t_using partition(pday)
select * from t0
where pday>='20220327';

3) Inserting data again in spark-sql

insert overwrite table t_using partition(pday)
select * from t0
where pday='20220325';

Observed result: the pday>='20220327' partitions in t_using were deleted; only the data from the most recent insert, pday='20220325', remains.
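
One way to see this (illustrative; the exact partition values depend on what t0 actually holds):

SHOW PARTITIONS t_using;
-- after step 2: pday=20220327, pday=20220328, ...
-- after step 3: only pday=20220325 is listed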

4) Inserting data in Hive

insert overwrite table t_using partition(pday)
select * from t0
where pday='20220326';

Observed result: the pday='20220325' partition in t_using was not deleted, and a new partition pday='20220326' was added.
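
As an aside, a dynamic partition insert like the one above usually needs these session settings in Hive (in the default strict mode, Hive rejects an insert with no static partition value):

set hive.exec.dynamic.partition=true;
set hive.exec.dynamic.partition.mode=nonstrict;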

5) Creating the table in Hive

CREATE TABLE t_using (
  `user_no` STRING, `tt_slotid_req` STRING, `ks_slotid_req` STRING,
  `model` STRING, `follow_time` STRING, `bind_time` STRING,
  `reg_slotid` STRING, `geo_show` STRING, `ip_show` STRING,
  `model_show` STRING, `tt_bid_mean` STRING, `pday` STRING
)
USING orc
OPTIONS (
  `serialization.format` '1'
)
PARTITIONED BY (pday);

This statement cannot create a table in Hive; Hive does not understand the USING data source clause.

6) The usual CREATE TABLE statement

CREATE TABLE t_using (
  `user_no` STRING, `tt_slotid_req` STRING, `ks_slotid_req` STRING,
  `model` STRING, `follow_time` STRING, `bind_time` STRING,
  `reg_slotid` STRING, `geo_show` STRING, `ip_show` STRING,
  `model_show` STRING, `tt_bid_mean` STRING
)
PARTITIONED BY (pday STRING)
STORED AS orc;

Both Hive and spark-sql create this table successfully, and inserts behave normally. Note that in Hive-style DDL the partition column is declared only in the PARTITIONED BY clause, with its type, and must not appear in the main column list.

7) Summary

To summarize: for a table created with USING orc (or any other USING data source form), INSERT OVERWRITE TABLE behaves differently in Hive and in spark-sql.

2. The fix

After a lot of digging, it turns out this overwrite behavior is spark-sql semantics that Hive simply does not have:

INSERT OVERWRITE - Spark 3.0.0-preview Documentation

To keep the other partitions from being deleted, specify the target partition statically at insert time. With a static partition value, the partition column must be dropped from the select list, since its value is already fixed by the PARTITION clause:

insert overwrite table t_using partition(pday='20220326')
select user_no, tt_slotid_req, ks_slotid_req, model,
       follow_time, bind_time, reg_slotid,
       geo_show, ip_show, model_show, tt_bid_mean
from t0
where pday='20220326';
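
Alternatively (this option is not part of the original troubleshooting, but it is a documented Spark setting since 2.3), spark-sql can be switched to Hive-style dynamic partition overwrite, after which the statement from step 3 only replaces the partitions that the SELECT actually produces:

set spark.sql.sources.partitionOverwriteMode=dynamic;
insert overwrite table t_using partition(pday)
select * from t0
where pday='20220325';
-- with dynamic mode, only pday=20220325 is rewritten;
-- the pday>='20220327' partitions are left untouched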

3. Appendix

For reference, here is the spark-sql documentation for creating tables with the USING xxx syntax:



CREATE DATASOURCE TABLE - Spark 3.2.1 Documentation
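
A minimal example of that syntax (the schema and options here are illustrative, not taken from the page):

CREATE TABLE student (id INT, name STRING, age INT)
USING CSV
OPTIONS (header 'true');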


