SemanticException Partition spec {col=null} contains non-partition columns

Posted 2019-06-26 09:18

I am trying to create dynamic partitions in Hive using the following code.

SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

create external table if not exists report_ipsummary_hourwise(
ip_address string,imp_date string,imp_hour bigint,geo_country string)
PARTITIONED BY (imp_date_P string,imp_hour_P string,geo_coutry_P string) 
row format delimited 
fields terminated by '\t'
stored as textfile
location 's3://abc';

insert overwrite table report_ipsummary_hourwise PARTITION (imp_date_P,imp_hour_P,geo_country_P)
SELECT ip_address,imp_date,imp_hour,geo_country,
       imp_date as imp_date_P,
       imp_hour as imp_hour_P,
       geo_country as geo_country_P
FROM report_ipsummary_hourwise_Temp;

Here, the report_ipsummary_hourwise_Temp table contains the following columns: ip_address, imp_date, imp_hour, geo_country.

I am getting this error:

SemanticException Partition spec {imp_hour_p=null, imp_date_p=null, geo_country_p=null} contains non-partition columns.

Can anybody suggest why this error occurs?

4 Answers
Juvenile、少年°
#2 · 2019-06-26 09:48

I was facing the same error. It was caused by extra characters present in the file. The best solution is to remove all the blank characters and re-insert the data.
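A minimal sketch of that cleanup, assuming the stray characters are leading/trailing whitespace in the source columns (the _clean table name is hypothetical):

-- rebuild the staging data with whitespace stripped before reloading the partitioned table
create table report_ipsummary_hourwise_Temp_clean as
select trim(ip_address)  as ip_address,
       trim(imp_date)    as imp_date,
       imp_hour,
       trim(geo_country) as geo_country
from report_ipsummary_hourwise_Temp;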

Root(大扎)
#3 · 2019-06-26 09:50

Your INSERT SQL uses the column geo_country_P, but the target table's partition column is named geo_coutry_P; it is missing an 'n' in 'country'.
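A sketch of the corrected statements, assuming everything else from the question stays the same and only the misspelled partition column is renamed:

create external table if not exists report_ipsummary_hourwise(
ip_address string,imp_date string,imp_hour bigint,geo_country string)
PARTITIONED BY (imp_date_P string,imp_hour_P string,geo_country_P string)
row format delimited
fields terminated by '\t'
stored as textfile
location 's3://abc';

-- the partition names in the INSERT now match the table definition exactly
insert overwrite table report_ipsummary_hourwise PARTITION (imp_date_P,imp_hour_P,geo_country_P)
SELECT ip_address,imp_date,imp_hour,geo_country,
       imp_date as imp_date_P,
       imp_hour as imp_hour_P,
       geo_country as geo_country_P
FROM report_ipsummary_hourwise_Temp;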

我欲成王,谁敢阻挡
#4 · 2019-06-26 09:51

It says that while copying the files from the query result to HDFS, the job could not resolve the partition location. What I suspect is that your table is partitioned by (imp_date_P, imp_hour_P, geo_country_P), whereas the job is trying to write to imp_hour_p=null, imp_date_p=null, geo_country_p=null, which doesn't match. Try checking the HDFS location. The other point I can suggest is not to reuse essentially the same name for a regular column and a partition column.
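A couple of Hive-side checks along those lines, using the table name from the question:

SHOW PARTITIONS report_ipsummary_hourwise;     -- partitions the metastore currently knows about
DESCRIBE FORMATTED report_ipsummary_hourwise;  -- shows the table location and partition keys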

Melony?
#5 · 2019-06-26 10:01

It could also be https://issues.apache.org/jira/browse/HIVE-14032

INSERT OVERWRITE command failed with case sensitive partition key names

There is a bug in Hive which makes partition column names case-sensitive.

For me the fix was to make the column names lower-case both in the INSERT's PARTITION spec and in the PARTITIONED BY clause of the table definition. (They can both be upper-case too; due to this Hive bug, HIVE-14032, the case just has to match.)
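A hedged example of what that looks like, assuming the table's PARTITIONED BY clause also uses lower-case names (and with the country spelling fixed):

-- partition names written in lower case, matching the lower-case names in PARTITIONED BY
insert overwrite table report_ipsummary_hourwise PARTITION (imp_date_p,imp_hour_p,geo_country_p)
SELECT ip_address,imp_date,imp_hour,geo_country,
       imp_date as imp_date_p,
       imp_hour as imp_hour_p,
       geo_country as geo_country_p
FROM report_ipsummary_hourwise_Temp;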
