I am trying to add a column to a Hive table when the source data has new columns. The detection of new columns works well; however, when I try to add the column to the destination table, I receive an error. This is the code that issues the ALTER statement:
for (f <- df.schema.fields) {
  if ("[" + f.name + "]" == chk) {
    spark.sqlContext.sql(
      "alter table dbo_nwd_orders add columns (" +
      f.name + " " + f.dataType.typeName.replace("integer", "int") + ")")
  }
}
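For context, the new-column detection mentioned above can be sketched as a plain set difference over field names. The column lists here are hypothetical stand-ins; in the real code they would come from `df.schema.fieldNames` and the destination table's schema:

```scala
// Minimal, self-contained sketch of the detection step.
// Hypothetical column lists; the real code reads them from the
// source DataFrame's schema and the destination Hive table.
val sourceCols = Seq("order_id", "customer", "newCol")
val targetCols = Seq("order_id", "customer")

// Columns present in the source but missing from the destination table
val newCols = sourceCols.filterNot(targetCols.contains)

println(newCols) // List(newCol)
```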
Error:
WARN HiveExternalCatalog: Could not alter schema of table `default`.`dbo_nwd_orders` in a Hive compatible way. Updating Hive metastore in Spark SQL specific format
InvalidOperationException(message:partition keys can not be changed.)
However, if I capture the generated ALTER statement and execute it from the Hive GUI (Hue), I can add the column without issues:
alter table dbo_nwd_orders add columns (newCol int)
Why is that statement valid from the GUI but not from Spark code?
Thank you very much.