External Table not getting updated from parquet files

Posted 2019-07-18 08:04

I am using Spark Streaming to write aggregated output as Parquet files to HDFS using SaveMode.Append. I have an external table created like this:

CREATE TABLE if not exists rolluptable
USING org.apache.spark.sql.parquet
OPTIONS (
  path "hdfs:////"
);
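
For reference, the write side of the job looks roughly like this (a simplified sketch; the session setup, column names, and output path are placeholders standing in for my actual job):

import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder().appName("rollup-writer").getOrCreate()
import spark.implicits._

// Stand-in for the aggregated output of one micro-batch.
val aggregated = Seq(("a", 1L), ("b", 2L)).toDF("key", "count")

// Each batch appends new Parquet files under the external table's path.
aggregated.write
  .mode(SaveMode.Append)
  .parquet("hdfs:///path/to/rolluptable")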

I was under the impression that, for an external table, queries should also fetch data from newly added Parquet files. However, the newly written files do not seem to be picked up.

Dropping and recreating the table each time works, but that is not a viable solution.

Please suggest how my table can pick up the data from newer files as well.

1 Answer
Fickle 薄情
Answered 2019-07-18 08:36

Are you reading those tables with Spark? If so, Spark caches Parquet table metadata (since schema discovery can be expensive).

To overcome this, you have two options (both shown in the sketch below):

  1. Set the config spark.sql.parquet.cacheMetadata to false.
  2. Refresh the table before the query: sqlContext.refreshTable("my_table")
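
A minimal sketch of both options, assuming a Spark 1.x-era spark-shell where sqlContext is predefined (in Spark 2.x the equivalents live on the SparkSession as spark.conf.set and spark.catalog.refreshTable); the table name rolluptable comes from the question:

// Option 1: turn off Parquet metadata caching so new files are picked up
// (this can also be passed via --conf at submit time).
sqlContext.setConf("spark.sql.parquet.cacheMetadata", "false")

// Option 2: explicitly invalidate the cached metadata right before querying.
sqlContext.refreshTable("rolluptable")
val latest = sqlContext.sql("SELECT * FROM rolluptable")
latest.show()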

See here for more details: http://spark.apache.org/docs/latest/sql-programming-guide.html#hive-metastore-parquet-table-conversion
