I'm trying to create a fairly large table in Hive: roughly 3 million rows and about 40,000 columns. To begin, I create an empty table and then insert the data into it.
However, I hit an error when I try this:
Unable to acquire IMPLICIT, SHARED lock default after 100 attempts. FAILED: Error in acquiring locks: Locks on the underlying objects cannot be acquire. retry after some time
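For reference, these are the standard knobs around Hive lock acquisition I've been looking at (a rough sketch, assuming the default ZooKeeper-backed lock manager; the values shown are illustrative, not something I've confirmed helps):

-- inspect any outstanding locks on the table
SHOW LOCKS database.dataset EXTENDED;

-- lock acquisition retries; the "100 attempts" in the error matches the default of hive.lock.numretries
SET hive.lock.numretries=200;            -- illustrative value, not a confirmed fix
SET hive.support.concurrency=false;      -- disables locking entirely; only an option if nothing else needs it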
The query is pretty straightforward:
create external table database.dataset (
var1 decimal(10,2),
var2 decimal(10,2),
...
var40000 decimal(10,2)
) location 'hdfs://nameservice1/root/user1/project1';
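The load step after the create is just an insert-select, roughly like this (with a hypothetical staging table name standing in for my actual source):

-- hypothetical staging table holding the source data
insert into table database.dataset
select * from database.dataset_staging;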
Has anybody seen this error before? Cloudera says there is no limit on the number of columns, but I'm clearly hitting some system limitation here.
Additionally, I can create a smaller Hive table in the specified location without any problem.
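For example, something like this (hypothetical name, only a handful of the same decimal columns) creates without any error:

create external table database.dataset_small (
var1 decimal(10,2),
var2 decimal(10,2),
var3 decimal(10,2)
) location 'hdfs://nameservice1/root/user1/project1';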