Sqoop option '--map-column-hive' being ignored

Posted 2019-09-04 03:15

I am trying to import a table into Hive as Parquet, and --map-column-hive column_name=timestamp is being ignored. The column column_name is of type datetime in SQL Server, and the import converts it to bigint in Parquet. I want Sqoop to convert it to timestamp, but the mapping is not working.

sqoop import \
--table table_name \
--driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
--connect jdbc:sqlserver://servername \
--username user --password pw \
--map-column-hive column_name=timestamp \
--as-parquetfile \
--hive-import \
--hive-table table_name -m 1

When I view the table in Hive, it still shows the column with its original datatype.
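For reference, this is how I check the column type after the import (table name as above, default database):

# show each column with the Hive datatype it ended up with
hive -e "DESCRIBE table_name;"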

I tried --map-column-hive column_name=string and that did not work either.

I think this may be an issue with the Parquet conversion, but I am not sure. Does anyone have a solution?

I get no errors when running the command; it just completes the import as if the option did not exist.
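I have also seen suggestions that Parquet imports build their schema from the Java type mapping, so --map-column-java (rather than --map-column-hive) would be the flag that actually takes effect. This is the variant I would try next (untested, same connection details as above):

sqoop import \
--table table_name \
--driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
--connect jdbc:sqlserver://servername \
--username user --password pw \
--map-column-java column_name=String \
--as-parquetfile \
--hive-import \
--hive-table table_name -m 1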

1 Answer

看我几分像从前 · 2019-09-04 03:40

Timestamp support in the Parquet SerDe is not available before Hive 1.2; in 1.1.0 only the binary data type is supported.
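A quick way to test this on your cluster is to declare a Parquet table with a timestamp column (the table name ts_check here is just an example):

# expected to fail on Hive < 1.2 (timestamp not supported in Parquet,
# see HIVE-6384), and to succeed on 1.2 and later
hive -e "CREATE TABLE ts_check (c TIMESTAMP) STORED AS PARQUET;"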

Please upgrade to Hive 1.2 or later; after that it should work.
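To confirm which Hive version you are currently running (the output format varies by distribution):

hive --version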

Please check the issue log and release notes below.

https://issues.apache.org/jira/browse/HIVE-6384

https://issues.apache.org/jira/secure/ReleaseNote.jspa?version=12329345&styleName=Text&projectId=12310843