How to pass a date into a shell script for a sqoop command

Posted 2019-09-16 06:17

I'm working on a sqoop import with the following command:

#!/bin/bash
while IFS=":" read -r server dbname table; do
    sqoop import \
        --connect "jdbc:mysql://$server/$dbname" \
        --username root --password cloudera \
        --table mydata \
        --hive-import --hive-table dynpart \
        --check-column id \
        --last-value "$(hive -e 'select max(id) from dynpart')" \
        --hive-partition-key 'thisday' \
        --hive-partition-value '01-01-2016'
done < tables.txt

I'm partitioning by day. The Hive table:

create table dynpart(id int, name char(30), city char(30))
  partitioned by(thisday char(10))
  row format delimited
  fields terminated by ','
  stored as textfile
  location '/hive/mytables'
  tblproperties("comment"="partition column: thisday structure is dd-mm-yyyy");

But I don't want to give the partition value directly, as I want to create a sqoop job and run it every day. In the script, how can I pass the date value to the sqoop command dynamically (format: dd/mm/yyyy) instead of giving it directly? Any help is appreciated.

Tags: shell hadoop
2 Answers
甜甜的少女心 · 2019-09-16 07:05

You can use the shell command date to get it (tested on Ubuntu 14.04):

$ date +%d/%m/%Y
22/03/2017
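To use that value inside a script, capture it into a variable with command substitution first (a minimal sketch; the TODAY name is just illustrative):

```shell
#!/bin/bash
# Capture today's date in dd/mm/yyyy form; TODAY is an illustrative name
TODAY=$(date +%d/%m/%Y)
echo "$TODAY"
```

The variable can then be passed anywhere a literal date was used before.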
做自己的国王 · 2019-09-16 07:10

You can try the code below:

#!/bin/bash
# %Y gives a four-digit year (%y would give only two digits)
DATE=$(date +"%d-%m-%Y")
while IFS=":" read -r server dbname table; do
    sqoop import \
        --connect "jdbc:mysql://$server/$dbname" \
        --username root --password cloudera \
        --table mydata \
        --hive-import --hive-table dynpart \
        --check-column id \
        --last-value "$(hive -e 'select max(id) from dynpart')" \
        --hive-partition-key 'thisday' \
        --hive-partition-value "$DATE"
done < tables.txt
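For reference, the while loop above expects each line of tables.txt to be colon-separated as server:dbname:table. A standalone sketch of just the read loop, with made-up hostnames and database names:

```shell
#!/bin/bash
# Hypothetical tables.txt: one colon-separated server:dbname:table entry per line
printf '%s\n' 'dbhost1:salesdb:mydata' 'dbhost2:hrdb:mydata' > tables.txt
while IFS=":" read -r server dbname table; do
    echo "server=$server dbname=$dbname table=$table"
done < tables.txt
# prints:
# server=dbhost1 dbname=salesdb table=mydata
# server=dbhost2 dbname=hrdb table=mydata
```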

Hope this helps.
