I have a shell script to which I pass arguments from a file; the file contains table names.
The script works fine, and I am able to execute the command for all the tables in the file.
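For context, the script assumes the input file lists one table name per line; the names below are only illustrative, taken from the desired output further down:

table1
table2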
Shell script:
#!/bin/bash
[ $# -ne 1 ] && { echo "Usage : $0 input file "; exit 1; }
input_file=$1
TIMESTAMP=`date "+%Y-%m-%d"`
touch /home/$USER/logs/${TIMESTAMP}.success_log
touch /home/$USER/logs/${TIMESTAMP}.fail_log
success_logs=/home/$USER/logs/${TIMESTAMP}.success_log
failed_logs=/home/$USER/logs/${TIMESTAMP}.fail_log
#Function to get the status of the job creation
function log_status
{
status=$1
message=$2
if [ "$status" -ne 0 ]; then
echo "`date +\"%Y-%m-%d %H:%M:%S\"` [ERROR] $message [Status] $status : failed" | tee -a "${failed_logs}"
#echo "Please find the attached log file for more details"
#exit 1
else
echo "`date +\"%Y-%m-%d %H:%M:%S\"` [INFO] $message [Status] $status : success" | tee -a "${success_logs}"
fi
}
while read table ;do
sqoop job --exec $table > /home/$USER/logging/"${table}_log" 2>&1
done < ${input_file}
g_STATUS=$?
log_status $g_STATUS "Sqoop job ${table}"
I am trying to collect the status logs for the script. I want to collect the status log for each table individually.
What I want:
2017-04-28 20:36:41 [ERROR] sqoop job table1 EXECUTION [Status] 2 : failed
2017-04-28 20:36:41 [ERROR] sqoop job table2 EXECUTION [Status] 2 : failed
What I am getting:
If the sqoop job for the last table fails:
2017-04-28 20:38:41 [ERROR] sqoop job EXECUTION [Status] 2 : failed
If the sqoop job for the last table succeeds:
2017-04-28 20:40:41 [ERROR] sqoop job [Status] 0 : success
What am I doing wrong, and what changes should I make to get the desired results?
Move the exit-status capture and the log_status call inside the loop.

The while loop only runs the code between the while and done lines, so to get a log entry for every table, the logging has to happen inside the loop. Also, $table changes on each iteration (and is typically empty once the loop has read the whole file), and g_STATUS=$? placed after done captures only the status of the loop as a whole, which is effectively the last sqoop job. That is why you get a single log line, with no table name, instead of one line per table. Any command that you want to run for every table has to run inside the loop.
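A minimal sketch of the corrected loop, assuming the rest of the script stays as posted; the added quoting and the EXECUTION text in the message are small additions to match the desired output shown in the question:

while read table; do
    # Run the sqoop job for this table; its exit status is what we want to log
    sqoop job --exec "$table" > /home/$USER/logging/"${table}_log" 2>&1
    g_STATUS=$?
    # Log while $table still holds the current table name
    log_status $g_STATUS "Sqoop job ${table} EXECUTION"
done < "${input_file}"

With this arrangement, log_status runs once per line of the input file, so each table gets its own success or failure entry in the log.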