Does Oozie suppress logging from a shell job action?

Posted 2019-03-04 07:48

I have a simple workflow (see below) which runs a shell script. The shell script runs a pyspark script, which moves a file from a local folder to HDFS.

When I run the shell script by itself, it works perfectly; logs are redirected to a file by > spark.txt 2>&1 right in the shell script.
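
For illustration, the driver script has roughly this shape (simplified; the variable names and the spark-submit invocation are placeholders, the real script may differ):

#!/bin/bash
# driver-script.sh -- simplified sketch, not the exact script
MODE="$1"          # e.g. "s", as passed from the workflow below
PY_SCRIPT="$2"     # e.g. script.py
HDFS_PATH="$3"     # target HDFS path
LOCAL_PATH="$4"    # source local path

# run the pyspark script and redirect stdout and stderr to a local file
spark-submit "$PY_SCRIPT" "$HDFS_PATH" "$LOCAL_PATH" > spark.txt 2>&1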

But when I submit the Oozie job with the following workflow, the output from the shell seems to be suppressed. I tried to redirect all possible Oozie logs (-verbose, -log) with > oozie.txt 2>&1, but it didn't help.
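
For context, the submit and log-fetch commands I tried look roughly like this (the Oozie URL, properties file and job id are placeholders for my real values):

# submit the workflow with verbose output redirected
oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run -verbose > oozie.txt 2>&1

# fetch the workflow job log afterwards
oozie job -oozie http://oozie-host:11000/oozie -log 0000000-000000000000000-oozie-oozi-W > oozie.txt 2>&1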

The workflow finishes successfully (status SUCCEEDED, no error log), but I can see that the file is not copied to the HDFS folder. However, when I run the script alone (not through Oozie), everything is fine.

<action name="forceLoadFromLocal2hdfs">
<shell xmlns="uri:oozie:shell-action:0.1">
  <job-tracker>${jobTracker}</job-tracker>
  <name-node>${nameNode}</name-node>
  <configuration>
    <property>
      <name>mapred.job.queue.name</name>
      <value>${queueName}</value>
    </property>
  </configuration>
  <exec>driver-script.sh</exec>
  <argument>s</argument>
  <argument>script.py</argument>
  <!-- arguments for py script -->
  <argument>hdfsPath</argument>
  <argument>localPath</argument>
  <file>driver-script.sh#driver-script.sh</file>
</shell>
<ok to="end"/>
<error to="killAction"/>
</action>

Thanks a lot!

EDIT: Thanks to the advice, I found the full log under

yarn logs -applicationId [application_xxxxxx_xxxx]
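
In case it helps others: if you do not know the application id, listing recent YARN applications should show it (exact flags may vary with the Hadoop version):

yarn application -list -appStates FINISHED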

1 Answer

萌系小妹纸 · 2019-03-04 08:37

Thanks to the advice, I found the full log under

yarn logs -applicationId [application_xxxxxx_xxxx]