How to redirect the entire output of spark-submit to a file


Question:

So, I am trying to redirect the output of an Apache spark-submit command to a text file, but some of the output never makes it into the file. Here is the command I am using:

spark-submit something.py > results.txt

I can see the output in the terminal but I do not see it in the file. What am I forgetting or doing wrong here?

Edit:

If I use

spark-submit something.py | less

I can see all of the output being piped into less.

Answer 1:

spark-submit prints most of its output to STDERR, so when you redirect with > alone, only the small portion written to STDOUT ends up in your file.

To redirect the entire output to one file, you can use:

spark-submit something.py > results.txt 2>&1

Or

spark-submit something.py &> results.txt
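The first form redirects STDOUT to results.txt and then sends STDERR (file descriptor 2) to the same place; the second is a bash shorthand for the same thing, so prefer the first form if your script might run under a plain POSIX shell. If you want to watch the output in the terminal while also saving it, the standard tee idiom works as well (not specific to Spark):

spark-submit something.py 2>&1 | tee results.txt

You can also split the streams, for example to keep your program's own print output separate from Spark's log messages:

spark-submit something.py > results.txt 2> spark.log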


Answer 2:

If you are running spark-submit on a cluster (for example in YARN mode), the logs are stored under the application ID, and you can fetch them once the application finishes:

yarn logs -applicationId <your applicationId> > myfile.txt

This should write the logs of your job to myfile.txt.

The applicationId of your job is printed to the console when you submit the Spark job; you can also find it in the Hadoop/YARN ResourceManager UI.
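If you no longer have the console output, you can also look the ID up from the command line (assuming a YARN cluster; the -appStates filter is optional):

yarn application -list -appStates FINISHED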