Spark Error: invalid log directory /app/spark/spar

Posted 2019-07-13 20:29


My Spark application is failing with the above error.

My Spark program does in fact write its logs to that directory; both stderr and stdout are being written on all the workers.

My program used to work fine. Yesterday I changed the folder that SPARK_WORKER_DIR points to, but today I put the old setting back and restarted Spark.
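For reference, the setting lives in conf/spark-env.sh. A minimal sketch of the kind of line I mean (the path shown is a placeholder, not my actual directory):

    # conf/spark-env.sh -- directory for worker scratch space and app logs
    # (placeholder path; substitute your own)
    export SPARK_WORKER_DIR=/app/spark/work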

Can anyone give me a clue as to why I am getting this error?

1 Answer

在下西门庆 · answered 2019-07-13 21:01

In my case the problem was caused by enabling

SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true"

in spark-env.sh. That option is supposed to remove old application/driver data directories, but it appears to be buggy and also removes data belonging to running applications.

Just comment out that line and see if it helps.
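For reference, a minimal sketch of the spark-env.sh change (the extra cleanup properties are shown with Spark's documented defaults, included purely for context):

    # conf/spark-env.sh -- disable worker cleanup by commenting this out:
    # export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true"
    #
    # Related properties (Spark defaults), if you re-enable cleanup later:
    #   -Dspark.worker.cleanup.interval=1800        # seconds between cleanup sweeps
    #   -Dspark.worker.cleanup.appDataTtl=604800    # seconds to keep app data (7 days)

Restart the workers afterwards so the change takes effect.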
