https://stackoverflow.com/questions/38709280/how-to-limit-the-number-of-retries-on-spark-job-failure

 

How to limit the number of retries on Spark job failure?

We are running a Spark job via spark-submit, and I can see that the job will be re-submitted in the case of failure. How can I stop it from having attempt #2 in the case of a YARN container failure or ...

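The short answer from that thread: on YARN, a failed Spark application is re-submitted up to spark.yarn.maxAppAttempts times, bounded by the cluster-wide yarn.resourcemanager.am.max-attempts setting (default 2). Setting the Spark property to 1 disables the automatic retry. A minimal PySpark sketch, assuming a YARN deployment and client-mode submission so the property is in place before the application is submitted (the app name is illustrative):

```python
from pyspark import SparkConf
from pyspark.sql import SparkSession

# Cap the number of YARN application attempts at 1 so a failed job
# is not automatically re-submitted. The effective value can never
# exceed yarn.resourcemanager.am.max-attempts (default 2) configured
# on the ResourceManager.
conf = SparkConf().set("spark.yarn.maxAppAttempts", "1")

spark = (
    SparkSession.builder
    .appName("no-retry-job")  # hypothetical name, for illustration only
    .config(conf=conf)
    .getOrCreate()
)
```

In cluster mode the driver itself runs inside the YARN application, so setting the property in driver code is too late; pass it at submit time instead: spark-submit --conf spark.yarn.maxAppAttempts=1 ...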

 
