Yes, you can restart Spark applications. There are several options available, depending on the cluster manager in use. For example, with a standalone Spark cluster in cluster deploy mode, you can also pass --supervise to make sure the driver is automatically restarted if it fails with a non-zero exit code. To list all such options available to spark-submit, run it with --help:
# Run on a Spark standalone cluster in cluster deploy mode with supervise
# (the master URL below is a placeholder; the original example truncated it)
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://<master-host>:7077 \
  --deploy-mode cluster \
  --supervise \
  /path/to/examples.jar \
  1000
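One caveat: because --supervise keeps restarting the driver after a failure, an application that fails repeatedly has to be stopped explicitly. A minimal sketch, assuming a standalone master at spark://<master-host>:7077 and a placeholder driver ID (which you can find in the master web UI):

# Explicitly kill a supervised driver (standalone/Mesos cluster deploy mode only).
# <master-host> and <driver-id> are placeholders, not values from the original answer.
./bin/spark-submit --master spark://<master-host>:7077 --kill <driver-id>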