I also use SparkLauncher from a Spring application. The following is a brief overview of the approach I took (following the examples in the JavaDoc).
The @Service bean used to run the job also implements SparkAppHandle.Listener and passes a reference to itself to the launched job via startApplication(this), for example:
```java
@Service
public class JobLauncher implements SparkAppHandle.Listener {

    // ...

    private SparkAppHandle launchJob(String mainClass, String[] args) throws Exception {
        String appResource = getAppResourceName();
        SparkAppHandle handle = new SparkLauncher()
                .setAppResource(appResource)
                .addAppArgs(args)
                .setMainClass(mainClass)
                .setMaster(sparkMaster)
                .setDeployMode(sparkDeployMode)
                .setSparkHome(sparkHome)
                .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
                .startApplication(this);
        LOG.info("Launched [" + mainClass + "] from [" + appResource
                + "] State [" + handle.getState() + "]");
        return handle;
    }

    @Override
    public void infoChanged(SparkAppHandle handle) {
        LOG.info("Spark App Id [" + handle.getAppId()
                + "] Info Changed. State [" + handle.getState() + "]");
    }

    @Override
    public void stateChanged(SparkAppHandle handle) {
        LOG.info("Spark App Id [" + handle.getAppId()
                + "] State Changed. State [" + handle.getState() + "]");
    }
}
```
Using this approach, you can react when the state changes to FAILED, FINISHED, or KILLED.
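A common way to act on those terminal states is to release a CountDownLatch from stateChanged so the caller can block until the job completes. The sketch below illustrates that pattern without requiring a Spark installation: AppState is a hypothetical stand-in for SparkAppHandle.State, which in the real API also exposes an isFinal() method that is true for FINISHED, FAILED, and KILLED.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

// Hypothetical stand-in for org.apache.spark.launcher.SparkAppHandle.State.
enum AppState {
    CONNECTED(false), RUNNING(false), FINISHED(true), FAILED(true), KILLED(true);

    private final boolean terminal;
    AppState(boolean terminal) { this.terminal = terminal; }
    boolean isFinal() { return terminal; }
}

public class TerminalStateWait {
    private final CountDownLatch done = new CountDownLatch(1);
    private volatile AppState lastState;

    // Mirrors SparkAppHandle.Listener#stateChanged: record the state and
    // open the latch once the job reaches a terminal state.
    public void stateChanged(AppState state) {
        lastState = state;
        if (state.isFinal()) {
            done.countDown();
        }
    }

    // Blocks the caller until the job has finished (or the timeout expires).
    public AppState awaitCompletion(long timeout, TimeUnit unit) throws InterruptedException {
        if (!done.await(timeout, unit)) {
            throw new IllegalStateException("Job did not finish within timeout");
        }
        return lastState;
    }

    public static void main(String[] args) throws InterruptedException {
        TerminalStateWait waiter = new TerminalStateWait();
        waiter.stateChanged(AppState.RUNNING);   // non-terminal: latch stays closed
        waiter.stateChanged(AppState.FINISHED);  // terminal: latch opens
        System.out.println(waiter.awaitCompletion(1, TimeUnit.SECONDS));
    }
}
```

In the real listener, the same check would be `handle.getState().isFinal()` inside stateChanged; the launching thread then calls awaitCompletion instead of polling the handle.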
I hope this information helps you.