Web application does not launch a Spark job using SparkLauncher

I want to run a Spark job from a Java EE web application. SparkLauncher starts without any exception, but the job never appears in the Spark cluster. Can anybody help?

// spark_home, spark_master, app_name, jar_file, main_class, etc. are presumably
// fields populated from spark-cluster-test.properties by init().
public static void runJob(String userId) throws Exception {
    long previous = System.currentTimeMillis();
    logger.info("initialize spark context...");
    init("spark-cluster-test.properties");
    Process spark = new SparkLauncher()
            .setSparkHome(spark_home)
            .setMaster(spark_master)
            .setAppName(app_name + timestamp)
            .setAppResource(jar_file)
            .setMainClass(main_class)
            .setConf(SparkLauncher.DRIVER_MEMORY, spark_driver_memory)
            .setConf("spark.network.timeout", spark_network_timeout)
            .setConf(SparkLauncher.DRIVER_EXTRA_CLASSPATH, class_path)
            .setConf(SparkLauncher.EXECUTOR_EXTRA_CLASSPATH, class_path)
            .setDeployMode(spark_deploy_mode)
            .addAppArgs(userId)
            .launch();
    spark.waitFor();
    logger.info("spark job is returned after " + (System.currentTimeMillis() - previous) + " milliseconds.");
}
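
For reference, a minimal sketch (assuming Spark 1.x, where launch() returns a plain java.lang.Process and nothing redirects its streams) of how spark-submit's own output could be surfaced in the web server log. Without draining getInputStream()/getErrorStream(), whatever spark-submit prints, including the reason it exits, is never seen. The logSparkSubmitOutput name is a placeholder and is not part of the original code; logger stands for the same application logger used above.

import java.io.BufferedReader;
import java.io.InputStreamReader;

// Sketch: drain and log the child process's streams so spark-submit's own
// messages end up in the web server log. `spark` is the Process returned by
// SparkLauncher.launch().
private static void logSparkSubmitOutput(final Process spark) throws Exception {
    // Drain stderr on a separate thread so neither pipe buffer fills up and
    // blocks the child process.
    Thread errDrainer = new Thread(new Runnable() {
        public void run() {
            try (BufferedReader err = new BufferedReader(
                    new InputStreamReader(spark.getErrorStream()))) {
                String line;
                while ((line = err.readLine()) != null) {
                    logger.error("spark-submit stderr: " + line);
                }
            } catch (Exception e) {
                logger.error("failed to read spark-submit stderr", e);
            }
        }
    });
    errDrainer.start();

    try (BufferedReader out = new BufferedReader(
            new InputStreamReader(spark.getInputStream()))) {
        String line;
        while ((line = out.readLine()) != null) {
            logger.info("spark-submit stdout: " + line);
        }
    }

    errDrainer.join();
    logger.info("spark-submit exited with code " + spark.waitFor());
}

Calling this right after launch(), instead of only spark.waitFor(), usually reveals why spark-submit exits after about a second (bad SPARK_HOME, unreachable master URL, missing application jar, and so on).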

Web server log:

2016-01-20 10:27:17 -67129627 [http-bio-8088-exec-127] INFO    - SparkJobLauncher is started with userId=102
2016-01-20 10:27:17 -67129628 [http-bio-8088-exec-127] INFO    - initialize spark context...
2016-01-20 10:27:19 -67130838 [http-bio-8088-exec-127] INFO    - spark job is returned after 1 seconds.
2016-01-20 10:27:19 -67130839 [http-bio-8088-exec-127] DEBUG   - Creating a new SqlSession

But no job is started in the Spark cluster, and the spark.waitFor() call returns almost immediately, even though the job itself should run for at least a few seconds.
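
If the cluster runs Spark 1.6 or later, SparkLauncher.startApplication() could be used instead of launch(); it returns a SparkAppHandle that reports state transitions, which makes it easy to see whether the application was ever SUBMITTED or went straight to FAILED. A minimal sketch, reusing the same placeholder settings and logger as in runJob above:

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

// Sketch (Spark 1.6+ only): launch through startApplication() and log every
// state change reported by the launcher's handle.
SparkAppHandle handle = new SparkLauncher()
        .setSparkHome(spark_home)
        .setMaster(spark_master)
        .setAppResource(jar_file)
        .setMainClass(main_class)
        .setDeployMode(spark_deploy_mode)
        .addAppArgs(userId)
        .startApplication(new SparkAppHandle.Listener() {
            @Override
            public void stateChanged(SparkAppHandle h) {
                logger.info("spark app state: " + h.getState() + ", appId=" + h.getAppId());
            }
            @Override
            public void infoChanged(SparkAppHandle h) {
                // fired when application info (such as the app id) becomes available
            }
        });

// Wait for a terminal state instead of Process.waitFor().
while (!handle.getState().isFinal()) {
    Thread.sleep(1000);
}
logger.info("spark app final state: " + handle.getState());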
