Apache Spark 2.0 Library Preference

In short: In Spark 2.0, spark-submit prefers its own version of the Guava library (14.0.1), but I want to use the newer 19.0 release.

Question: How do I convince Spark to use the version provided in my pom.xml file?

My suspicion: I can use the option spark.driver.userClassPathFirst=true. But this is an experimental feature (per the Spark 2.0.0 docs), so maybe there is a better solution?


Detailed explanation of the problem:

I am using Spark 2.0.0 (hadoop2.7) and Elasticsearch 2.3.4, and I'm struggling with a very simple application that tries to use Spark Streaming and Elasticsearch together. Here it is:

import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;

import java.net.InetAddress;

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.transport.InetSocketTransportAddress;
import org.elasticsearch.common.xcontent.XContentBuilder;

// Streaming context that polls a directory for new text files every 500 ms.
SparkConf sparkConf = new SparkConf().setAppName("SampleApp");
JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, Durations.milliseconds(500));
jssc.checkpoint("/tmp");
JavaDStream<String> messages = jssc.textFileStream("/some_directory_path");

// Elasticsearch 2.x transport client pointed at a local node.
TransportClient client = TransportClient.builder().build()
    .addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("localhost"), 9300));

messages.foreachRDD(rdd -> {
    XContentBuilder builder = jsonBuilder()
            .startObject()
            .field("words", "some words")
            .endObject();

    // Index a document for each micro-batch.
    client.prepareIndex("indexName", "typeName")
        .setSource(builder.string())
        .get();
});

jssc.start();
jssc.awaitTermination();

The project is built with Maven. Here is the relevant part of the pom.xml:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.0.0</version>
        <scope>provided</scope>
        <exclusions>
            <exclusion>
                <groupId>com.google.guava</groupId>
                <artifactId>guava</artifactId>
            </exclusion>
        </exclusions>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>2.0.0</version>
        <scope>provided</scope>
        <exclusions>
            <exclusion>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.11</artifactId>
            </exclusion>
        </exclusions>
    </dependency>

    <dependency>
        <groupId>org.elasticsearch</groupId>
        <artifactId>elasticsearch</artifactId>
        <version>2.3.4</version>
        <exclusions>
            <exclusion>
                <groupId>com.google.guava</groupId>
                <artifactId>guava</artifactId>
            </exclusion>
        </exclusions>
    </dependency>

    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-core</artifactId>
        <version>2.6.2</version>
    </dependency>

    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>19.0</version>
    </dependency>
</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.4.3</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <createSourcesJar>true</createSourcesJar>
                        <transformers>
                            <transformer
                                implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer" />
                            <transformer
                                implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                <mainClass>com.abc.App</mainClass>
                            </transformer>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
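
With the exclusions above, Guava 19.0 should be the only Guava left on the compile classpath; this can be double-checked with Maven's dependency:tree goal:

mvn dependency:tree -Dincludes=com.google.guava:guava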

The application is packaged into an uber-jar and launched like this:

spark-submit --class com.abc.App --master local[2] /somePath/superApp-0.0.1-SNAPSHOT.jar

And this is what I get:

Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
    at org.elasticsearch.threadpool.ThreadPool.<clinit>(ThreadPool.java:190)
    at org.elasticsearch.client.transport.TransportClient$Builder.build(TransportClient.java:131)
    at com.abc.App.main(App.java:44)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
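
The error itself is consistent with a version clash: MoreExecutors.directExecutor() was only added in Guava 18.0, so the 14.0.1 copy that Spark puts on the classpath cannot satisfy Elasticsearch. A quick way to see which Guava actually wins at runtime is to print where the class was loaded from (a diagnostic sketch of mine, not part of the original application):

import com.google.common.util.concurrent.MoreExecutors;

// Prints the jar that MoreExecutors was loaded from; on a stock Spark 2.0.0
// classpath this points at Spark's bundled Guava rather than guava-19.0.jar.
System.out.println(MoreExecutors.class.getProtectionDomain()
        .getCodeSource().getLocation());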

After adding --conf spark.driver.userClassPathFirst=true to the spark-submit command, the application works as expected. But, as mentioned above, this option is experimental.
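
For completeness, that is the same invocation as before with the one extra flag:

spark-submit --conf spark.driver.userClassPathFirst=true --class com.abc.App --master local[2] /somePath/superApp-0.0.1-SNAPSHOT.jar

(For code that runs on executors rather than the driver, there is an analogous spark.executor.userClassPathFirst option.)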

As far as I understand, Spark should pick up the classes packaged into the "uber" jar (the fat jar built by the shade plugin). Why does it still prefer its own, older Guava?

To restate the question: how do I make spark-submit use the library versions packaged in my jar, as declared in the POM, instead of its own copies (preferably without experimental options)?


Note: Guava is not the only conflict between Spark Streaming and Elasticsearch; other shared dependencies clash as well (for example, io.netty:netty). The spark.driver.userClassPathFirst option works around those conflicts too.
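
For reference, one commonly suggested alternative to userClassPathFirst is to relocate Guava inside the shaded jar, so that my Guava 19.0 classes no longer share package names with Spark's copy. A minimal sketch of what this would add to the maven-shade-plugin configuration above; the prefix com.abc.shaded.guava is a placeholder of my choosing:

<configuration>
    <relocations>
        <relocation>
            <pattern>com.google.common</pattern>
            <!-- Placeholder prefix; any package private to this project works. -->
            <shadedPattern>com.abc.shaded.guava</shadedPattern>
        </relocation>
    </relocations>
</configuration>

Relocation only affects classes that end up inside the uber jar; that covers Elasticsearch here, since the elasticsearch dependency is not marked provided.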
