Apache Spark Driver does not see external jar

I get this error:

java.lang.ClassNotFoundException: com.mysql.jdbc.Driver

This happens when I try to save something to the MySQL database from the driver. On the executors I do not have this problem, because I added the jar with SparkConf.setJars. I have also tried adding it with JavaSparkContext.addJar, setting spark.driver.extraLibraryPath both from code and in spark-defaults.conf, and passing --jars when submitting the application. None of this solved the problem, so I would really appreciate any ideas or advice.
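For reference, a minimal sketch of the kind of setup described above; the class name, jar path, and connection details are hypothetical. The relevant point is that SparkConf.setJars and JavaSparkContext.addJar ship the jar to the executors for task execution, but by themselves they do not reliably make the class visible to JDBC code running inside the driver JVM.

```java
import java.sql.Connection;
import java.sql.DriverManager;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SaveToMySql {
    public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf()
                .setAppName("save-to-mysql")
                // setJars ships the connector to the executors for task
                // execution; it does not put the class on the driver's
                // own application classpath. (Hypothetical path.)
                .setJars(new String[] { "/path/to/mysql-connector-java.jar" });

        JavaSparkContext sc = new JavaSparkContext(conf);
        // addJar has the same scope: it distributes the jar to the executors.
        sc.addJar("/path/to/mysql-connector-java.jar");

        // This code runs in the driver process. If the connector is not on
        // the driver's classpath, loading the class fails with
        // java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
        Class.forName("com.mysql.jdbc.Driver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/testdb", "user", "password")) {
            // ... write results collected on the driver ...
        }

        sc.stop();
    }
}
```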

+4
1 answer

Try using the maven-assembly-plugin, which bundles your code and all of its dependencies into a single JAR that you then submit to Spark.
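A minimal sketch of that approach, assuming a Maven project in which the Spark dependencies are marked provided and mysql-connector-java is a regular compile-scope dependency (plugin version and names are illustrative):

```xml
<!-- pom.xml, inside <build><plugins>: bundle the application and its
     compile-scope dependencies, including the MySQL JDBC driver,
     into a single "jar-with-dependencies" artifact. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>3.3.0</version>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

After mvn package, submit the resulting fat JAR, e.g. spark-submit --class com.example.SaveToMySql target/myapp-1.0-jar-with-dependencies.jar (the exact file name depends on your artifactId and version). Because the JDBC driver is packaged inside the application JAR, it is loaded from the same classpath as your application code on the driver and is shipped to the executors along with it.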

+1


