No suitable driver found for JDBC in Spark

I use

df.write.mode("append").jdbc("jdbc:mysql://ip:port/database", "table_name", properties) 

to insert into a table in MySQL.

In addition, I added Class.forName("com.mysql.jdbc.Driver") to my code.
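
(For reference, properties here is a java.util.Properties holding the connection settings, along the lines of this sketch; the credentials are placeholders:)

    import java.util.Properties

    val properties = new Properties()
    properties.setProperty("user", "my_user")         // placeholder
    properties.setProperty("password", "my_password") // placeholder
    // Naming the driver class explicitly makes Spark load it directly
    // instead of relying on java.sql.DriverManager discovery:
    properties.setProperty("driver", "com.mysql.jdbc.Driver")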

When I submit the Spark app:

 spark-submit --class MY_MAIN_CLASS --master yarn-client --jars /path/to/mysql-connector-java-5.0.8-bin.jar --driver-class-path /path/to/mysql-connector-java-5.0.8-bin.jar MY_APPLICATION.jar 

This works for me in yarn-client mode.

But when I switch to yarn-cluster mode:

 spark-submit --class MY_MAIN_CLASS --master yarn-cluster --jars /path/to/mysql-connector-java-5.0.8-bin.jar --driver-class-path /path/to/mysql-connector-java-5.0.8-bin.jar MY_APPLICATION.jar 

this does not work. I also tried passing the classpath via --conf:

 spark-submit --class MY_MAIN_CLASS --master yarn-cluster --jars /path/to/mysql-connector-java-5.0.8-bin.jar --driver-class-path /path/to/mysql-connector-java-5.0.8-bin.jar --conf spark.executor.extraClassPath=/path/to/mysql-connector-java-5.0.8-bin.jar MY_APPLICATION.jar 

but I still get the error "No suitable driver found for jdbc".

2 answers

There are 3 possible solutions:

  • You can build your application with your build tool (Maven, SBT) so that the MySQL connector is bundled into the application jar and you do not need to add the dependency to your spark-submit command line; see the build.sbt sketch after this list.
  • You can use the following parameter in the spark-submit command line:

     --jars $(echo ./lib/*.jar | tr ' ' ',') 

    Explanation: assuming you keep all your jars in a lib directory at the root of your project, this expands to a comma-separated list of them and adds them to the submitted application.

  • You can also set the two properties spark.driver.extraClassPath and spark.executor.extraClassPath in the SPARK_HOME/conf/spark-defaults.conf file, giving each the path to the jar file; a sketch follows below. Make sure the same path exists on the worker nodes.
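
For the first option, a minimal build.sbt sketch using the sbt-assembly plugin (all names and versions here are illustrative, not taken from the question):

    // build.sbt -- sketch for bundling the MySQL connector into a fat jar.
    // Assumes project/plugins.sbt contains:
    //   addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
    name := "my-application"

    scalaVersion := "2.10.6"

    libraryDependencies ++= Seq(
      // "provided": the cluster supplies Spark at runtime, so it is not
      // packed into the fat jar.
      "org.apache.spark" %% "spark-sql" % "1.6.0" % "provided",
      // The MySQL connector IS packed into the fat jar, so no --jars or
      // --driver-class-path flags are needed at submit time.
      "mysql" % "mysql-connector-java" % "5.1.38"
    )

Running sbt assembly then produces a single jar that can be passed to spark-submit as-is.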
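
For the third option, the relevant lines in SPARK_HOME/conf/spark-defaults.conf would look roughly like this (the jar must be present at the same path on every node):

    # spark-defaults.conf -- classpath entries for driver and executors
    spark.driver.extraClassPath    /path/to/mysql-connector-java-5.0.8-bin.jar
    spark.executor.extraClassPath  /path/to/mysql-connector-java-5.0.8-bin.jar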


I tried the suggestions given here, but they did not work for me (with MySQL). While stepping through the DriverManager code, I realized that the driver was not registered automatically when the app was launched with spark-submit, so I had to register it myself. Therefore I added:

 Driver driver = new Driver(); 

Instantiating the driver causes the class to register itself with DriverManager, which solved the SQLException for me.
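
In Scala, the same workaround is, roughly (a sketch; com.mysql.jdbc.Driver must be on the classpath):

    // Instantiating the class runs com.mysql.jdbc.Driver's static
    // initializer, which calls DriverManager.registerDriver on itself.
    val driver = new com.mysql.jdbc.Driver()

    // Equivalent and more explicit:
    java.sql.DriverManager.registerDriver(new com.mysql.jdbc.Driver())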
