I use
df.write.mode("append").jdbc("jdbc:mysql://ip:port/database", "table_name", properties)
to insert rows into a MySQL table. In addition, I call Class.forName("com.mysql.jdbc.Driver") in my code to register the JDBC driver.
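For reference, the properties object is built roughly like this (the credentials are placeholders, and the explicit "driver" entry is my assumption about what the MySQL connector needs when running on the cluster):

```scala
import java.util.Properties

// Hypothetical connection properties; user/password are placeholders.
val properties = new Properties()
properties.put("user", "my_user")
properties.put("password", "my_password")
// Naming the driver class explicitly in the properties, in addition to
// the Class.forName call, so Spark can instantiate it on the executors:
properties.put("driver", "com.mysql.jdbc.Driver")

// The write itself (needs a live SparkSession and a DataFrame `df`):
// df.write.mode("append").jdbc("jdbc:mysql://ip:port/database", "table_name", properties)
```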
When I submit the Spark app:
spark-submit --class MY_MAIN_CLASS \
  --master yarn-client \
  --jars /path/to/mysql-connector-java-5.0.8-bin.jar \
  --driver-class-path /path/to/mysql-connector-java-5.0.8-bin.jar \
  MY_APPLICATION.jar
Submitting in yarn-client mode like this works for me.
But when I switch to cluster mode:
spark-submit --class MY_MAIN_CLASS \
  --master yarn-cluster \
  --jars /path/to/mysql-connector-java-5.0.8-bin.jar \
  --driver-class-path /path/to/mysql-connector-java-5.0.8-bin.jar \
  MY_APPLICATION.jar
it does not work. I also tried setting spark.executor.extraClassPath via --conf:
spark-submit --class MY_MAIN_CLASS \
  --master yarn-cluster \
  --jars /path/to/mysql-connector-java-5.0.8-bin.jar \
  --driver-class-path /path/to/mysql-connector-java-5.0.8-bin.jar \
  --conf spark.executor.extraClassPath=/path/to/mysql-connector-java-5.0.8-bin.jar \
  MY_APPLICATION.jar
but I still get the "No suitable driver" error for the JDBC URL.