Is it possible to connect to any RDBMS using Java with Spark?

val rdd = new org.apache.spark.rdd.JdbcRDD(
  sc,  // must be a SparkContext, not a SparkConf
  () => {
    Class.forName("com.mysql.jdbc.Driver")
    java.sql.DriverManager.getConnection("jdbc:mysql://mysql.example.com/?user=batman&password=alfred")
  },
  "SELECT * FROM BOOKS WHERE ? <= KEY AND KEY <= ?",
  0, 1000, 10,
  row => row.getString("BOOK_TITLE")
)
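The snippet above is Scala. For plain Java 8, Spark ships a static helper, JdbcRDD.create, that takes a JavaSparkContext and Java lambdas instead of Scala functions. A minimal sketch along those lines (the hostname, credentials, and BOOKS table are the question's placeholders; a running MySQL server and the MySQL JDBC driver on the classpath are assumed, so this is not runnable as-is):

```java
import java.sql.DriverManager;
import java.sql.ResultSet;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.rdd.JdbcRDD;

public class JdbcRddExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("JdbcRddExample").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // JdbcRDD.create is the Java-friendly entry point: the connection
        // factory and the row mapper are ordinary Java 8 lambdas.
        JavaRDD<String> titles = JdbcRDD.create(
            sc,
            () -> {
                Class.forName("com.mysql.jdbc.Driver");
                return DriverManager.getConnection(
                    "jdbc:mysql://mysql.example.com/?user=batman&password=alfred");
            },
            // The query must contain exactly two ? placeholders; Spark fills
            // them with per-partition bounds derived from 0..1000 below.
            "SELECT * FROM BOOKS WHERE ? <= KEY AND KEY <= ?",
            0, 1000, 10,
            (ResultSet row) -> row.getString("BOOK_TITLE"));

        titles.collect().forEach(System.out::println);
        sc.stop();
    }
}
```

Any database with a JDBC driver works the same way; only the driver class name and the connection URL change.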

I tried to convert the Scala code above to Java 8, but I get many compile errors.

1 answer

I ran into the same problem; it turned out to be a SQL parameter issue. The query string must contain two ? placeholders, e.g. SELECT * FROM books LIMIT ?, ? — Spark binds them to values derived from the lowerBound and upperBound arguments that the JdbcRDD constructor requires.


Source: https://habr.com/ru/post/974086/
