I have a DataFrame with a schema, and I have also created a matching table in HBase through Phoenix. I want to save this DataFrame to HBase using Spark. I followed the instructions at the link below and started a spark-shell with the Phoenix plugin dependencies:

spark-shell
However, I get an error even when I just run the read function:

val df = sqlContext.load(
  "org.apache.phoenix.spark",
  Map("table" -> "INPUT_TABLE", "zkUrl" -> hbaseConnectionString)
)

java.lang.NoClassDefFoundError: org/apache/spark/sql/DataFrame
I suspect I am doing something wrong. If there is another way to write data produced by Spark into HBase, I would appreciate it if you could share it with me.
https://phoenix.apache.org/phoenix_spark.html
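For context, this is the read/write path I understood from the linked Phoenix page, written with the DataFrameReader/DataFrameWriter API instead of the older sqlContext.load(). This is only a sketch: it assumes a Spark 1.x-era SQLContext, the phoenix-spark jar on the classpath, and reuses the INPUT_TABLE name and hbaseConnectionString variable from above (the phoenix-spark connector only supports SaveMode.Overwrite for writes):

```scala
import org.apache.spark.sql.SaveMode

// Read the Phoenix table into a DataFrame
// (equivalent of the sqlContext.load() call that fails above)
val df = sqlContext.read
  .format("org.apache.phoenix.spark")
  .option("table", "INPUT_TABLE")
  .option("zkUrl", hbaseConnectionString)  // ZooKeeper quorum for HBase
  .load()

// Write the DataFrame back to the Phoenix table;
// phoenix-spark requires SaveMode.Overwrite
df.write
  .format("org.apache.phoenix.spark")
  .mode(SaveMode.Overwrite)
  .option("table", "INPUT_TABLE")
  .option("zkUrl", hbaseConnectionString)
  .save()
```

This is what I expected to work based on the documentation, but both forms fail with the NoClassDefFoundError shown above.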