How to save a DataFrame to HBase?

I have a DataFrame with a schema, and I have also created a table in HBase with Phoenix. I want to save this DataFrame to HBase using Spark. I followed the instructions at the link below (the Phoenix Spark plugin page) and started spark-shell with the Phoenix plugin dependencies:

spark-shell --jars ./phoenix-spark-4.8.0-HBase-1.2.jar,./phoenix-4.8.0-HBase-1.2-client.jar,./spark-sql_2.11-2.0.1.jar 
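
For reference, the DataFrame save path described on that page looks roughly like this. This is only a sketch of what I understand it to be, not code I have working: OUTPUT_TABLE is a placeholder for a table already created in Phoenix whose columns match the DataFrame schema, and hbaseConnectionString is the ZooKeeper quorum.

 import org.apache.spark.sql.SaveMode

 val hbaseConnectionString = "zk-host:2181"  // placeholder ZooKeeper quorum

 df.write
   .format("org.apache.phoenix.spark")
   .mode(SaveMode.Overwrite)                 // the Phoenix docs use Overwrite; rows are UPSERTed
   .option("table", "OUTPUT_TABLE")          // placeholder Phoenix table name
   .option("zkUrl", hbaseConnectionString)
   .save()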

However, I get an error even when running the read function:

 val df = sqlContext.load("org.apache.phoenix.spark",
   Map("table" -> "INPUT_TABLE", "zkUrl" -> hbaseConnectionString))

 java.lang.NoClassDefFoundError: org/apache/spark/sql/DataFrame
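
From what I have read, org.apache.spark.sql.DataFrame is only a type alias for Dataset[Row] in Spark 2.x (there is no separate DataFrame class file anymore), so I suspect this NoClassDefFoundError means the Phoenix 4.8 jars were compiled against Spark 1.x. For completeness, the DataFrameReader-style read shown in the Phoenix docs is below; this is a sketch and assumes a phoenix-spark build that is compatible with the running Spark version:

 val df = sqlContext.read
   .format("org.apache.phoenix.spark")
   .options(Map("table" -> "INPUT_TABLE", "zkUrl" -> hbaseConnectionString))
   .load()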

I suspect I am doing something wrong. If there is another way to write data generated by Spark into HBase, I would appreciate it if you could share it with me.
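
One alternative from the same page that I have not tried yet is the RDD-based saveToPhoenix from the phoenix-spark module. A rough sketch, with the table name, column names and ZooKeeper quorum as placeholders:

 import org.apache.phoenix.spark._   // brings saveToPhoenix into scope

 // example rows for a Phoenix table OUTPUT_TABLE(ID BIGINT, COL1 VARCHAR, COL2 INTEGER)
 val dataSet = List((1L, "1", 1), (2L, "2", 2), (3L, "3", 3))

 sc.parallelize(dataSet)
   .saveToPhoenix(
     "OUTPUT_TABLE",
     Seq("ID", "COL1", "COL2"),
     zkUrl = Some(hbaseConnectionString)
   )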

https://phoenix.apache.org/phoenix_spark.html


Source: https://habr.com/ru/post/1259104/

