How do I configure HBase for use with Spark?

What are the steps to connect Spark to HBase?

I have the master addresses for both. Do I just add the HBase address to the Spark classpath?


This post on connecting Spark with HBase should be useful: http://www.vidyasource.com/blog/Programming/Scala/Java/Data/Hadoop/Analytics/2014/01/25/lighting-a-spark-with-hbase

"Do I just add the HBase address to the Spark classpath?"

No. You should put the HBase configuration files (in particular hbase-site.xml) on the Spark classpath; HBaseConfiguration.create() then picks them up automatically. If you cannot do that, set the properties in your code instead, for example:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;

    // "conf" is an existing Hadoop Configuration; the ZooKeeper quorum and client
    // port override whatever would otherwise come from hbase-site.xml.
    Configuration hConf = HBaseConfiguration.create(conf);
    hConf.set("hbase.zookeeper.quorum", "PDHadoop1.corp.CompanyName.com,PDHadoop2.corp.CompanyName.com");
    hConf.setInt("hbase.zookeeper.property.clientPort", 10000);
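
For context, here is a minimal sketch of how such an hConf could then be used to read an HBase table into a Spark RDD via TableInputFormat, roughly what the linked article walks through. The table name "my_table" and the application name are placeholders, not values from the question.

    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
    import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    // Tell the input format which table to scan; "my_table" is a placeholder.
    hConf.set(TableInputFormat.INPUT_TABLE, "my_table");

    JavaSparkContext sc = new JavaSparkContext(new SparkConf().setAppName("HBaseRead"));

    // Each record is a (row key, Result) pair coming straight from HBase.
    JavaPairRDD<ImmutableBytesWritable, Result> rows =
        sc.newAPIHadoopRDD(hConf, TableInputFormat.class,
                           ImmutableBytesWritable.class, Result.class);

    System.out.println("rows in my_table: " + rows.count());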

Source: https://habr.com/ru/post/1201763/
