How to add the hbase-site.xml configuration file when using spark-shell

I have the following simple code:

import org.apache.hadoop.hbase.client.ConnectionFactory
import org.apache.hadoop.hbase.HBaseConfiguration
val hbaseconfLog = HBaseConfiguration.create()
val connectionLog = ConnectionFactory.createConnection(hbaseconfLog)

I run this in the spark shell, and I get the following error:

 14:23:42 WARN zookeeper.ClientCnxn: Session 0x0 for server null, unexpected 
error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:30)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1081)

There are in fact many of these errors, and from time to time this one as well:

14:23:46 WARN client.ZooKeeperRegistry: Can't retrieve clusterId from 
Zookeeper org.apache.zookeeper.KeeperException$ConnectionLossException: 
KeeperErrorCode = ConnectionLoss for /hbase/hbaseid

On the Cloudera VM I can solve this by simply restarting hbase-master, regionserver and thrift, but here at my company I am not allowed to do that. I also solved this once by copying the hbase-site.xml file into the Spark conf directory, but I can't do that either. Is there a way to set the path to this particular file in the parameters of the spark shell?

1 answer

1) make sure your ZooKeeper quorum is up and reachable (a quick probe is sketched after this list)

2) copy hbase-site.xml to /etc/spark/conf, just as hive-site.xml is copied to /etc/spark/conf for Hive access

3) export SPARK_CLASSPATH=/a/b/c:/d/e/f (use : as the separator on Linux; the entries must be the directories containing hbase-site.xml and hive-site.xml, since classpath entries are directories or jars, not the XML files themselves)

as documented in the Hortonworks guide.
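For step 1, a quick reachability probe can be run straight from spark-shell; a minimal sketch, where zk-host stands in for one of the quorum hosts from hbase-site.xml and 2181 is the default ZooKeeper client port:

import java.net.{InetSocketAddress, Socket}

// throws java.net.ConnectException, like the trace above, if nothing is listening
val probe = new Socket()
probe.connect(new InetSocketAddress("zk-host", 2181), 2000)
probe.close()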

Alternatively, pass the location of hbase-site.xml to spark-shell when launching it.
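A minimal launch sketch; /path/to/hbase/conf is a placeholder for the directory that holds hbase-site.xml, and the executor setting assumes the same path exists on the worker nodes:

spark-shell \
  --driver-class-path /path/to/hbase/conf \
  --conf spark.executor.extraClassPath=/path/to/hbase/conf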

Or do it programmatically, from inside spark-shell:

import org.apache.hadoop.fs.Path
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.mapreduce.TableInputFormat

val conf = HBaseConfiguration.create()
conf.addResource(new Path("/home/spark/development/hbase/conf/hbase-site.xml")) // load the site file explicitly instead of relying on the classpath
conf.set(TableInputFormat.INPUT_TABLE, table_name) // table_name: the HBase table to read
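To tie this back to the question's snippet, the same conf can then be handed to ConnectionFactory; a minimal sketch, where listing table names is just an arbitrary connectivity check:

import org.apache.hadoop.hbase.client.ConnectionFactory

// reuses the conf built above, so nothing has to be copied into /etc/spark/conf
val connection = ConnectionFactory.createConnection(conf)
println(connection.getAdmin.listTableNames().mkString(", "))
connection.close()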

Source: https://habr.com/ru/post/1681900/

