Error starting spark shell

I just downloaded the latest Spark, and when I started the Spark shell, I got the following error:

java.net.BindException: Failed to bind to: /192.168.1.254:0: Service 'sparkDriver' failed after 16 retries!
    at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:393)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:389)

...
...

java.lang.NullPointerException
    at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:193)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:71)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
    at $iwC$$iwC.<init>(<console>:9)
...
...
<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql
              ^

Is there something I missed when setting up Spark?

+5
3 answers

Try setting the Spark environment variable SPARK_LOCAL_IP to the local IP address.

In my case, I was running Spark on an Amazon EC2 Linux instance. spark-shell stopped working with an error message similar to yours. I was able to fix it by adding a setting like the following to the Spark configuration file spark-env.conf.

export SPARK_LOCAL_IP=172.30.43.105

You can also set it in ~/.profile or ~/.bashrc.
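
If you only need it for a single run, you can also pass it inline when launching the shell. A minimal sketch (the loopback address is just an example; use whichever local IP applies to your machine):

SPARK_LOCAL_IP=127.0.0.1 ./bin/spark-shell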

Also check the host settings in /etc/hosts.
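
For example, the machine's hostname should resolve to a reachable address. A sketch of what such an /etc/hosts entry might look like (the hostname below is only a placeholder; use the output of hostname on your machine):

172.30.43.105   ip-172-30-43-105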

+4

This is a known issue; see SPARK-8162.

It appears to affect the 1.4.1 and 1.5.0 builds, but not 1.4.0.

+1

Set the following in .bashrc:

export SPARK_LOCAL_IP=172.30.43.105

cd $HADOOP_HOME/bin

hdfs dfsadmin -safemode leave

This takes the namenode out of safe mode.
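
If you want to confirm the state before or after, there is a corresponding query command (not part of the original answer):

hdfs dfsadmin -safemode get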

Also delete metastore_db from the Spark /bin directory; it will be recreated when needed.
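
A sketch of that cleanup step, assuming Spark is installed under $SPARK_HOME (adjust the path if metastore_db was created somewhere else, e.g. in the directory you launched spark-shell from):

rm -rf $SPARK_HOME/bin/metastore_db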

Then run

spark-shell --master "spark://localhost:7077"

and voilà, import sqlContext.implicits._ works again.

+1

Source: https://habr.com/ru/post/1015997/

