Try setting the SPARK_LOCAL_IP Spark environment variable to the local IP address.
In my case, I was running Spark on an Amazon EC2 Linux instance, and spark-shell stopped working with an error message similar to yours. I was able to fix this by adding a setting similar to the following to the Spark configuration file spark-env.conf:
export SPARK_LOCAL_IP=172.30.43.105
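If you are not sure which address to use, on a standard Linux install you can usually discover the instance's private IP with something like this (a sketch; the metadata URL is the EC2 instance metadata endpoint):

hostname -I
curl http://169.254.169.254/latest/meta-data/local-ipv4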
You can also set it in ~/.profile or ~/.bashrc.
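For example, to persist the setting in ~/.bashrc and apply it to the current shell (a sketch, reusing the example address from above):

echo 'export SPARK_LOCAL_IP=172.30.43.105' >> ~/.bashrc
source ~/.bashrc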
Also check the host settings in /etc/hosts.
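A stale or missing entry there can leave the hostname resolving to an address Spark cannot bind to. A working mapping would look something like this (the hostname below is hypothetical; substitute your instance's actual hostname):

172.30.43.105   ip-172-30-43-105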