TL;DR: How do I connect a local Spark driver to a Spark master through a SOCKS proxy?
We have a Spark cluster in place that sits behind a firewall that blocks most ports. We do have SSH access, so I can create a SOCKS proxy with ssh -D 7777 ...
This works great for viewing the web interface once my browser is configured to use the proxy, but I do not know how to make a local driver use it.
So far I have this, which obviously does not configure the proxy:
import org.apache.spark.{SparkConf, SparkContext}

val sconf = new SparkConf()
  .setMaster("spark://masterserver:7077")
  .setAppName("MySpark")
val sc = new SparkContext(sconf)
This prints the following messages 16 times before throwing an exception:
15/01/20 14:43:34 INFO Remoting: Starting remoting
15/01/20 14:43:34 ERROR NettyTransport: failed to bind to server-name/ip.ip.ip.ip:0, shutting down Netty transport
15/01/20 14:43:34 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/01/20 14:43:34 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
15/01/20 14:43:34 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
15/01/20 14:43:34 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
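The only client-side proxy hook I am aware of is the JVM's standard socksProxyHost / socksProxyPort system properties. The sketch below shows what I mean, assuming the tunnel listens on localhost:7777; I do not know whether Spark's Akka/Netty transport respects these properties at all, which is essentially what I am asking.

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical attempt: point the JVM's built-in SOCKS support at the SSH tunnel.
// Whether the driver's Akka/Netty layer actually routes its traffic through this is unclear.
System.setProperty("socksProxyHost", "localhost")
System.setProperty("socksProxyPort", "7777")

val sconf = new SparkConf()
  .setMaster("spark://masterserver:7077")
  .setAppName("MySpark")
val sc = new SparkContext(sconf)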