I have a general question about Apache Spark:
We have Spark scripts that consume Kafka messages. Problem: they randomly stop executing without any specific error ...
Some scripts appear to do nothing while they are running, and when I run one of them manually, it fails with this message:
ERROR SparkUI: Failed to bind SparkUI java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries!
So I'm wondering: is there a specific way to run the scripts in parallel?
All of them are in the same folder, and I start them using Supervisor. Spark is installed via Cloudera Manager 5.4, running on YARN.
This is how I run the script:
sudo -u spark spark-submit --class org.soprism.kafka.connector.reader.TwitterPostsMessageWriter /home/soprism/sparkmigration/data-migration-assembly-1.0.jar --master yarn-cluster --deploy-mode client
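For context on the error above: each Spark application starts its own web UI, binding to port 4040 by default and, on a conflict, retrying the next port up to `spark.port.maxRetries` times (16 by default), which matches the "failed after 16 retries" message when many jobs run on one host. A minimal sketch of a commonly suggested workaround, assuming the same jar and class as above, is to raise the retry count (or pin a distinct `spark.ui.port` per job, or disable the UI with `spark.ui.enabled=false`):

```shell
# Sketch, not a tested command: raise the UI-port retry budget so many
# concurrent applications on one host can each find a free port.
# spark.port.maxRetries and spark.ui.port are standard Spark configs.
sudo -u spark spark-submit \
  --master yarn-cluster \
  --class org.soprism.kafka.connector.reader.TwitterPostsMessageWriter \
  --conf spark.port.maxRetries=50 \
  /home/soprism/sparkmigration/data-migration-assembly-1.0.jar
```

Note two things about the original command: spark-submit options must come before the jar path (anything after the jar is passed to the application as its own arguments, so `--master yarn-cluster` was likely being ignored), and `--master yarn-cluster` already implies cluster deploy mode, so combining it with `--deploy-mode client` is contradictory; the sketch drops the conflicting flag.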
Thank you for your help!
Update: I changed the command and now run it like this (it now fails with an explicit message):
root@ns6512097:~