I used the AWS online console to start my EMR cluster with Apache Spark. I built a fat (assembly) JAR of my Spark app and uploaded it to an S3 bucket. When I try to submit it as a step using the Custom JAR step type, the step fails. Any pointers would be greatly appreciated.
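For context, here is a minimal sketch (via boto3 rather than the console) of the kind of step submission I'm attempting; the cluster ID, S3 path, and arguments are placeholders, not my real values. My understanding is that a Custom JAR step hands the JAR straight to the cluster (hadoop-jar style) rather than going through spark-submit, which may be relevant to the failure.

```python
import boto3

# Sketch of submitting a "Custom JAR" step to an existing EMR cluster.
# Cluster ID, bucket path, and program arguments below are placeholders.
emr = boto3.client("emr", region_name="us-east-1")

response = emr.add_job_flow_steps(
    JobFlowId="j-XXXXXXXXXXXXX",  # placeholder cluster ID
    Steps=[
        {
            "Name": "My Spark app (Custom JAR step)",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                # A Custom JAR step runs this JAR directly,
                # not through spark-submit.
                "Jar": "s3://my-bucket/my-spark-app-assembly.jar",  # placeholder
                "Args": ["arg1", "arg2"],  # placeholder program arguments
            },
        }
    ],
)
print(response["StepIds"])
```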