For a non-kerberized cluster: export HADOOP_USER_NAME=zorro before submitting jobs will do the trick; Spark picks up that identity.
Then make sure to unset HADOOP_USER_NAME if you want to revert to your default credentials for the rest of the shell script (or of an interactive shell session).
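If you find the export/unset dance error-prone, one option is a small helper that scopes the variable to a single command. This is a minimal sketch, not anything Spark provides; run_as and the example job name are hypothetical:

```shell
# Minimal sketch (assumed helper, not part of Spark): run one command with a
# temporary HADOOP_USER_NAME, leaving the rest of the shell session untouched.
run_as() {
  user="$1"; shift
  # A per-command environment assignment scopes the variable to this single
  # invocation, so no "unset" is needed afterwards.
  HADOOP_USER_NAME="$user" "$@"
}

# Hypothetical usage:
#   run_as zorro spark-submit --class com.example.MyJob myjob.jar
```

The design point is that `VAR=value command` only affects the environment of that one command, not the calling shell.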
For a kerberized cluster, the clean way to impersonate another account without trashing your other jobs/sessions (which probably depend on your default ticket cache) is something along these lines ...
export KRB5CCNAME=FILE:/tmp/krb5cc_$(id -u)_temp_$$
kinit -kt ~/.protectedDir/zorro.keytab zorro@MY.REALM
spark-submit ...........
kdestroy