Spark SQL (as of version 1.6) does not support bind variables.
P.S. What Ashrit suggests is not a bind variable: you build a new query string every time, and each time Spark must analyze the query, build an execution plan, and so on. The purpose of bind variables (in RDBMS systems, for example) is to reduce the cost of creating an execution plan, which can be expensive with a large number of connections. For that, Spark would need a dedicated API to parse the query once and then bind variables to it, and Spark does not have this functionality (as of today's release, Spark 1.6).
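For contrast, here is a minimal sketch of what real bind variables look like in an RDBMS, using Python's built-in sqlite3 module (the table and data are made up for illustration). The statement text with `?` placeholders stays constant, so the engine can reuse its prepared plan and only the bound values change between executions; with string concatenation, every distinct value produces a distinct statement that must be parsed and planned again:

```python
import sqlite3

# Set up a throwaway in-memory database (illustrative data only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "alice"), (2, "bob")])

# The '?' is a bind variable: one statement, parsed once, executed
# many times with different values. This is the API Spark SQL lacks.
query = "SELECT name FROM users WHERE id = ?"
for user_id in (1, 2):
    row = conn.execute(query, (user_id,)).fetchone()
    print(row[0])
```

With string interpolation instead (e.g. `f"SELECT name FROM users WHERE id = {user_id}"`), each loop iteration would hand the engine a textually different query, forcing a fresh parse and plan every time, which is exactly the overhead described above.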
Update 8/2018: as of Spark 2.3, there are still no bind variables in Spark.