Yes, you can send Spark SQL queries through Livy. However, there is currently no support for submitting a raw SQL query directly; it has to be wrapped in Python or Scala code.
Here are two examples of executing Spark SQL queries through Livy, using Python with the requests library to call the REST API and passing the Scala code to be executed in Spark as a string:
1) using Livy's %json magic ( https://github.com/apache/incubator-livy/blob/412ccc8fcf96854fedbe76af8e5a6fec2c542d25/repl/src/test/scala/org/apache/livy/repl/PythonInterpter ):
import json
import textwrap

import requests

host = "http://localhost:8998"  # your Livy endpoint
headers = {"Content-Type": "application/json"}

session_url = host + "/sessions/1"
statements_url = session_url + "/statements"
data = {
    "code": textwrap.dedent("""\
        val d = spark.sql("SELECT COUNT(DISTINCT food_item) FROM food_item_tbl")
        val e = d.collect
        %json e
        """)}
r = requests.post(statements_url, data=json.dumps(data), headers=headers)
print(r.json())
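The Scala snippet above can be generated for an arbitrary query. A small helper along these lines may be convenient (the function name and the quoting caveat are my own illustration, not part of Livy):

```python
import textwrap


def sql_statement(query):
    """Build a Livy statement payload that runs a Spark SQL query
    and emits the collected rows via the %json magic."""
    # Caveat: the query is interpolated into Scala source code, so it
    # must not itself contain unescaped double quotes.
    return {"code": textwrap.dedent("""\
        val d = spark.sql("%s")
        val e = d.collect
        %%json e
        """) % query}


payload = sql_statement("SELECT COUNT(DISTINCT food_item) FROM food_item_tbl")
print(payload["code"])
```

The returned dict can be passed to requests.post exactly as in the example above.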
2) using Livy's %table magic ( https://github.com/apache/incubator-livy/blob/412ccc8fcf96854fedbe76af8e5a6fec2c542d25/repl/src/test/scala/org/apache/livy/repl/Pythonpeerrereterpreter ):
import json
import textwrap

import requests

host = "http://localhost:8998"  # your Livy endpoint
headers = {"Content-Type": "application/json"}

session_url = host + "/sessions/21"
statements_url = session_url + "/statements"
data = {
    "code": textwrap.dedent("""\
        val x = List((1, "a", 0.12), (3, "b", 0.63))
        %table x
        """)}
r = requests.post(statements_url, data=json.dumps(data), headers=headers)
print(r.json())
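In both cases the POST only queues the statement; Livy executes it asynchronously, so the immediate response reports a state such as "waiting", and the actual result has to be fetched by polling GET session_url + "/statements/<id>" until the state becomes "available". A minimal sketch of the decoding step, run here against a hand-written example of a finished statement (the field names follow Livy's REST API; the concrete values are invented for illustration):

```python
def statement_result(statement):
    """Return the output data of a finished Livy statement, or None
    while it is still running; raise if the statement failed."""
    if statement["state"] != "available":
        return None
    output = statement["output"]
    if output["status"] != "ok":
        raise RuntimeError(output.get("evalue", "statement failed"))
    return output["data"]


# Shape of a finished statement as returned by GET .../statements/<id>;
# the row count 4 below is made up for illustration.
finished = {
    "id": 0,
    "state": "available",
    "output": {"status": "ok", "execution_count": 0,
               "data": {"application/json": [4]}},
}
print(statement_result(finished))
```

With the %json magic the result appears under the "application/json" key; with %table (and plain statements) it appears under "text/plain" or "application/vnd.livy.table.v1+json" instead.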