Calculate runtime for Spark SQL

I am trying to run several Spark SQL statements and want to measure their runtime.

One solution is to dig through the Spark logs, but I am wondering if there is a simpler way to do this. Something like the following:

    import time

    startTimeQuery = time.clock()
    df = sqlContext.sql(query)
    df.show()
    endTimeQuery = time.clock()
    runTimeQuery = endTimeQuery - startTimeQuery
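Or, since time.clock() is deprecated (and removed in Python 3.8), something along the same lines with time.perf_counter(), assuming sqlContext and query are defined as above:

    import time

    start = time.perf_counter()      # high-resolution wall-clock timer
    df = sqlContext.sql(query)       # builds the query plan (lazy)
    df.show()                        # action that actually runs the query
    elapsed = time.perf_counter() - start
    print("Query took {:.3f} s".format(elapsed))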
1 answer

If you use Scala, you can define a timing function like this:

    def show_timing[T](proc: => T): T = {
      val start = System.nanoTime()
      val res = proc                // call the code
      val end = System.nanoTime()
      println("Time elapsed: " + (end - start) / 1000 + " microsecs")
      res
    }

Then you can try:

 val df = show_timing{sqlContext.sql(query)} 
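Note that sqlContext.sql(query) is lazy, so the call above mostly measures parsing and planning of the query. To time the actual execution, you can also wrap an action; a minimal sketch reusing the same show_timing helper:

    val df = show_timing{sqlContext.sql(query)}   // plan construction only
    show_timing{df.show()}                        // forces execution of the query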
