"Invalid job type for this context" error in a spark SQL job with Spark job server

I created a Spark SQL job with Spark Job Server, using a HiveContext as in this example: https://github.com/spark-jobserver/spark-jobserver/blob/master/job-server-extras/src/spark.jobserver/HiveTestJob.scala

I managed to start the server, but when I run the application (my Scala class, which extends SparkSqlJob), I get the following response:

{
  "status": "ERROR",
  "result": "Invalid job type for this context"
}

Can someone tell me what is going wrong, or provide a detailed procedure for setting up Spark Job Server for Spark SQL?

Code below:

import com.typesafe.config.{Config, ConfigFactory}
import org.apache.spark._
import org.apache.spark.sql.hive.HiveContext
import spark.jobserver.{SparkHiveJob, SparkJobValid, SparkJobValidation}

object newHiveRest extends SparkHiveJob {

  // The job takes no parameters, so any request is considered valid
  def validate(hive: HiveContext, config: Config): SparkJobValidation = SparkJobValid

  def runJob(hive: HiveContext, config: Config): Any = {
    hive.sql("use default")
    // Hive quotes identifiers with backticks, not single quotes
    val maxRdd = hive.sql("select count(*) from `default`.`passenger`")

    maxRdd.count()
  }
}
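
For comparison, this is roughly what the SparkSqlJob variant that I mention above would look like. This is only my sketch, modelled on the SqlTestJob example in the same job-server-extras module: the object name is a placeholder, and since a plain SQLContext has no Hive metastore, the passenger table would first have to be registered with it (for example as a temporary table).

import com.typesafe.config.Config
import org.apache.spark.sql.SQLContext
import spark.jobserver.{SparkJobValid, SparkJobValidation, SparkSqlJob}

// Sketch only: the same count, written against SparkSqlJob / SQLContext
// instead of SparkHiveJob / HiveContext.
object newSqlRest extends SparkSqlJob {

  def validate(sql: SQLContext, config: Config): SparkJobValidation = SparkJobValid

  def runJob(sql: SQLContext, config: Config): Any = {
    // Assumes "passenger" is already visible to this SQLContext
    val countDf = sql.sql("select count(*) from passenger")
    countDf.count()
  }
}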
Source: https://habr.com/ru/post/1626124/

