Spark SQL package not found

I am completely new to Spark and I am having the following problem: when I try to import SQLContext with:

import org.apache.spark.sql.SQLContext; 

or try initializing the SQLContext variable explicitly:

 SQLContext sqlContext = new org.apache.spark.sql.SQLContext(sc); 

I get an error from Eclipse:

The import org.apache.spark.sql.SQLContext cannot be resolved

Spark is declared in my dependency file, and everything else resolves fine except SQLContext. Full code:

    package main.java;

    import java.io.Serializable;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.SQLContext;

    public class SparkTests {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("SparkMain");
            JavaSparkContext sc = new JavaSparkContext(conf);
            SQLContext sqlContext = new org.apache.spark.sql.SQLContext(sc);
            //DataFrame df = sqlContext
            System.out.println("\n\n\nHello world!\n\n\n");
        }
    }

When I try to compile it using mvn package, I get a compilation error:

org.apache.spark.sql package does not exist

Any ideas why the SQL package could not be found?

EDIT:

Pom.xml dependency file:

    <project xmlns="http://maven.apache.org/POM/4.0.0"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
      <groupId>edu.berkeley</groupId>
      <artifactId>simple-project</artifactId>
      <modelVersion>4.0.0</modelVersion>
      <name>Simple Project</name>
      <packaging>jar</packaging>
      <version>1.0</version>
      <dependencies>
        <dependency> <!-- Spark dependency -->
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-core_2.10</artifactId>
          <version>1.6.1</version>
        </dependency>
      </dependencies>
    </project>
1 answer

If you want to use Spark SQL or DataFrames in your project, you will have to add the spark-sql artifact as a dependency. In this particular case:

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.10</artifactId> <!-- matching Scala version -->
      <version>1.6.1</version>                <!-- matching Spark Core version -->
    </dependency>

should do the trick.
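For reference, a minimal sketch of how the dependencies section of the pom.xml from the question could look with both artifacts declared (versions and Scala suffix taken from the question; adjust them if your Spark or Scala version differs):

    <dependencies>
      <!-- Spark core (already present in the question's pom.xml) -->
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.6.1</version>
      </dependency>
      <!-- Spark SQL: provides org.apache.spark.sql.SQLContext and DataFrame -->
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.6.1</version>
      </dependency>
    </dependencies>

After updating the pom.xml, re-run mvn package, and refresh the project in Eclipse (with m2e this is typically right-click > Maven > Update Project) so the import resolves in the IDE as well.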

