Importing the Spark source code into IntelliJ: build error "not found: type SparkFlumeProtocol" and "not found: type EventBatch"

IntelliJ IDEA: 14.1.4

Spark source code version: 1.5

I am importing the Spark source code into IntelliJ IDEA, following the steps on the Spark website.

When creating and compiling the project, the errors below occur. I googled and tried the fix proposed on the Spark users mailing list, namely running "Generate Sources and Update Folders" from the Maven tool window for "Spark Project External Flume Sink", but the errors remain.

I am fairly sure this is a symbol resolution problem, since all the other classes resolve successfully. Maybe I am not using IntelliJ correctly? Any suggestions, please? Thank you very much.

    Error:(45, 66) not found: type SparkFlumeProtocol
      val transactionTimeout: Int, val backOffInterval: Int) extends SparkFlumeProtocol with Logging {
                                                                      ^
    Error:(70, 39) not found: type EventBatch
      override def getEventBatch(n: Int): EventBatch = {
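For background: SparkFlumeProtocol and EventBatch are not hand-written Scala classes. In the flume-sink module they are generated from an Avro protocol definition during the Maven build, so they only exist under the module's target folder once the generate-sources phase has run; this is why every other class resolves while these two types do not. Below is an illustrative, simplified Scala analogue of the generated types (a sketch only; the real classes are generated Java sources, and the member names follow the module's Avro protocol as I understand it):

    // Illustrative sketch only: a simplified Scala analogue of the types the
    // Avro Maven plugin generates for the flume-sink module. The real classes
    // are generated Java sources that appear under target/ after a build.
    trait SparkFlumeProtocol {
      // Pull up to n events from the sink as one batch.
      def getEventBatch(n: Int): EventBatch
      // Acknowledge or reject a previously received batch.
      def ack(sequenceNumber: CharSequence): Unit
      def nack(sequenceNumber: CharSequence): Unit
    }

    // An Avro record holding a sequence number and a batch of Flume events
    // (fields elided in this sketch).
    class EventBatch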
1 Answer

I solved the problem. It turned out that the source folder of the "Spark Project External Flume Sink" module was excluded by the default settings when importing the Spark source code.

What I've done:

  • File → Project Structure → Modules → "spark-streaming-flume-sink_2.10" → Sources
  • In the source tree, the "target" folder is initially marked as Excluded, but the SparkFlumeProtocol and EventBatch classes are generated into this folder.
  • Mark the "target" folder as "Sources", then set everything under "target" back to "Excluded" except "scala-2.10"; see the screenshot below.

This way the generated classes are included, and they can be resolved correctly after rebuilding the project.
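To verify the fix, a small throwaway snippet like the one below can be compiled inside the module (a hypothetical file, not part of the Spark tree); it compiles only once the generated classes under target are visible as sources:

    // Hypothetical verification snippet, not part of the Spark source tree.
    // It compiles only when the Avro-generated classes of the flume-sink
    // module are visible to the IDE as sources.
    package org.apache.spark.streaming.flume.sink.check

    import org.apache.spark.streaming.flume.sink.{EventBatch, SparkFlumeProtocol}

    object ResolutionCheck {
      def main(args: Array[String]): Unit = {
        // Avro-generated specific records provide a no-argument constructor.
        val batch = new EventBatch()
        println(batch.getClass.getName)
        // Referencing the generated protocol interface shows it resolves too.
        println(classOf[SparkFlumeProtocol].getName)
      }
    }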

(Screenshot: the Project Structure dialog with the module's "target" folder marked as Sources and everything under it except "scala-2.10" marked as Excluded.)

--- Update, June 8, 2016 ---

Or, more specifically, mark the whole generated path inside this module as a source root.

Pay attention to the folder's type and color in the dialog (i.e. how it is marked), since this determines the package name of the generated files:

 package org.apache.spark.streaming.flume.sink; 
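In other words, a file's declared package must match its path relative to the source root, so the root has to be the directory directly above org/. The exact path from the screenshot is not recoverable here; the layout below is an assumption about where Spark 1.x builds typically put the generated Avro sources:

    external/flume-sink/target/scala-2.10/src_managed/main/compiled_avro   <- mark this as a source root (assumed path)
        org/apache/spark/streaming/flume/sink/EventBatch.java              <- declared package matches the path under the root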

(Screenshot: the full path of the generated sources in this module marked as a source root.)
