Can I read a CSV represented as a string into Apache Spark using spark-csv?

I know how to read a CSV file into Spark using spark-csv (https://github.com/databricks/spark-csv), but I already have the CSV content as a string and would like to convert that string directly to a DataFrame. Is this possible?

+4
3 answers

Update: starting with Spark 2.2.x, there is finally a proper way to do this, using a Dataset.

import org.apache.spark.sql.{Dataset, SparkSession}
val spark = SparkSession.builder().appName("CsvExample").master("local").getOrCreate()

import spark.implicits._
val csvData: Dataset[String] = spark.sparkContext.parallelize(
  """
    |id, date, timedump
    |1, "2014/01/01 23:00:01",1499959917383
    |2, "2014/11/31 12:40:32",1198138008843
  """.stripMargin.lines.toList).toDS()

val frame = spark.read.option("header", true).option("inferSchema",true).csv(csvData)
frame.show()
frame.printSchema()

Older Spark versions

For older versions you can use CsvParser directly. Verified on Spark 1.6.0 with spark-csv_2.10-1.4.0:

import com.databricks.spark.csv.CsvParser
import org.apache.spark.sql.DataFrame

val csvData = """
|userid,organizationid,userfirstname,usermiddlename,userlastname,usertitle
|1,1,user1,m1,l1,mr
|2,2,user2,m2,l2,mr
|3,3,user3,m3,l3,mr
|""".stripMargin
val rdd = sc.parallelize(csvData.lines.toList)
val csvParser = new CsvParser()
  .withUseHeader(true)
  .withInferSchema(true)

val csvDataFrame: DataFrame = csvParser.csvRdd(sqlContext, rdd)
+5

If you don't need Spark to do the CSV parsing itself, you can first parse the string with a plain CSV library such as scala-csv:

val myCSVdata : Array[List[String]] = myCSVString.split('\n').flatMap(CSVParser.parseLine(_))

At this point you have the individual fields of each record, with quoting and embedded delimiters already handled, and you can drop the header line, clean values, etc.
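For illustration, here is a minimal sketch of what splitting a quoted CSV line involves; a library like scala-csv handles this (and many edge cases) for you. `splitCsvLine` is our own helper, not part of any library, and it is not a full RFC 4180 parser:

```scala
// Minimal sketch of quoted-CSV field splitting; illustration only.
def splitCsvLine(line: String): List[String] = {
  val fields  = scala.collection.mutable.ListBuffer.empty[String]
  val current = new StringBuilder
  var inQuotes = false
  var i = 0
  while (i < line.length) {
    val c = line(i)
    if (c == '"') {
      if (inQuotes && i + 1 < line.length && line(i + 1) == '"') {
        current += '"'              // doubled quote inside a quoted field
        i += 1
      } else inQuotes = !inQuotes   // toggle quoted state
    } else if (c == ',' && !inQuotes) {
      fields += current.toString    // field boundary outside quotes
      current.clear()
    } else current += c
    i += 1
  }
  fields += current.toString
  fields.toList
}
```

A quoted timestamp like the ones in the question stays a single field: `splitCsvLine("""1,"2014/01/01 23:00:01",1499959917383""")` yields three fields, not four.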

Then parallelize the parsed records into an RDD:

val myCSVRDD : RDD[List[String]] = sparkContext.parallelize(myCSVdata)

Next, map the records into instances of a case class whose fields mirror your CSV columns (a Person class, say), following the schema-inference-by-reflection example:

https://spark.apache.org/docs/latest/sql-programming-guide.html#inferring-the-schema-using-reflection

Finally, convert the RDD to a DataFrame:

import spark.implicits._
val myCSVDataframe = myCSVRDD.toDF()
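To illustrate the case-class mapping step without a Spark session, here is a minimal sketch; the `Person` class and `toPerson` helper are hypothetical, standing in for whatever schema your CSV actually has:

```scala
// Hypothetical case class mirroring a 3-column CSV row (the Spark docs'
// reflection example works the same way before calling .toDF()).
case class Person(id: Int, name: String, age: Int)

// Convert one parsed CSV record (a List of raw field strings) to a Person.
def toPerson(fields: List[String]): Person = fields match {
  case List(id, name, age) => Person(id.trim.toInt, name.trim, age.trim.toInt)
  case other => throw new IllegalArgumentException(s"expected 3 fields, got $other")
}

val rows   = List(List("1", "alice", "30"), List("2", "bob", "25"))
val people = rows.map(toPerson)
```

In the Spark version you would run the same `map(toPerson)` on the RDD and then call `.toDF()` on the result.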

+4

The accepted answer did not work for me on Spark 2.2.0, but it led me to what I needed via csvData.lines.toList:

import java.io.InputStream
import scala.io.Source
import spark.implicits._

val fileUrl = getClass.getResource(s"/file_in_resources.csv")
val stream = fileUrl.getContent.asInstanceOf[InputStream]
val streamString = Source.fromInputStream(stream).mkString

val csvList = streamString.lines.toList

spark.read
  .option("header", "true")
  .option("inferSchema", "true")
  .csv(csvList.toDS())
  .as[SomeCaseClass]
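One portability caveat with the `streamString.lines.toList` call above: on Scala 2.13 with JDK 11+, `.lines` on a String resolves to `java.lang.String#lines`, which returns a `java.util.stream.Stream` rather than a Scala `Iterator`, so the `.toList` no longer compiles as written. `linesIterator` behaves consistently:

```scala
// `String.lines` changed meaning across versions (Scala StringOps iterator vs.
// JDK 11 java.util.stream.Stream); `linesIterator` is the portable spelling.
val csvString = "id,name\n1,alice\n2,bob"
val lines: List[String] = csvString.linesIterator.toList
```

The resulting `List[String]` can then be turned into a `Dataset[String]` with `.toDS()` exactly as in the snippet above.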
+1

Source: https://habr.com/ru/post/1652252/

