You need to escape the special characters, since split accepts a regular expression:
.split("\\|\\|")
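For example, splitting a ||-delimited line might look like this (a minimal sketch; the sample line is made up for illustration):

// Hypothetical line using "||" as the field delimiter
val line = "alpha||beta||gamma"

// "||" must be escaped because String.split treats its argument as a regex;
// an unescaped "|" is an empty alternation and would split on every character
val fields: Array[String] = line.split("\\|\\|")
// fields: Array(alpha, beta, gamma)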
Converting to CSV is tricky because data strings can contain delimiters (inside quotation marks), newlines, or other syntax characters, so I would recommend using spark-csv:
val df = sqlContext.read
  .format("com.databricks.spark.csv")
  .option("delimiter", "||")
  .option("header", "true")
  .option("inferSchema", "true")
  .load("words.csv")
and
words.write
  .format("com.databricks.spark.csv")
  .option("delimiter", "||")
  .option("header", "true")
  .save("words.csv")