I am working on Spark 1.6.1 with Scala and am facing an unusual problem. When I create a new column that refers to another column created in the same statement, I get an "org.apache.spark.sql.AnalysisException".
WORKING:
val resultDataFrame = dataFrame.withColumn("FirstColumn",lit(2021)).withColumn("SecondColumn",when($"FirstColumn" - 2021 === 0, 1).otherwise(10))
resultDataFrame.printSchema()
DOES NOT WORK:
val resultDataFrame = dataFrame.withColumn("FirstColumn",lit(2021)).withColumn("SecondColumn",when($"FirstColumn" - max($"FirstColumn") === 0, 1).otherwise(10))
resultDataFrame.printSchema()
Here I create SecondColumn using the FirstColumn created in the same statement. Why does it fail as soon as I use an aggregate function such as avg or max? Please let me know how I can solve this problem.
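For reference, the only workaround I have sketched so far is to compute the aggregate separately and feed it back in as a plain value, roughly like this (variable names are just illustrative, and this assumes the same sqlContext.implicits._ import that makes $ available in the examples above):

import org.apache.spark.sql.functions.{lit, max, when}

// Build FirstColumn first, as in the examples above.
val withFirst = dataFrame.withColumn("FirstColumn", lit(2021))

// Compute the aggregate in a separate action and bring it to the driver as a plain Int.
val maxFirst = withFirst.agg(max($"FirstColumn")).first().getInt(0)

// Use the collected value as a literal in the dependent column.
val resultDataFrame = withFirst.withColumn("SecondColumn",
  when($"FirstColumn" - maxFirst === 0, 1).otherwise(10))

resultDataFrame.printSchema()

That avoids putting an aggregate expression inside withColumn, but I would prefer to express it in a single statement if Spark 1.6 allows it.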