I have a problem with the function withColumn in a Spark-Scala environment. I would like to add a new column to my DataFrame as follows:
+---+----+---+
|  A|   B|  C|
+---+----+---+
|  4|blah|  2|
|  2|    |  3|
| 56| foo|  3|
|100|null|  5|
+---+----+---+
to obtain:
+---+----+---+-----+
|  A|   B|  C|    D|
+---+----+---+-----+
|  4|blah|  2|  750|
|  2|    |  3|  750|
| 56| foo|  3|  750|
|100|null|  5|  750|
+---+----+---+-----+
Column D contains a single value, repeated N times, once for each row of my DataFrame.
The code:
var totVehicles : Double = df_totVehicles(0).getDouble(0);
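For context, here is a minimal sketch of how `df_totVehicles` might be built (the aggregation over `n_vehicles` in `df_carPark` is my assumption; `collect()` yields an `Array[Row]`, which is why indexing with `(0)` works):

```scala
import org.apache.spark.sql.functions.sum

// Hypothetical reconstruction: aggregate the total number of vehicles,
// collect the single-row result to the driver as an Array[Row], then
// read the first field of the first Row as a Double.
val df_totVehicles = df_carPark.agg(sum("n_vehicles").cast("double")).collect()
var totVehicles: Double = df_totVehicles(0).getDouble(0)
```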
The variable totVehicles returns the correct value, it works!
The second DataFrame calculates two fields (id_zipcode, n_vehicles), and I want to add a third column with the same constant value (750) in every row:
var df_nVehicles =
df_carPark.filter(
substring($"id_time",1,4) < 2013
).groupBy(
$"id_zipcode"
).agg(
sum($"n_vehicles") as 'n_vehicles
).select(
$"id_zipcode" as 'id_zipcode,
'n_vehicles
).orderBy(
'id_zipcode,
'n_vehicles
);
Finally, I add a new column using the function withColumn:
var df_nVehicles2 = df_nVehicles.withColumn(totVehicles, df_nVehicles("n_vehicles") + df_nVehicles("id_zipcode"))
But Spark returns this error:
error: value withColumn is not a member of Unit
var df_nVehicles2 = df_nVehicles.withColumn(totVehicles, df_nVehicles("n_vehicles") + df_nVehicles("id_zipcode"))
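For comparison, my understanding is that `withColumn` expects a column name (`String`) as its first argument and a `Column` as its second, while `totVehicles` is a `Double`. A sketch of the call I would expect to work (assuming `df_nVehicles` really is a DataFrame and not `Unit`) is:

```scala
import org.apache.spark.sql.functions.lit

// withColumn(colName: String, col: Column): pass the new column's name
// as a String, and wrap the scalar with lit() to turn it into a Column
// holding the same constant value in every row.
val df_nVehicles2 = df_nVehicles.withColumn("totVehicles", lit(totVehicles))
```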
Could you help me? Thank you very much!