Coming from R, I'm used to doing column operations easily. Is there an easy way to take this function that I wrote in Scala,
def round_tenths_place(un_rounded: Double): Double =
  BigDecimal(un_rounded).setScale(1, BigDecimal.RoundingMode.HALF_UP).toDouble
and apply it to one column of the DataFrame? Sort of like I was hoping this would do:
bid_results.withColumn("bid_price_bucket", round_tenths_place(bid_results("bid_price")) )
I have not found an easy way and am struggling to figure out how to do this. There should be something simpler than converting the DataFrame to an RDD, selecting the correct field from each row, and mapping the function over all the values, right? And also something more concise than registering the DataFrame as a SQL table and then writing a Spark SQL UDF?
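For context, here is roughly what I think the UDF route would look like. This is just a sketch: the rounding helper is plain Scala, while the Spark-specific lines (the `functions.udf` wrapper and the `withColumn` call on my `bid_results` DataFrame) are shown in comments because I haven't gotten them working.

```scala
// Pure rounding helper, no Spark dependency: round to one decimal place, half-up.
def roundTenthsPlace(unRounded: Double): Double =
  BigDecimal(unRounded).setScale(1, BigDecimal.RoundingMode.HALF_UP).toDouble

// With spark-sql on the classpath, I believe the column-wise application would be:
//   import org.apache.spark.sql.functions.udf
//   val roundTenthsPlaceUdf = udf(roundTenthsPlace _)
//   bid_results.withColumn("bid_price_bucket", roundTenthsPlaceUdf(bid_results("bid_price")))
```

Is wrapping the function like this really the intended way, or is there something more direct?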