I have a dataframe that contains a really large integer value, for example:
42306810747081022358
When I tried to convert it to a long, it worked in Java, but not in Spark, where I got:
NumberFormatException: For input string: "42306810747081022358"
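For context, the value is larger than Long.MAX_VALUE, which I assume is why the cast to a long fails. A quick plain-Java check (outside Spark) confirms the magnitude:

import java.math.BigInteger;

// Plain-Java sanity check: the value does not fit in a 64-bit long.
BigInteger value = new BigInteger("42306810747081022358");
System.out.println(value.compareTo(BigInteger.valueOf(Long.MAX_VALUE)) > 0); // prints true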
Then I tried to convert it to a Decimal (BigDecimal) value. Again, this is easy to do in Java, but in Spark I used:
dframe = dframe.withColumn("c_number", col("c_a").cast(new DecimalType()));
This way I don't get any exceptions, but I can see that all the resulting values are zero.
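For completeness, this is roughly the full cast attempt as I run it (the column names c_a and c_number are just from my example; the imports are the standard Spark SQL ones):

import static org.apache.spark.sql.functions.col;
import org.apache.spark.sql.types.DecimalType;

// Cast the string column to a decimal; no exception is thrown,
// but every value in c_number comes out as 0.
dframe = dframe.withColumn("c_number", col("c_a").cast(new DecimalType()));
// Print the precision/scale that new DecimalType() actually defaults to.
System.out.println(new DecimalType().precision() + "," + new DecimalType().scale());
dframe.select("c_a", "c_number").show();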
I also tried using a UDF for this, but I get the same result:
import java.math.BigDecimal;
import org.apache.spark.sql.api.java.UDF1;
import org.apache.spark.sql.types.DecimalType;
import static org.apache.spark.sql.functions.callUDF;

// UDF that parses the string column into a BigDecimal
UDF1<String, BigDecimal> cTransformer = new UDF1<String, BigDecimal>() {
    @Override
    public BigDecimal call(String aString) throws Exception {
        return new BigDecimal(aString);
    }
};

// Register the UDF with a DecimalType return type and apply it to column c_a
sqlContext.udf().register("cTransformer", cTransformer, new DecimalType());
dframe = dframe.withColumn("c_number", callUDF("cTransformer", dframe.col("c_a")));
And here again, all I get is a column with all zeros.
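This is how I am checking the output in both cases, in case it matters:

// Inspect the result: the schema shows c_number as a decimal column,
// but every row shows 0 instead of the original value.
dframe.printSchema();
dframe.select("c_a", "c_number").show();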
How should I proceed?