How to convert column values from string to decimal?

I have a dataframe that contains a really large integer value, for example:

42306810747081022358

When I tried to convert it to a long, it worked in Java but not under Spark, where I got:

    NumberFormatException: For input string: "42306810747081022358"
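For reference, the value itself has 20 digits while Long.MAX_VALUE (9223372036854775807) has only 19, so any 64-bit long representation overflows; BigDecimal, with arbitrary precision, holds it fine in plain Java. A minimal check (the class name is just for illustration):

import java.math.BigDecimal;

public class BigValueCheck {
    public static void main(String[] args) {
        String s = "42306810747081022358";
        // Long.parseLong(s) would throw NumberFormatException here:
        // the value exceeds Long.MAX_VALUE (9223372036854775807).
        BigDecimal d = new BigDecimal(s); // arbitrary precision: parses fine
        System.out.println(d);            // prints 42306810747081022358
    }
}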

Then I tried to convert it to a Decimal (BigDecimal) value. Again, this is easy in Java, but in Spark:

dframe.withColumn("c_number", col("c_a").cast(new DecimalType()));

This way I do not get any exception; however, all the resulting values are zero.

I also tried using a UDF for this purpose, but I get the same result:

UDF1<String, BigDecimal> cTransformer = new UDF1<String, BigDecimal>() {
    @Override
    public BigDecimal call(String aString) throws Exception {
        return new BigDecimal(aString);
    }
};
sqlContext.udf().register("cTransformer", cTransformer, new DecimalType());
dframe = dframe.withColumn("c_number", callUDF("cTransformer", dframe.col("c_a")));

And here again, all I get is a column of zeros.

How should I proceed?


Try:

dframe.withColumn("c_number", dframe.col("c_a").cast("decimal(38,0)"))

A Decimal has a default precision of 10 and a scale of 0. Ten digits is not enough for your 20-digit value, which is why the results come out empty.

To fix this, specify the precision and scale explicitly:

dframe.withColumn("c_number", dframe.col("c_a").cast(new DecimalType(38,0)))

where 38 is the maximum precision Spark allows, and 0 is the scale (no fractional digits).
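
The same default explains why the UDF in the question returned zeros: its result type was registered as new DecimalType(), i.e. Decimal(10, 0). Registering it with an explicit precision should work as well; a sketch reusing the question's sqlContext and column names:

import static org.apache.spark.sql.functions.callUDF;

import java.math.BigDecimal;

import org.apache.spark.sql.api.java.UDF1;
import org.apache.spark.sql.types.DataTypes;

UDF1<String, BigDecimal> cTransformer = new UDF1<String, BigDecimal>() {
    @Override
    public BigDecimal call(String aString) throws Exception {
        return new BigDecimal(aString);
    }
};
// Declare the return type wide enough for the data, not the (10,0) default.
sqlContext.udf().register("cTransformer", cTransformer,
        DataTypes.createDecimalType(38, 0));
dframe = dframe.withColumn("c_number", callUDF("cTransformer", dframe.col("c_a")));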

Source: https://habr.com/ru/post/1658733/

