How to increase the default precision and scale when reading from Oracle using spark-sql

Posted 2019-03-05 13:28

I am trying to load data from an Oracle table where a few columns hold floating-point values; sometimes they hold up to DecimalType(40,20), i.e. 20 digits after the decimal point. Currently I load the columns using:

val local_ora_df: DataFrameReader = ora_df
  .option("partitionColumn", "FISCAL_YEAR")
  .option("schema", schema)
  .option("dbtable", query)
val df = local_ora_df.load()

The resulting columns hold only 10 digits after the decimal point, i.e. decimal(38,10) (nullable = true). What should I do to increase the number of digits after the point when reading from Oracle using spark-sql?

1 Answer
Answered 2019-03-05 13:43

We can use .option("customSchema", "data DECIMAL(38, 15)") to increase it to 15 digits after the point, where data is the name of the column whose inferred type we want to override.
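For context, here is a minimal sketch of a full JDBC read that applies customSchema; the connection URL, credentials, table name, and the column name AMOUNT are hypothetical placeholders, not values from the question:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("OracleRead").getOrCreate()

// Hypothetical connection details; replace with your own environment's values.
val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL")
  .option("driver", "oracle.jdbc.OracleDriver")
  .option("user", "db_user")
  .option("password", "db_password")
  .option("dbtable", "FISCAL_DATA")
  // Override the inferred decimal(38,10) for the AMOUNT column.
  .option("customSchema", "AMOUNT DECIMAL(38, 15)")
  .load()

df.printSchema()  // AMOUNT: decimal(38,15) (nullable = true)

Note that Spark's DecimalType caps precision at 38, so an Oracle DECIMAL(40,20) cannot be represented exactly; the largest scale you can request is DECIMAL(38, 20), which leaves only 18 digits before the point. The customSchema option is available for JDBC sources since Spark 2.3.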
