I am trying to load data from an Oracle table where a few columns hold floating-point values; sometimes they hold up to DecimalType(40,20), i.e. 20 digits after the decimal point. Currently I load those columns using:
val local_ora_df: DataFrameReader = ora_df
val df = local_ora_df
  .option("partitionColumn", "FISCAL_YEAR")
  .option("schema", schema)
  .option("dbtable", query)
  .load()
The resulting column type is decimal(38,10) (nullable = true), i.e. only 10 digits after the point. What should I do to increase the number of digits after the point when reading from Oracle with Spark SQL?
You can use .option("customSchema", "data DECIMAL(38, 15)") to increase it to 15 digits after the point (here data is the column name; note the closing quote after the parenthesis).
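A minimal sketch of how this could fit into the read above. The connection details (url, user, password) and the column names AMOUNT and RATE are placeholders, not from the original question; customSchema accepts a comma-separated DDL string, so several columns can be overridden in one option:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("oracle-read").getOrCreate()

// Sketch only: connection details and column names are placeholders.
// "customSchema" takes a DDL-formatted string; Spark applies these types
// instead of its default JDBC decimal mapping for Oracle NUMBER columns.
val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:oracle:thin:@//dbhost:1521/SERVICE") // placeholder
  .option("user", "app_user")                               // placeholder
  .option("password", "app_password")                       // placeholder
  .option("dbtable", query)
  .option("partitionColumn", "FISCAL_YEAR")
  .option("lowerBound", "2015")  // partitionColumn also requires bounds
  .option("upperBound", "2025")  // (assumed values for illustration)
  .option("numPartitions", "4")
  .option("customSchema", "AMOUNT DECIMAL(38, 15), RATE DECIMAL(38, 20)")
  .load()

df.printSchema() // the overridden columns should show the requested scale
```

Keep the precision at or below 38: Spark's DecimalType caps precision at 38 digits, so DECIMAL(40, 20) cannot be represented directly and the scale has to fit within that limit.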