_logger.info("data size : " + saleData.size());

saleData.parallelStream().forEach(data -> {
    // Build one aggregate record per input row (double-brace initialization).
    SaleAggrData saleAggrData = new SaleAggrData() {
        {
            setCatId(data.getCatId());
            setRevenue(RoundUpUtil.roundUpDouble(data.getRevenue()));
            setMargin(RoundUpUtil.roundUpDouble(data.getMargin()));
            setUnits(data.getUnits());
            setMarginRate(ComputeUtil.marginRate(data.getRevenue(), data.getMargin()));
            setOtd(ComputeUtil.OTD(data.getRevenue(), data.getUnits()));
            setSaleDate(data.getSaleDate());
            setDiscountDepth(ComputeUtil.discountDepth(data.getRegularPrice(), data.getRevenue()));
            setTransactions(data.getTransactions());
            setUpt(ComputeUtil.UPT(data.getUnits(), data.getTransactions()));
        }
    };
    // Accumulate each record into a shared result container.
    salesAggrData.addSaleAggrData(saleAggrData);
});
The issue with this code: when I get the response from the DB and iterate over it with a parallel stream, the size of the aggregated data is different on every run, while with a sequential stream it works fine every time. I can't switch to a sequential stream because the data set is huge and the sequential run takes too long. Any lead would be helpful.
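For reference, one direction I am considering (not yet verified) is to stop mutating salesAggrData from inside forEach and let the stream collect the results itself, since collect() is documented to be safe to use with parallel streams. This is only a sketch; it assumes SaleAggrData has a no-arg constructor and the setters used above:

    import java.util.List;
    import java.util.stream.Collectors;

    // Sketch: build each aggregate as a pure mapping step; collect()
    // merges per-thread partial results, so no shared state is
    // mutated concurrently.
    List<SaleAggrData> aggregated = saleData.parallelStream()
            .map(data -> {
                SaleAggrData aggr = new SaleAggrData();
                aggr.setCatId(data.getCatId());
                aggr.setRevenue(RoundUpUtil.roundUpDouble(data.getRevenue()));
                aggr.setMargin(RoundUpUtil.roundUpDouble(data.getMargin()));
                aggr.setUnits(data.getUnits());
                aggr.setMarginRate(ComputeUtil.marginRate(data.getRevenue(), data.getMargin()));
                aggr.setOtd(ComputeUtil.OTD(data.getRevenue(), data.getUnits()));
                aggr.setSaleDate(data.getSaleDate());
                aggr.setDiscountDepth(ComputeUtil.discountDepth(data.getRegularPrice(), data.getRevenue()));
                aggr.setTransactions(data.getTransactions());
                aggr.setUpt(ComputeUtil.UPT(data.getUnits(), data.getTransactions()));
                return aggr;
            })
            .collect(Collectors.toList());

If salesAggrData has to remain the container, would wrapping its backing list with Collections.synchronizedList (or using a concurrent collection) be a reasonable alternative, or does that just hide the real problem?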