I am using LINQ to SQL to pull data from SQL Server and dump it into an iSeries server for further processing. More details on that here.
My problem is that it is taking about 1.25 minutes to process those 350 rows of data. I am still trying to decipher the results from the SQL Server Profiler, but there are a TON of queries being run. Here is a bit more detail on what I am doing:
using (CarteGraphDataDataContext db = new CarteGraphDataDataContext())
{
    var vehicles = from a in db.EquipmentMainGenerals
                   join b in db.EquipmentMainConditions
                       on a.wdEquipmentMainGeneralOID equals b.wdEquipmentMainGeneralOID
                   where b.Retired == null
                   orderby a.VehicleId
                   select a;

    et = new EquipmentTable[vehicles.Count()];

    foreach (var vehicle in vehicles)
    {
        // Move data to the array

        // Rates
        GetVehcileRates(vehicle.wdEquipmentMainGeneralOID);

        // Build the costs accumulators
        GetPartsAndOilCosts(vehicle.VehicleId);
        GetAccidentAndOutRepairCosts(vehicle.wdEquipmentMainGeneralOID);

        // Last Month's Accumulators
        et[i].lastMonthActualGasOil = GetFuel(vehicle.wdEquipmentMainGeneralOID) + Convert.ToDecimal(oilCost);
        et[i].lastMonthActualParts = Convert.ToDecimal(partsCost);
        et[i].lastMonthActualLabor = GetLabor(vehicle.VehicleId);
        et[i].lastMonthActualOutRepairs = Convert.ToDecimal(outRepairCosts);
        et[i].lastMonthActualAccidentCosts = Convert.ToDecimal(accidentCosts);

        // Move more data to the array

        i++;
    }
}
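From what I have read, the DataContext has a Log property that echoes every generated SQL statement, which should make the extra queries easier to spot than wading through the Profiler trace. A minimal diagnostic sketch (not part of the real program):

using (CarteGraphDataDataContext db = new CarteGraphDataDataContext())
{
    // Write every statement LINQ to SQL sends to the console; each pass
    // through the vehicle loop shows up as its own batch of SELECTs.
    db.Log = Console.Out;

    // ... same query and foreach loop as above ...
}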
The Get methods all look similar to:
private void GetPartsAndOilCosts(string vehicleKey)
{
    oilCost = 0;
    partsCost = 0;

    using (CarteGraphDataDataContext db = new CarteGraphDataDataContext())
    {
        try
        {
            var costs = from a in db.WorkOrders
                        join b in db.MaterialLogs on a.WorkOrderId equals b.WorkOrder
                        join c in db.Materials on b.wdMaterialMainGeneralOID equals c.wdMaterialMainGeneralOID
                        where (monthBeginDate.Date <= a.WOClosedDate && a.WOClosedDate <= monthEndDate.Date)
                              && a.EquipmentID == vehicleKey
                        group b by c.Fuel into d
                        select new
                        {
                            isFuel = d.Key,
                            totalCost = d.Sum(b => b.Cost)
                        };

            foreach (var cost in costs)
            {
                if (cost.isFuel == 1)
                {
                    oilCost = (double)cost.totalCost * (1 + OVERHEAD_RATE);
                }
                else
                {
                    partsCost = (double)cost.totalCost * (1 + OVERHEAD_RATE);
                }
            }
        }
        catch (InvalidOperationException)
        {
            oilCost = 0;
            partsCost = 0;
        }
    }

    return;
}
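One thing I am unsure about is that every one of these methods news up its own DataContext. An untested variation would be to pass the outer context in instead of opening a new one per call (same query as above, just trimmed down):

private void GetPartsAndOilCosts(CarteGraphDataDataContext db, string vehicleKey)
{
    oilCost = 0;
    partsCost = 0;

    var costs = from a in db.WorkOrders
                join b in db.MaterialLogs on a.WorkOrderId equals b.WorkOrder
                join c in db.Materials on b.wdMaterialMainGeneralOID equals c.wdMaterialMainGeneralOID
                where (monthBeginDate.Date <= a.WOClosedDate && a.WOClosedDate <= monthEndDate.Date)
                      && a.EquipmentID == vehicleKey
                group b by c.Fuel into d
                select new { isFuel = d.Key, totalCost = d.Sum(b => b.Cost) };

    foreach (var cost in costs)
    {
        // Fuel rows feed the oil accumulator, everything else is parts
        double total = (double)cost.totalCost * (1 + OVERHEAD_RATE);
        if (cost.isFuel == 1)
            oilCost = total;
        else
            partsCost = total;
    }
}

That still runs one query per vehicle, though, which brings me to the next thought.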
My thinking here is that cutting down the number of queries to the DB should speed up the processing. If LINQ does a SELECT for every record, maybe I need to load every record into memory first; something like the sketch below is what I have in mind.
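A rough sketch of that idea for the parts/oil piece only (untested, and the other Get methods would need the same treatment): one grouped query for the whole month inside the outer using block, then dictionary lookups in the vehicle loop instead of a query per vehicle.

// Inside the outer using block, before the foreach over vehicles
var monthCosts = (from a in db.WorkOrders
                  join b in db.MaterialLogs on a.WorkOrderId equals b.WorkOrder
                  join c in db.Materials on b.wdMaterialMainGeneralOID equals c.wdMaterialMainGeneralOID
                  where monthBeginDate.Date <= a.WOClosedDate && a.WOClosedDate <= monthEndDate.Date
                  group new { b.Cost, c.Fuel } by a.EquipmentID into g
                  select new
                  {
                      VehicleKey = g.Key,
                      OilCost = g.Where(x => x.Fuel == 1).Sum(x => x.Cost),
                      PartsCost = g.Where(x => x.Fuel != 1).Sum(x => x.Cost)
                  }).ToDictionary(x => x.VehicleKey);

foreach (var vehicle in vehicles)
{
    // One dictionary lookup instead of a round trip per vehicle
    if (monthCosts.ContainsKey(vehicle.VehicleId))
    {
        var monthCost = monthCosts[vehicle.VehicleId];
        // use monthCost.OilCost and monthCost.PartsCost in place of the
        // oilCost / partsCost fields that GetPartsAndOilCosts sets
    }
    // ... rest of the loop as before ...
}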
I still consider myself a beginner with C# and OOP in general (I do mostly RPG programming on the iSeries). So I am guessing I am doing something stupid. Can you help me fix my stupidity (at least with this problem)?
Update: Thought I would come back and update you on what I have discovered. It appears that the database was poorly designed, and whatever SQL LINQ was generating in the background for it was highly inefficient. I am not saying LINQ is bad; it was just bad for this database. I converted to a quickly thrown together .XSD setup and the processing time went from 1.25 minutes to 15 seconds. Once I do a proper redesign, I can only guess I'll shave a few more seconds off of that. Thank you all for your comments. I'll try LINQ again some other day on a better database.
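For anyone curious, the .XSD replacement is nothing clever; it boils down to pulling the month's rows over in one query and totalling them in memory. Roughly along these lines (the SQL, connection string, and column names here are simplified placeholders, not the real schema):

using System.Data;
using System.Data.SqlClient;

// One query for the whole month's cost rows; everything else happens
// against the DataTable in memory.
DataTable costRows = new DataTable();
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlDataAdapter adapter = new SqlDataAdapter(
    @"SELECT a.EquipmentID, c.Fuel, b.Cost
        FROM WorkOrders a
        JOIN MaterialLogs b ON a.WorkOrderId = b.WorkOrder
        JOIN Materials c ON b.wdMaterialMainGeneralOID = c.wdMaterialMainGeneralOID
       WHERE a.WOClosedDate BETWEEN @start AND @end", conn))
{
    adapter.SelectCommand.Parameters.AddWithValue("@start", monthBeginDate.Date);
    adapter.SelectCommand.Parameters.AddWithValue("@end", monthEndDate.Date);
    adapter.Fill(costRows);   // one round trip instead of one per vehicle
}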