I have the following class definition:
public class MyData
{
    public DateTime Expiration { get; private set; }
    public string Name { get; private set; }
    public double Price { get; private set; }
    public double Quantity { get; private set; }
}
I need to join this with a list of years:
IEnumerable<int> years = Enumerable.Range(1, 20);
The result will ultimately be displayed in a grid, with the y-axis representing the Name field and the x-axis representing the years. Each cell will contain the aggregated Quantity * Price for that Name and year.
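To make the target shape concrete, once I have flat { Name, Year, Value } results I expect to pivot them into one row per Name for display, roughly like this (a sketch of my intended binding, assuming a query that yields those three fields):

```csharp
// Pivot flat { Name, Year, Value } results into one row per Name,
// with a dictionary of values keyed by year for the grid columns.
var rows = query
    .GroupBy(r => r.Name)
    .Select(g => new
    {
        Name = g.Key,
        ValuesByYear = g.ToDictionary(r => r.Year, r => r.Value)
    });
```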
I am currently struggling with the syntax. I started off by joining the instances of MyData to the years and grouping as follows:
var myData = GetData();
var query = from data in myData
            join year in years
                on (data.Expiration.Year - DateTime.Now.Year) + 1 equals year
            group data by new
            {
                Year = (data.Expiration.Year - DateTime.Now.Year) + 1,
                Name = data.Name
            } into grouped
            select new
            {
                Name = grouped.Key.Name,
                Year = grouped.Key.Year,
                Value = grouped.Sum(d => d.Quantity * d.Price)
            };
This gives me the data aggregated as I need, but it obviously excludes any years where none of the MyData instances have a matching Expiration.
I can't seem to figure out how to modify the query to get the data I need. Once I include the years, my aggregation breaks down: I effectively end up with the sum of all the Quantity * Price values for a Name across all years, rather than on a year-by-year basis.