I'm seeing a weird behavior in my Entity Framework model. I've got a query that looks like this:
var rows = (from alarm in context.Alarms
            join temp in context.ListDetails on alarm.ListDetailId equals temp.ListDetailId into entries
            from entry in entries.DefaultIfEmpty()
            join read in context.Reads on alarm.ReadId equals read.ReadId
            join plate in context.Images on alarm.ReadId equals plate.ReadId
            where alarm.IActive == 1 && !alarm.TransmittedAlarm
            where read.IActive == 1
            where plate.IActive == 1 && plate.ImageTypeId == 2
            select new { alarm, entry, read, plate }).ToArray();
The query returns all columns in alphabetical order by column name, and it turns out that the first column is NULL for a few rows in the result set. When I expand the rows variable in the debugger, I see that for those rows the entire row is null!
EDIT: Some clarification.
By "first column", I mean the first column of the first row, i.e., in "SELECT A, B, C FROM ...", I mean A. It just happens that the query that Entity Framework builds returns all of the columns in the joined result set in alphabetical order, and the first one alphabetically is nullable and is null for some rows.
The column in question is not a primary key; if it were a primary key, it couldn't be null.
When Entity Framework processes the rows of the returned data into objects, it's looking at the value of the first column in each row. If that column is null, it's returning null for the row, instead of an object with the property that corresponds to that column set to null.
I don't believe this has anything to do specifically with a left outer join; it just happens that my query uses one. I haven't done any testing to verify this, however, so it's just a supposition.
Has anybody seen this before? Does anyone have a fix for this?
Tony
I can confirm that I have exactly the same trouble: when the SQL query formed by EF has NULL in the first column of a result row, it returns null instead of the entity. Concretely: if we use different criteria that do not produce NULL in the first column, the same row comes back as a normal, non-null entity. It seems to me that this is some kind of obscure behavior/bug in EF.
Update
After several hours of probing, I've found the following behavior. EF does return null for entities that have NULL in the first column of the result (which breaks the expected behavior), but it has several notions about which column to put first in the select list. So, after making my model precisely consistent with the actual database state (which means marking all NULLABLE database columns as such and replacing required navigation properties with optional ones), I managed to get it to work. I'm using Code First, so there are plenty of places to be explicit about the database schema. This means you should check the store-level definition of your model against your database; where that definition lives depends on whether you use the designer-based or code-based paradigm.
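As a sketch of what "making the model consistent" can look like in EF Code First, here is a hypothetical mapping (the entity and column names are invented for illustration, not taken from the question): the nullable FK column gets a nullable property type, and the relationship is declared optional rather than required.

```csharp
using System.Data.Entity; // EF 6 Code First

public class Alarm
{
    public int AlarmId { get; set; }

    // The database column is NULLABLE, so the property must be
    // declared int? rather than int to keep the mapping consistent.
    public int? ListDetailId { get; set; }

    // Optional navigation property matching the nullable FK.
    public virtual ListDetail ListDetail { get; set; }
}

public class ListDetail
{
    public int ListDetailId { get; set; }
    public string Description { get; set; }
}

public class AlarmContext : DbContext
{
    public DbSet<Alarm> Alarms { get; set; }
    public DbSet<ListDetail> ListDetails { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Declare the relationship as optional (HasOptional), not
        // required, so EF's store model agrees with the NULLABLE column.
        modelBuilder.Entity<Alarm>()
                    .HasOptional(a => a.ListDetail)
                    .WithMany()
                    .HasForeignKey(a => a.ListDetailId);
    }
}
```

With a required relationship here, EF's generated SQL and materialization logic would assume the column can never be NULL, which is exactly the mismatch described above.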
You use DefaultIfEmpty in your join. In this case EF generates a LEFT JOIN; in fact, this code is the well-known way to generate a LEFT JOIN with Linq2Sql or EF. The LEFT JOIN selects NULL values for all columns of the joined table if no row in that table matches the JOIN criterion. In fact, the only column that matters is the PK column of the entity: EF checks whether the PK value is NULL, decides that no entity exists, and puts null into entries. You then get this null as the entry property value of the results.
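The same DefaultIfEmpty pattern can be reproduced with LINQ to Objects, where the unmatched side really does come back as null and must be null-checked before use (the names and data below are made up for this sketch):

```csharp
using System;
using System.Linq;

class Alarm { public int AlarmId; public int? ListDetailId; }
class ListDetail { public int ListDetailId; public string Name; }

static class LeftJoinDemo
{
    static void Main()
    {
        var alarms = new[]
        {
            new Alarm { AlarmId = 1, ListDetailId = 10 },
            new Alarm { AlarmId = 2, ListDetailId = null }, // no matching detail
        };
        var details = new[] { new ListDetail { ListDetailId = 10, Name = "hotlist" } };

        var rows = (from alarm in alarms
                    join d in details on alarm.ListDetailId equals (int?)d.ListDetailId into entries
                    from entry in entries.DefaultIfEmpty() // LEFT JOIN: entry is null when unmatched
                    select new { alarm, entry }).ToArray();

        foreach (var row in rows)
        {
            // entry must be null-checked, exactly like the EF result.
            var name = row.entry != null ? row.entry.Name : "(none)";
            Console.WriteLine($"{row.alarm.AlarmId}: {name}");
        }
    }
}
```

Alarm 1 prints its matched detail name, while alarm 2 prints "(none)" because entries.DefaultIfEmpty() yielded a single null element for it.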