I am building a web application that searches its SQL
database for the query entered by the user - a simple search engine.
Let's suppose I have plenty of rows - more than a couple of thousand - and I want to make the call efficient.
I know that my goal is achievable with only one* SQL
query, and I would like to know whether LINQ to SQL
does this type of optimization.
I am aware that it uses deferred (lazy) execution
for its queries. I cannot benchmark it now because I don't have enough data in my database.
foreach (var word in words)
{
    var niqquh = number == 666 ? "" : Languages[number];
    var S = db.Uploads.Where(w => number == 666 ? true : (w.Language == niqquh));
    if (S.Count() > 20)
        S = S.Take(20);
    S = S.OrderBy(u => (u.Tags.Contains(word) ? 15 : 0) + (u.Name.Contains(word) ? 10 : 0) + (u.Description.Contains(word) ? 5 : 0));
    listOfList.Add(S.ToList());
}
As you can see here, I have the 'magic number' 666,
which is used to say "whatever language is OK".
Then, because I want to take only the top 20 elements, I call Count()
and then Take().
Then, I sort the results as needed.
The point is, I think that the lazy
expression gets evaluated at Count()
- therefore it cannot be optimized to fetch only the top 20 rows at the SQL query
level, but only at the client (ASP.NET) level. In that case I would basically be downloading the whole table to the client, which is terrible.
Does it get optimized or not? If it doesn't, how can I do this efficiently, even if I have to fall back to plain SQL
string statements?
*A query for each word
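If you want to check what actually goes over the wire without a full benchmark, LINQ to SQL lets you attach a TextWriter to DataContext.Log and it will print every generated SQL statement. Below is a minimal sketch using the names from the question (`db`, `Uploads`, `number`, `niqquh`):

    // Log every SQL statement this context executes to the console.
    db.Log = Console.Out;

    var S = db.Uploads
        .Where(w => number == 666 ? true : (w.Language == niqquh));

    // Nothing has been sent to the server yet (deferred execution).
    var top = S.Take(20).ToList(); // the logged statement should contain TOP (20)

This makes it easy to confirm, even with a near-empty database, whether Take() is translated into a TOP clause or the whole table is being pulled down.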
You are right, it will need to execute the statement to get the count. You don't need to check the count, however: there is no exception if you Take more than is there, you'll just get fewer items. So, change

    if (S.Count() > 20)  // this runs a query, yes...
        S = S.Take(20);  // this does not execute the query; it only limits the (unordered) query to 20 items

to

    S = S.Take(20);

Generally you want to apply the order before you limit the result, so it could be rewritten like this:

    var S = db.Uploads
        .Where(w => number == 666 ? true : (w.Language == niqquh))
        .OrderBy(u => (u.Tags.Contains(word) ? 15 : 0) + (u.Name.Contains(word) ? 10 : 0) + (u.Description.Contains(word) ? 5 : 0))
        .Take(20)
        .ToList();  // this runs a query

This way only one query runs, it fetches at most 20 rows, and you don't download the whole table.
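Applied to the whole loop from the question, the pattern might look like the sketch below. One assumption on my part: I've used OrderByDescending, since a relevance score presumably wants the highest-scoring matches first (the original OrderBy would put them last); drop back to OrderBy if ascending order really was the intent.

    foreach (var word in words)
    {
        var w = word; // copy the loop variable before capturing it in the lambda
        var niqquh = number == 666 ? "" : Languages[number];

        var top20 = db.Uploads
            .Where(u => number == 666 || u.Language == niqquh)
            .OrderByDescending(u => (u.Tags.Contains(w) ? 15 : 0)
                                  + (u.Name.Contains(w) ? 10 : 0)
                                  + (u.Description.Contains(w) ? 5 : 0))
            .Take(20)
            .ToList(); // one round trip per word; at most 20 rows come back

        listOfList.Add(top20);
    }

This keeps the one-query-per-word shape the question asks for, with the limit applied on the server.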