I am building a web application that searches its SQL database for the query entered by the user, a simple search engine. Suppose I have plenty of rows, more than a couple of thousand, and I want to make the call efficient.
I know that what I want is possible with only one* SQL query, and I would like to know whether LINQ-to-SQL performs this kind of optimization. I am aware that it uses lazy (deferred) evaluation for its queries. I cannot benchmark it right now because I don't have enough data in my database.
foreach (var word in words)
{
    // 666 is a sentinel meaning "any language is OK"
    var languageName = number == 666 ? "" : Languages[number];
    var S = db.Uploads.Where(w => number == 666 ? true : (w.Language == languageName));
    if (S.Count() > 20)
        S = S.Take(20);
    // score matches: tags weigh 15, name 10, description 5
    S = S.OrderBy(u => (u.Tags.Contains(word) ? 15 : 0)
                     + (u.Name.Contains(word) ? 10 : 0)
                     + (u.Description.Contains(word) ? 5 : 0));
    listOfList.Add(S.ToList());
}
As you can see, I use the 'magic number' 666 to mean "any language is OK". Then, since I want only the top 20 elements, I call Count() and then Take(). Finally, I sort the results as I need them.
The point is that I think the lazy expression gets evaluated at Count(), and therefore the query cannot be optimized to fetch only the top 20 rows at the SQL level; the trimming would happen on the client (ASP.NET) side instead. If so, I am basically downloading the whole table to the client, which is terrible. Does it get optimized or not? And if it doesn't, how can I do this efficiently, even if I have to go back to plain SQL string statements?
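For reference, here is a sketch of the shape I would *hope* translates into a single TOP-20 statement per word: ordering first and taking afterwards, with no separate Count() round trip (the names db.Uploads, Languages, number, words, and listOfList are the ones from my code above; whether LINQ-to-SQL really folds this into one query is exactly what I am asking):

```csharp
foreach (var word in words)
{
    // Defensive copy so the expression tree doesn't capture the shared
    // loop variable (matters on older C# compilers).
    var w = word;
    var languageName = number == 666 ? "" : Languages[number];

    var top20 = db.Uploads
        .Where(u => number == 666 || u.Language == languageName)
        // I presumably want the highest-scoring matches first,
        // hence OrderByDescending rather than OrderBy.
        .OrderByDescending(u => (u.Tags.Contains(w) ? 15 : 0)
                              + (u.Name.Contains(w) ? 10 : 0)
                              + (u.Description.Contains(w) ? 5 : 0))
        .Take(20);              // still deferred; nothing has executed yet

    // ToList() is the only call that should hit the database:
    // one query per word, hopefully SELECT TOP (20) ... ORDER BY CASE ...
    listOfList.Add(top20.ToList());
}
```

Note that Take(20) simply returns fewer rows when fewer exist, so the Count() > 20 check in my original code costs an extra SELECT COUNT(*) round trip for nothing, if I understand correctly.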
*A query for each word