Consider the following code, where dbContext is a SQL Server database context and Examples is a DbSet:
this.dbContext.Examples.Take(5).ToList();
Enumerable.Take(this.dbContext.Examples, 5).ToList();
The first line works as expected and is converted to SQL in the following manner:
SELECT TOP(5) * FROM Examples
However, the second line first fetches all rows and only then applies the Take operator in memory. Why is that?
Since I am using expressions to build a dynamic lambda, I have to use the second approach (Enumerable.Take):
var call = Expression.Call(
    typeof(Enumerable),
    "Take",
    new[] { typeof(Examples) },
    contextParam,
    Expression.Constant(5));
Unfortunately, the first approach does not work with expression trees, and the current architecture of the program forces me to build the lambda dynamically.
Why does the second approach fetch all rows, and how can I prevent that so I can use Take efficiently in expressions?
Answer
You’re not calling the same method. The first line is invoking Queryable.Take, not Enumerable.Take.
DbSet implements both IQueryable<> and IEnumerable<>, and since IQueryable<> itself implements IEnumerable<>, the compiler treats IQueryable<> as the more specific type. So when it resolves which Take extension method to call, it determines that Queryable.Take(...) is the right one, because that overload takes an IQueryable<> as its first parameter.
This is important because the IQueryable<> interface is what allows LINQ queries to be built as expression trees that get evaluated into SQL. The moment you start treating an IQueryable<> as an IEnumerable<>, you lose that behavior and can only iterate over the results of whatever query had been built prior to that point.
Try this:
Queryable.Take(this.dbContext.Examples, 5).ToList();
or this:
var call = Expression.Call(
    typeof(Queryable),
    "Take",
    new[] { typeof(Examples) },
    contextParam,
    Expression.Constant(5));
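To actually run the dynamically built call against the database, hand the tree back to the query provider. A sketch, assuming the entity type is named Example and the DbSet is dbContext.Examples (both assumptions, since the question does not show them):

```csharp
using System.Linq;
using System.Linq.Expressions;

// Hypothetical entity set; Example stands in for the element type of the query.
IQueryable<Example> source = dbContext.Examples;

var call = Expression.Call(
    typeof(Queryable),
    "Take",
    new[] { typeof(Example) },   // generic argument = the element type
    source.Expression,           // the query built so far
    Expression.Constant(5));

// CreateQuery wraps the tree in a new IQueryable; EF only translates it
// to SELECT TOP(5) ... FROM Examples when the query is enumerated.
var results = source.Provider.CreateQuery<Example>(call).ToList();
```

Note that the generic type argument passed to Expression.Call must be the element type of the query, not the type of the DbSet itself.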