EF6: Disable Query Plan Caching with Command Tree Interceptor

Asked 2019-04-03 18:20

I'm using IDbCommandTreeInterceptor to implement soft-delete functionality. Inside the standard TreeCreated method I check whether the given query command contains models with the soft-delete attribute. If it does, and the user has requested that soft-deleted objects be fetched too, I call my soft-delete visitor with shouldFetchSoftDeleted = true. This makes the query return all objects, both those with true and those with false values of the IsDeleted property.

public class SoftDeleteInterceptor : IDbCommandTreeInterceptor {
    public void TreeCreated(DbCommandTreeInterceptionContext interceptionContext) {
        // ... resolve the current DbContext and the query command tree (queryCommand) ...

        bool shouldFetchSoftDeleted = context != null && context.ShouldFetchSoftDeleted;

        this.visitor = new SoftDeleteQueryVisitor(ignoredTypes, shouldFetchSoftDeleted);

        var newQuery = queryCommand.Query.Accept(this.visitor);

        // ... replace interceptionContext.Result with a query tree built from newQuery ...
    }
}


public class SoftDeleteQueryVisitor : DefaultExpressionVisitor {

    // ... fields and constructor storing ignoredTypes and shouldFetchSoftDeleted ...

    public override DbExpression Visit(DbScanExpression expression)
    {
        // Skip the filter if soft-deleted items should be fetched as well
        if (this.shouldFetchSoftDeleted)
            return base.Visit(expression);

        // ...
        // TODO Apply `IsDeleted` filter.
    }
}

The problem arises when I first retrieve all objects (soft-deleted included) and later run the same query expecting only non-deleted objects. Something like this:

context.ShouldFetchSoftDeleted = true;
var retrievedObj = context.Objects.Find(obj.Id);

And then, in a new context instance (not the same context):

var retrievedObj = context.Objects.Find(obj.Id);

The second time, ShouldFetchSoftDeleted is set to false, which is what I want, but EF decides that this query is the same as the previous one and serves it from the cache. The cached query does not contain the filter and thus returns all objects (soft-deleted and not). The cache is not cleared when the context is disposed.

Now the question is whether there is a way, ideally, to mark the constructed DbCommand so that it does not get cached. Can this be done? Or is there a way to force query recompilation?

There are ways to avoid caching, but I would rather not have to change every query in the application just to fix this.
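For example, one per-query way (sketched below; the Item entity type is a placeholder for my real model) is to drop down to the underlying ObjectQuery and switch off plan caching for that one query, but that is exactly the kind of per-call change I'd like to avoid repeating everywhere:

using System.Data.Entity.Core.Objects;
using System.Data.Entity.Infrastructure;
using System.Linq;

// Reach the ObjectContext that backs the DbContext
var objectContext = ((IObjectContextAdapter)context).ObjectContext;
ObjectQuery<Item> query = objectContext.CreateObjectSet<Item>();   // Item is an assumed entity type
query.EnablePlanCaching = false;                                   // this query's plan is not placed in the plan cache
var retrievedObj = query.Where("it.Id = @id", new ObjectParameter("id", obj.Id)).FirstOrDefault();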

More info on Query Plan Caching can be found here.

Edit 1

I'm using a new context for each request, so object caching should not be the problem.

Edit 2

Here is the database log. The first call is with soft-delete retrieval enabled and the second is without it; the `...` parts are identical so I excluded them from the log. You can see that both requests produce identical SQL. The first one goes through tree creation (my TreeCreated interceptor runs) and the resulting tree is cached, so when the second query executes, the tree is retrieved from the cache and my soft-delete filter is not re-applied when it should be.

Opened connection at 16.5.2015. 2:34:25 +02:00

SELECT 
    [Extent1].[Id] AS [Id], 
    [Extent1].[IsDeleted] AS [IsDeleted], 
    ...
    FROM [dbo].[Items] AS [Extent1]
    WHERE [Extent1].[Id] = @p__linq__0


-- p__linq__0: '1' (Type = Int64, IsNullable = false)

-- Executing at 16.5.2015. 2:34:25 +02:00

-- Completed in 22 ms with result: SqlDataReader



Closed connection at 16.5.2015. 2:34:25 +02:00

Opened connection at 16.5.2015. 2:34:32 +02:00

SELECT 
    [Extent1].[Id] AS [Id], 
    [Extent1].[IsDeleted] AS [IsDeleted], 
    ...
    FROM [dbo].[Items] AS [Extent1]
    WHERE [Extent1].[Id] = @p__linq__0


-- p__linq__0: '1' (Type = Int64, IsNullable = false)

-- Executing at 16.5.2015. 2:34:32 +02:00

-- Completed in 16 ms with result: SqlDataReader



Closed connection at 16.5.2015. 2:34:32 +02:00


As I already stated, I executed each request in its own context like so:

        using (var context = new MockContext())
        {
            // Test overridden behaviour
            // This should return the just-deleted entity
            // Enable soft-delete retrieval
            context.ShouldFetchSoftDeleted = true;

            // Request 1 goes here
            // context.Items.Where(...).ToList()
        }

        using (var context = new MockContext())
        {
            // Request 2 goes here
            // context.Items.Where(...).ToList()
        }

4 Answers
Answer by 做自己的国王 · 2019-04-03 18:50

I know this question was asked some time ago, but since no post was marked as the answer, I'll share my recent experience with this issue. As Dave correctly explained, "It's important to distinguish between Query Plan Caching and Result Caching". Here the problem is Query Plan Caching, because:

A query plan cache entry is an optimized SQL instruction plan. These plans help make EF queries faster than "cold" queries. These plans are cached beyond any particular context.

So even if you create a new context, the problem remains, because the interceptor's output is not re-applied to queries served from the query plan cache. My solution is very simple: instead of relying only on the interceptor, add an extra Where clause to the query when ShouldFetchSoftDeleted = true. With that, EF sees a different query and does not reuse the wrongly cached one. The interceptor will still be called, but ShouldFetchSoftDeleted = true prevents your query visitor from applying the IsDeleted filter.

I'm using a Repository pattern, but I think the concept is clear.

public override IQueryable<TEntity> Find<TEntity>()
{
    var query = GetRepository<TEntity>().Find();

    if (!ShouldFetchSoftDeleted)
    {
        return query; // the interceptor applies the soft-delete filter
    }

    // The predicate is always true, but it changes the LINQ expression,
    // so EF compiles (and caches) a separate plan instead of reusing the unfiltered one.
    return Where<IDeletedInfo, TEntity>(query, x => x.IsDeleted == false || x.IsDeleted);
}
Answer by 聊天终结者 · 2019-04-03 18:56

It's important to distinguish between Query Plan Caching and Result Caching:

Caching in the Entity Framework

Query Plan Caching

The first time a query is executed, it goes through the internal plan compiler to translate the conceptual query into the store command (e.g. the T-SQL which is executed when run against SQL Server). If query plan caching is enabled, the next time the query is executed the store command is retrieved directly from the query plan cache for execution, bypassing the plan compiler.

The query plan cache is shared across ObjectContext instances within the same AppDomain. You don't need to hold onto an ObjectContext instance to benefit from query plan caching.

A query plan cache entry is an optimized SQL instruction plan. These plans help make EF queries faster than "cold" queries. These plans are cached beyond any particular context.

Object caching:

By default when an entity is returned in the results of a query, just before EF materializes it, the ObjectContext will check if an entity with the same key has already been loaded into its ObjectStateManager. If an entity with the same keys is already present EF will include it in the results of the query. Although EF will still issue the query against the database, this behavior can bypass much of the cost of materializing the entity multiple times.

In other words, object caching is a soft form of result caching. No other kind of second-level cache is available with Entity Framework unless you specifically add one; see Second-Level Caching in the Entity Framework and Azure.

AsNoTracking

Returns a new query where the entities returned will not be cached in the DbContext or ObjectContext

Context.Set<Objects>().AsNoTracking();

Or you can disable object caching for an entity set using the MergeOption.NoTracking option:

Will not modify cache.

context.Objects.MergeOption = MergeOption.NoTracking; 
var retrievedObj = context.Objects.Find(obj.Id);

As opposed to the AppendOnly option:

Will only append new (top level-unique) rows. This is the default behavior.

This is the default behavior you have been struggling with.
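If you are using DbContext rather than ObjectContext, DbSet does not expose MergeOption directly; here is a rough sketch of setting it through the underlying ObjectContext (the Item entity type is an assumption):

using System.Data.Entity.Core.Objects;
using System.Data.Entity.Infrastructure;
using System.Linq;

var objectContext = ((IObjectContextAdapter)context).ObjectContext;
var set = objectContext.CreateObjectSet<Item>();   // ObjectSet<T> derives from ObjectQuery<T>
set.MergeOption = MergeOption.NoTracking;          // results are not tracked in the state manager
var retrievedObj = set.FirstOrDefault(i => i.Id == obj.Id);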

Answer by Luminary・发光体 · 2019-04-03 19:01

Are you sure that your problem happens with all queries? In your example you have used Find(); what if you use ToList()? The problem doesn't happen, right?

For testing purposes, try using the Where method instead of Find(); I believe you won't have problems...

If the above theory is true, replace Find() with Where inside some kind of repository class. Then you don't need to change anything else in your code.

For example, in your repository class:

public YourClass Find(long id)
{
    // do not use Find here
    return context.Set<YourClass>().FirstOrDefault(i => i.Id == id); // or .Where(i => i.Id == id).FirstOrDefault();
}

In your business logic:

var user = repository.Find(id);

The Find() method documentation https://msdn.microsoft.com/en-us/library/system.data.entity.dbset.find%28v=vs.113%29.aspx says:

"...if an entity with the given primary key values exists in the context, then it is returned immediately without making a request to the store..."

So, I believe the problem is Find(). Using a repository pattern and replacing Find with Where is the easiest workaround I can imagine right now. Alternatively, instead of replacing it everywhere, you can check whether soft-delete retrieval is activated and then choose the appropriate method, as sketched below. What do you think about that?
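A rough sketch of that conditional variant (the Item type, the Items set, and the ShouldFetchSoftDeleted flag are taken from your question; the repository shape is an assumption):

public Item Find(long id)
{
    if (context.ShouldFetchSoftDeleted)
    {
        // Hit the store with a LINQ query so the soft-delete handling is re-evaluated
        return context.Items.FirstOrDefault(i => i.Id == id);
    }

    // Normal case: keep Find() and its in-memory lookup in the ObjectStateManager
    return context.Items.Find(id);
}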

A more difficult approach is to create a class that inherits from DbSet and overrides Find(), but that would be too complicated.

EDIT

To help us see what is happening, create a console application and log the database operation, like this:

using (var context = new BlogContext()) 
{ 
    context.Database.Log = Console.Write; 

    // Your code here... 
    // Call your query twice, with and without softdelete
}

Paste the log and then we'll see for sure whether the SQL is incorrect or the data is being cached.

EDIT 2

Ok... instead of adding the interceptor in the constructor of the configuration class, add it in the constructor of the context, like this:

// the DbContext class

private IDbCommandTreeInterceptor softDeleteInterceptor;

public DataContext()
    : base("YourConnection")
{
    // add the interceptor
    softDeleteInterceptor = new SoftDeleteInterceptor();
    DbInterception.Add(softDeleteInterceptor);
}

Then, inside your context class, create a method that removes the interceptor, like this:

public void DisableSoftDelete() 
{
     DbInterception.Remove(softDeleteInterceptor);
}

Call the method above when you want to disable the soft delete: context.DisableSoftDelete();
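A usage sketch of that idea (the Items set name is an assumption). Keep in mind that DbInterception is a process-wide registry, so an interceptor added in a context's constructor is in effect for every context in the AppDomain until it is removed:

using (var context = new DataContext())
{
    context.DisableSoftDelete();               // remove this instance's interceptor from the registry
    var everything = context.Items.ToList();   // soft-deleted rows should come back as well
}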

Answer by 我想做一个坏孩纸 · 2019-04-03 19:03

To update the soft-delete flag you can override the SaveChanges method, and to create the filter you can use a dbContext.Query<T>() method which applies the soft-delete filter automatically using an expression generator.
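A minimal sketch of that SaveChanges override, assuming an IDeletedInfo-style interface with an IsDeleted flag (both names are assumptions; the override itself is not shown in this answer):

public override int SaveChanges()
{
    foreach (var entry in ChangeTracker.Entries()
                                       .Where(e => e.State == EntityState.Deleted
                                                && e.Entity is IDeletedInfo))
    {
        // Turn the hard delete into an update that only flips the flag
        entry.State = EntityState.Modified;
        ((IDeletedInfo)entry.Entity).IsDeleted = true;
    }

    return base.SaveChanges();
}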

To filter on your soft-delete column, you can implement the following method in your DbContext:

public IQueryable<T> Query<T>() where T : class
{
    var ds = this.Set<T>() as IQueryable<T>;

    var entityType = typeof(T);

    // softDeleteSupported: whether T carries the soft-delete column
    if (!softDeleteSupported)
        return ds;

    ParameterExpression pe = Expression.Parameter(entityType);
    Expression compare = Expression.Equal(
        Expression.Property(pe, "SoftDeleted"),
        Expression.Constant(false));

    Expression<Func<T, bool>> filter =
        Expression.Lambda<Func<T, bool>>(compare, pe);

    return ds.Where(filter);
}
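A usage sketch (the Item entity type is an assumption; note that the method above filters on a property named SoftDeleted, so the property name in the expression has to match your model):

// Returns non-deleted items only; the expression-built filter is appended automatically
var activeItems = dbContext.Query<Item>()
                           .OrderBy(i => i.Id)   // further LINQ operators compose as usual
                           .ToList();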