Why is inserting entities in EF 4.1 so slow compared to ObjectContext?
C# Problem Overview
Basically, I insert 35000 objects within one transaction:
```csharp
using (var uow = new MyContext())
{
    for (int i = 1; i < 35000; i++)
    {
        var o = new MyObject()...;
        uow.MySet.Add(o);
    }
    uow.SaveChanges();
}
```
This takes forever!
If I use the underlying ObjectContext (via IObjectAdapter), it's still slow, but it takes around 20 seconds. It looks like DbSet<> is doing some linear searches, which takes a quadratic amount of time...
Anyone else seeing this problem?
C# Solutions
Solution 1 - C#
As already indicated by Ladislav in the comment, you need to disable automatic change detection to improve performance:
```csharp
context.Configuration.AutoDetectChangesEnabled = false;
```
This change detection is enabled by default in the DbContext API.
The reason why DbContext behaves so differently from the ObjectContext API is that, when automatic change detection is enabled, many more functions of the DbContext API call DetectChanges internally than functions of the ObjectContext API do.
Here you can find a list of those functions which call DetectChanges by default. They are:
- The Add, Attach, Find, Local, or Remove members on DbSet
- The GetValidationErrors, Entry, or SaveChanges members on DbContext
- The Entries method on DbChangeTracker
Especially Add calls DetectChanges, which is responsible for the poor performance you experienced.
In contrast to this, the ObjectContext API calls DetectChanges automatically only in SaveChanges, but not in AddObject and the other corresponding methods mentioned above. That's the reason why the default performance of ObjectContext is faster.
Why did they introduce this default automatic change detection in DbContext in so many functions? I am not sure, but it seems that disabling it and calling DetectChanges manually at the proper points is considered advanced and can easily introduce subtle bugs into your application, so use it with care.
Solution 2 - C#
A little empirical test with EF 4.3 Code First:
- Removed 1000 objects with AutoDetectChanges = true: 23 sec
- Removed 1000 objects with AutoDetectChanges = false: 11 sec
- Inserted 1000 objects with AutoDetectChanges = true: 21 sec
- Inserted 1000 objects with AutoDetectChanges = false: 13 sec
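A timing harness along these lines can reproduce the comparison (a sketch; TimeInserts is a hypothetical helper, and MyContext/MySet/MyObject are the types from the question):

```csharp
using System;
using System.Diagnostics;

static TimeSpan TimeInserts(bool autoDetectChanges, int count)
{
    using (var context = new MyContext())
    {
        context.Configuration.AutoDetectChangesEnabled = autoDetectChanges;

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < count; i++)
        {
            context.MySet.Add(new MyObject());
        }
        context.SaveChanges();
        sw.Stop();
        return sw.Elapsed;
    }
}

// Compare, e.g.:
// TimeInserts(true, 1000)  vs  TimeInserts(false, 1000)
```

Absolute numbers will vary with hardware and the database, but the relative gap between the two flag settings should be visible.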
Solution 3 - C#
In .NET Core 2.0 (EF Core) this was moved to:
```csharp
context.ChangeTracker.AutoDetectChangesEnabled = false;
```
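In EF Core the flag lives on ChangeTracker rather than Configuration, but the usage pattern is the same (a sketch, reusing the question's type names):

```csharp
using (var context = new MyContext())
{
    // EF Core: disable automatic DetectChanges during bulk work.
    context.ChangeTracker.AutoDetectChangesEnabled = false;

    for (int i = 1; i < 35000; i++)
    {
        context.MySet.Add(new MyObject());
    }

    // SaveChanges still detects changes on its own before writing.
    context.SaveChanges();
}
```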
Solution 4 - C#
Besides the answers you have found here, it is important to know that at the database level it is more work to insert than it is to update. The database has to extend/allocate new space. Then it has to update at least the primary key index. Indexes may also be updated during updates, but that is a lot less common. If there are any foreign keys, the database has to read those indexes as well to make sure referential integrity is maintained. Triggers can also play a role, although those can affect updates in the same way.
All that database work makes sense for the daily insert activity generated by user entries. But if you are just uploading an existing database, or have a process that generates a lot of inserts, you may want to look at ways of speeding that up, for example by postponing index maintenance to the end. Disabling indexes while inserting is a common approach. There are very complex optimizations that can be done depending on the case, and they can be a bit overwhelming.
Just know that, in general, inserts will take longer than updates.
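On the EF side, one common mitigation for large insert jobs (a sketch, not taken from the answers above; the batch size is an arbitrary assumption) is to save in batches and periodically recreate the context so the change tracker never grows large:

```csharp
const int batchSize = 1000;
MyContext context = null;
try
{
    context = new MyContext();
    context.Configuration.AutoDetectChangesEnabled = false;

    for (int i = 1; i <= 35000; i++)
    {
        context.MySet.Add(new MyObject());

        if (i % batchSize == 0)
        {
            context.SaveChanges();
            // Dispose and recreate to drop the accumulated tracked entities.
            context.Dispose();
            context = new MyContext();
            context.Configuration.AutoDetectChangesEnabled = false;
        }
    }
    context.SaveChanges(); // flush the final partial batch
}
finally
{
    if (context != null) context.Dispose();
}
```

Keeping the tracked-entity count bounded avoids the cost of the change tracker scanning an ever-growing set on each save.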