Sooner or later, every developer using Entity Framework (EF Core or EF6) faces the same question:
How can I insert my entities faster?
In EF6, saving data was painfully slow due to the ChangeTracker and how entities were saved. With EF Core, performance improved — especially on SQL Server — but inserts can still become a bottleneck.
The good news? There are faster ways.
In this article, we’ll look at the 4 best solutions for bulk inserts in EF:

- Optimizing how you use SaveChanges
- Database provider APIs such as SqlBulkCopy, MySqlBulkCopy, and others
- Free third-party libraries
- Entity Framework Extensions

If you are using EF Core, make sure to also read the EF6 section. Many misconceptions about EF Core performance come from how things worked in the old EF6 days.
EF6 had no built-in support for true bulk inserts. All inserts were executed one row at a time, making large saves painfully slow.
The only area developers could really optimize was the ChangeTracker — and that was THE major problem.
As more entities were tracked, performance dropped drastically — even before calling SaveChanges().
How bad could the ChangeTracker get? Some developers reported that inserting thousands of entities took several hours, and the worst part was that nothing had been saved yet.
That’s why most of the common “performance tricks” for EF6 focused on reducing the overhead caused by the ChangeTracker.
Because of this limitation, it was the perfect time to use third-party libraries such as Entity Framework Extensions, which quickly became the go-to solution for inserting large numbers of entities efficiently.
DetectChanges

In EF6, the real performance killer was the ChangeTracker.
Entities weren’t even saved yet, and it could already take minutes or even hours just to add them into the tracker.
The main culprit was DetectChanges().
By default, it was called for every entity added with the Add method.
Each time it ran, it didn’t just check the new entity — it re-scanned all entities already being tracked.
And the worst part? Performance got even worse when your entity had relationships, even if those relationships weren’t populated.
Take the following benchmark as an example. It measures how many milliseconds it takes to add one entity to the ChangeTracker, depending on how many items are already tracked (the EntityCount):
![]()
As shown in the benchmark results, with 1 million entities already in the ChangeTracker, adding a single new entity took roughly one full second.
Yes, you read that right — it took a full second just to add a single entity to the ChangeTracker. And remember, we haven’t even saved it yet!
At that speed, you could only add 3,600 entities per hour in the ChangeTracker — that’s how easily the process could stretch into many hours or even days in real-world scenarios.
Adding entities with graphs in EF6 was nothing short of a nightmare, especially if you were using the Add method incorrectly.
👉 That’s why every EF6 workaround focused on one thing: reducing the overhead of DetectChanges() in the ChangeTracker.
AddRange Instead of Add

When adding many entities in EF6, using Add inside a loop was extremely inefficient.
The reason is simple:
- Add triggers DetectChanges(), which scans all entities in the ChangeTracker.
- Adding 10,000 entities means DetectChanges() runs 10,000 times — each time on the entire tracked list.

In the source code:
- The Add method calls ActOnSet, which in turn calls DetectChanges() before adding the entity.
- The AddRange method calls DetectChanges() only once before adding all entities in a single call to ActOnSet, which this time does not call DetectChanges() again.

Using the following benchmark, the difference is dramatic:
![]()
Benchmark results (as the number of entities grows):

- The Add method starts slowing down.
- Add gets clearly slower, while AddRange scales smoothly.
- Add becomes 83× slower and uses 325× more memory.
- Add becomes completely unusable — 1,946× slower and 3,275× more memory.

This small change avoids thousands of unnecessary checks and makes adding entities dozens to hundreds of times faster, especially with large collections.
// Slower: DetectChanges runs for every entity
foreach (var customer in customers)
{
    context.Customers.Add(customer);
}

// Faster: DetectChanges runs only once
context.Customers.AddRange(customers);
AutoDetectChangesEnabled

By now, you already know that the main performance problem in EF6 was how many times the DetectChanges() method was called.
Another effective way to improve performance is to temporarily disable automatic change detection, add all your entities, and then re-enable it before saving.
using (var context = new EntityContext())
{
    context.Configuration.AutoDetectChangesEnabled = false;

    for (int i = 0; i < 2000; i++)
    {
        // Add is cheap here: DetectChanges() is no longer triggered for every entity
        context.Customers.Add(new Customer { /* ...set properties... */ });
    }

    // Re-enable before saving so SaveChanges runs DetectChanges() once
    context.Configuration.AutoDetectChangesEnabled = true;
    context.SaveChanges();
}
As shown in this benchmark, the performance difference between Add and AddRange becomes minimal when AutoDetectChangesEnabled is temporarily turned off:
![]()
You’ll often see people suggesting to never re-enable AutoDetectChangesEnabled or to disable ValidateOnSaveEnabled as well.
Sure — that’s faster, but it can also lead to problems: changes made later may go undetected, and skipping validation means errors only surface in the database. You do want your entities validated, right?
Calling DetectChanges() once isn’t a problem, even with a million entities.
Remember — the real issue was that it used to run thousands of times, once per added entity.
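If you prefer to leave automatic detection off, another option is to trigger that single scan yourself right before saving. A minimal sketch, reusing the EntityContext and customers from the earlier examples:

using (var context = new EntityContext())
{
    context.Configuration.AutoDetectChangesEnabled = false;

    context.Customers.AddRange(customers);

    // One explicit scan of the tracked entities, instead of one scan per added entity
    context.ChangeTracker.DetectChanges();
    context.SaveChanges();
}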
Another way to improve performance in EF6 is to save entities in smaller batches instead of all at once.
The idea is simple:
- The more entities you add to the ChangeTracker, the slower it becomes.
- Saving in smaller batches, each with a fresh context, keeps the number of tracked entities low.

Here’s an example of saving entities in chunks of 1,000:
const int batchSize = 1000;

for (int i = 0; i < customers.Count; i += batchSize)
{
    // A fresh context per batch keeps the ChangeTracker small
    using (var context = new EntityContext())
    {
        var batch = customers.Skip(i).Take(batchSize).ToList();
        context.Customers.AddRange(batch);
        context.SaveChanges();
    }
}
Using the following benchmark:
![]()
We can see that:
- Batching brings little extra benefit if you already use the Add or AddRange methods correctly.
- The Add method remains the slowest, even when batching, but since each context only tracks a limited number of entities, the slowdown is much less noticeable.

In short, batching can help if you’re forced to use the Add method incorrectly, but it doesn’t provide any additional benefit over the two previous solutions.
Why this helps
- Each batch uses a fresh context, which keeps the ChangeTracker light and avoids the slowdown caused by massive collections.

In EF6, performance issues were mostly caused by how entities were added to the ChangeTracker, since there were no built-in bulk operations — unless you used a third-party library such as Entity Framework Extensions.
With EF Core, things are different. The performance issues related to the ChangeTracker have been fixed, and overall performance has greatly improved.
However, when you search for ways to improve performance even further, you’ll often find misleading advice online about “bulk inserts”.
Let’s clear things up.
Lie #1: AddRange or AddRangeAsync performs a Bulk Insert

Does calling AddRange mean EF Core will do a real bulk insert?
No. Using Add or AddRange has no impact on how entities are saved. The AddRange method has only one job — to add entities to the ChangeTracker, not to decide how they will be saved in the database.
What’s more, unlike EF6, it doesn’t matter whether you use Add or AddRange for performance. Why?
Because in EF Core, DetectChanges() is executed only once during SaveChanges if AutoDetectChangesEnabled is set to true.
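As a quick illustration, here is a minimal sketch (reusing the EntityContext and customers from the EF6 examples, now assumed to be an EF Core DbContext). Both variants trigger DetectChanges() a single time, inside SaveChanges():

using (var context = new EntityContext())
{
    // Looks like the "slow" EF6 pattern, but is fine in EF Core:
    // Add no longer triggers DetectChanges() for each entity
    foreach (var customer in customers)
    {
        context.Customers.Add(customer);
    }

    // Equivalent in performance:
    // context.Customers.AddRange(customers);

    context.SaveChanges(); // DetectChanges() runs once here
}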
Using the following benchmark:
![]()
We can clearly see that there is no difference in performance whether you use the Add or AddRange method.
So in short:
- AddRange does not perform a Bulk Insert — it only adds entities to the ChangeTracker
- AddRange is not faster than Add in EF Core

Lie #2: SaveChanges already does a Bulk Insert

Yes and no. There have been major improvements since EF6, but saying SaveChanges performs a real bulk insert isn’t quite accurate.
Here’s what actually happens, depending on the provider:
- On SQL Server, EF Core batches inserts into a MERGE statement to insert multiple rows at once, a similar strategy that Entity Framework Extensions introduced back in 2014.
- A true bulk insert, by contrast, uses SqlBulkCopy under the hood, which is much faster for large datasets.

The truth: SaveChanges batches commands, which is a big improvement over EF6, but it is still not a real bulk insert.
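If you want to tune how aggressively SaveChanges batches those statements, the SQL Server provider exposes a MaxBatchSize option. A minimal sketch (the connection string and the EntityContext/Customer types are placeholders):

using Microsoft.EntityFrameworkCore;

public class EntityContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options.UseSqlServer(
            "Server=.;Database=Store;Trusted_Connection=True;", // placeholder connection string
            sql => sql.MaxBatchSize(1000));                      // max statements grouped per round-trip
}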
Lie #3: SaveChanges makes a database round-trip for every entity

This is another myth carried over from EF6. In EF6, inserts were indeed sent one by one, meaning every entity caused a separate round-trip to the database.
But in EF Core, that’s no longer true — as we just saw in Lie #2.
On SQL Server, for example, EF Core groups inserts into batches and uses MERGE to insert multiple rows at once.

So no — SaveChanges does not make a database round-trip for every entity (except for SQLite).
That misconception belongs to EF6, not EF Core.
If you want full control and maximum performance, most database providers expose their own bulk insert APIs. These APIs bypass Entity Framework completely and write data directly to the database.
Pros

- Maximum raw performance: data is written directly to the database with no EF overhead.
- Full control over how rows are inserted.

Cons

- Bypasses Entity Framework entirely: no change tracking, no entity mapping, no relationships.
- Requires provider-specific code and manual data preparation (for example, building a DataTable).
Examples
- SQL Server: SqlBulkCopy (docs)
- PostgreSQL: BinaryCopy (docs)
- MySQL: MySqlBulkCopy (docs)
- Oracle: OracleBulkCopy (docs)
- Oracle: Array Binding (docs)

Here’s a simple example for SQL Server:
// 'connection' is an open SqlConnection; 'dataTable' matches the Customers table schema
using (var bulk = new SqlBulkCopy(connection))
{
    bulk.DestinationTableName = "Customers";
    bulk.WriteToServer(dataTable);
}
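The MySQL equivalent looks very similar. A sketch using MySqlConnector’s MySqlBulkCopy (it requires AllowLoadLocalInfile=true in the connection string; connection and dataTable are assumed to exist, as in the SQL Server example):

using MySqlConnector;

// 'connection' is an open MySqlConnection (AllowLoadLocalInfile=true in the connection string);
// 'dataTable' has the same column layout as the Customers table
var bulk = new MySqlBulkCopy(connection)
{
    DestinationTableName = "Customers"
};
bulk.WriteToServer(dataTable);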
Using these APIs directly works great for basic scenarios, but they quickly become limited when you need to handle more advanced cases such as returning identity values, mapping related entities, or supporting several database providers with a single code base.
That’s why most developers prefer to use a professional library like Entity Framework Extensions, which takes care of all these details and is already well-tested with thousands of unit tests.
For EF6, there are currently no free third-party libraries that we can recommend.
A few exist, but they haven’t been updated since 2019 and are no longer supported.
Using an unsupported library is not something we would recommend in a production environment — you can try them at your own risk, but expect limitations and potential issues.
For EF Core, the only free option worth mentioning at this moment is EFCore.BulkExtensions.MIT — a community fork of the original library, released under the MIT license.
It’s completely free for both personal and commercial use, supports EF Core 6 to 9, and includes a few extra bug fixes not merged into the main branch.
You can view benchmarks and details here: 👉 Learn EF Core - EFCore.BulkExtensions.MIT
⚠️ Important: Don’t confuse it with EFCore.BulkExtensions, which now requires a paid license for most companies.
While the MIT fork works well for basic bulk operations, it’s not actively maintained, offers limited community support, and has partial compatibility outside of SQL Server.
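If you do go with the MIT fork, usage is a one-liner. A sketch assuming the EFCore.BulkExtensions namespace and the customers list used throughout this article:

using EFCore.BulkExtensions;

// Performs a real bulk insert instead of tracked inserts through SaveChanges
await context.BulkInsertAsync(customers);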
Entity Framework Extensions is the library that most developers recommend when it comes to bulk operations in Entity Framework.
It supports both EF6 and EF Core, and is the most complete bulk extension library on the market. Created in 2014, it is now trusted by 5,000+ customers worldwide.
The library includes everything you need for high-performance operations: BulkInsert, BulkUpdate, BulkDelete, BulkMerge, BulkSynchronize, and BulkSaveChanges.
All operations are also highly customizable, giving you full control over how your data is handled.
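For example, a bulk insert is a single call, and you can also keep your existing SaveChanges flow. A sketch reusing the context and customers from the earlier examples:

// Option 1: insert directly, bypassing the ChangeTracker
context.BulkInsert(customers);

// Option 2: keep your existing code and swap SaveChanges for the bulk version
context.Customers.AddRange(customers);
context.BulkSaveChanges();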
Advantages
- Maximum performance by relying on each provider’s native bulk API under the hood (such as SqlBulkCopy).

We do not recommend using EFCore.BulkExtensions anymore.
Since moving to a dual license in 2023, the library has lost most of its appeal. Commercial use is restricted to companies earning under $1M per year, and the license is subscription-based, requiring annual renewals.
Beyond licensing, the project is no longer actively maintained — most GitHub issues go unanswered, and there have been no meaningful updates for a long time. See for yourself — many open issues have been left without any reply.
You can read more details and benchmarks on Learn EF Core - EFCore.BulkExtensions.
If you’re considering a commercial library, the difference is even clearer:
| Feature | EFCore.BulkExtensions | Entity Framework Extensions (EFE) |
|---|---|---|
| License | Subscription (renew yearly) | Perpetual (renew optional) |
| Support | Minimal, mostly unanswered | Active, responsive, professional |
| Updates | Rare and irregular | Monthly, stable, and tested |
| Features | Basic bulk operations only | Full feature set with advanced mapping, identity handling, and support across all major database providers |
| Performance | Good in simple cases, slower in complex ones | Optimized for all scenarios |
In short:
- EFCore.BulkExtensions is now the outdated choice: a restrictive subscription license, minimal support, and rare updates.
- Entity Framework Extensions remains actively maintained, fully featured, and backed by professional support.
At the end of the day, bulk inserting in Entity Framework comes down to two things: performance and simplicity.
You can try to optimize SaveChanges(), use provider APIs, or test open-source forks — but each of those paths comes with trade-offs: more code, more maintenance, and more risk.
If your goal is to insert millions of rows quickly, without rewriting your logic or managing database-specific quirks, there’s really one tool built for that job — Entity Framework Extensions.
It’s not just faster — it’s smarter:
- Uses SqlBulkCopy (and other provider-native bulk APIs) under the hood

You keep your existing code — just add “Bulk”. That’s all it takes to go from waiting minutes to finishing in seconds.
So if you care about speed, reliability, and simplicity, choose Entity Framework Extensions — and stop wasting time waiting on inserts.