I am inserting tens of thousands of objects into my Core Data entity. I have a single `NSManagedObjectContext` and I am calling `save()` on the managed object context every time I add an object. It works, but while it is running, the memory keeps increasing from about 27 MB to 400 MB. And it stays at 400 MB even after the import is finished.
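For concreteness, the pattern I'm describing looks roughly like this (the `Item` entity and its `name` attribute are hypothetical stand-ins for my real model, and the model and in-memory store are built in code here just so the snippet runs on its own):

```swift
import CoreData

// Build a minimal model in code: one entity "Item" with a String "name".
let attr = NSAttributeDescription()
attr.name = "name"
attr.attributeType = .stringAttributeType
let entity = NSEntityDescription()
entity.name = "Item"
entity.properties = [attr]
let model = NSManagedObjectModel()
model.entities = [entity]

let coordinator = NSPersistentStoreCoordinator(managedObjectModel: model)
try! coordinator.addPersistentStore(ofType: NSInMemoryStoreType,
                                    configurationName: nil, at: nil, options: nil)

// A single main-queue context used for the whole import.
let context = NSManagedObjectContext(concurrencyType: .mainQueueConcurrencyType)
context.persistentStoreCoordinator = coordinator

// Insert many objects, saving after every single one. The saved objects
// stay registered with the context, so memory keeps climbing.
for i in 0..<2_000 {
    let item = NSEntityDescription.insertNewObject(forEntityName: "Item",
                                                   into: context)
    item.setValue("item\(i)", forKey: "name")
    try! context.save()
}
```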
There are a number of SO questions about batch inserting, and everyone says to read Efficiently Importing Data, but that guide is in Objective-C and I am having trouble finding real Swift examples that solve this problem.
There are a few things you should change:

- Use a `NSPrivateQueueConcurrencyType` managed object context and do your inserts asynchronously in it.
- Use `autoreleasepool` and `reset` to empty the objects in memory after each batch insert and save.

Here is how this might work:
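A sketch of such a batched import, assuming a hypothetical `Item` entity with a `name` attribute and a persistent store coordinator from your existing stack (substitute your own model's names):

```swift
import CoreData

/// Batch-inserts records on a private-queue context, draining memory
/// after every batch instead of saving once per object.
func importItems(_ names: [String],
                 coordinator: NSPersistentStoreCoordinator,
                 batchSize: Int = 1000,
                 completion: (() -> Void)? = nil) {
    // 1. A private-queue context keeps the heavy work off the main thread.
    let context = NSManagedObjectContext(concurrencyType: .privateQueueConcurrencyType)
    context.persistentStoreCoordinator = coordinator

    context.perform {
        var index = 0
        while index < names.count {
            // 2. autoreleasepool releases the temporary objects each batch
            //    creates instead of letting them pile up until the end.
            autoreleasepool {
                let end = min(index + batchSize, names.count)
                for name in names[index..<end] {
                    let item = NSEntityDescription.insertNewObject(
                        forEntityName: "Item", into: context)
                    item.setValue(name, forKey: "name")
                }
                index = end

                do {
                    try context.save()   // one save per batch, not per object
                } catch {
                    print("Batch save failed: \(error)")
                }
                // 3. reset() frees the managed objects the context is still
                //    holding on to after the save.
                context.reset()
            }
        }
        completion?()
    }
}
```

The key design point is that the save happens once per batch and is immediately followed by `reset()`, so the context never accumulates more than `batchSize` registered objects at a time.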
Applying these principles kept my memory usage low and also made the mass insert faster.
Further reading
Update
The above answer has been completely rewritten. Thanks to @Mundi and @MartinR in the comments for pointing out a mistake in my original answer, and thanks to @JodyHagins in this answer for helping me understand and solve the problem.