I'm working on an Android project where the ORM greendao is used. It allows me to insert a number of entities (plain objects) into the database at once, in a single transaction. In practice there's a method
insertOrReplaceInTx(...)
which takes a collection as a parameter, e.g. a List of objects. The problem is that my list holds roughly 12,000 objects, and inserting it leads from time to time to an OutOfMemory exception. I'm looking for a smart way to solve the problem and prevent future OOMs. The only idea that comes to my mind is to split the huge collection into subcollections (say, 500 elements each) and commit in a loop, making several small commits instead of one huge one. The bad thing about it is that it takes longer to insert the records, and time matters here. In the end I'm not sure whether commit after commit doesn't exhaust the heap anyway. Maybe call sleep() in between to let the GC clear the heap... but that looks like an ugly solution to me.
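For reference, the chunked variant I have in mind would look roughly like this (just a sketch; MyEntity and its dao stand in for my actual generated classes):

// all is a List<MyEntity> with ~12000 entries
final int chunkSize = 500;
for (int start = 0; start < all.size(); start += chunkSize) {
    int end = Math.min(start + chunkSize, all.size());
    // subList is a view, so no extra copies of the entities are created
    dao.insertOrReplaceInTx(all.subList(start, end));
}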
Any smart ideas on your side?
Thanks in advance.
I am working on a project using greendao 1.3.1. Some of the tables contain about 200,000 entities (which don't have many properties).
I read the entities from CSV, and to speed things up I developed a small solution which might also help with your OOM issue.
For explanation:
greendao uses a cache, and after each insert it updates the entity to set its row ID and probably puts the entity into that cache. On top of that, greendao starts a transaction whenever you call an insert or update method and there isn't already a transaction running. This slows down "bulk" inserts and drives up memory usage.
What I did:
Performance (time)
To speed things up I started a transaction before doing any inserts. That way greendao does not start a transaction on every insert, and all inserts and updates run in the same transaction, which has additional benefits for data consistency.
You can use code like this:
SQLiteDatabase db = dao.getDatabase();
db.beginTransaction();
try {
    // do all your inserts and so on here
    db.setTransactionSuccessful();
} finally {
    // without setTransactionSuccessful() the transaction rolls back here
    db.endTransaction();
}
But this won't help you with your OOM problem yet.
Memory usage
Solution 1
If you don't want to mess with the greendao code, you can issue a DaoSession.clear() every once in a while.
This is definitely the simpler solution, but it will be less performant than solution 2.
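A sketch of solution 1, combined with the single transaction from above (entities, dao and daoSession are placeholders for your own generated classes and data):

SQLiteDatabase db = dao.getDatabase();
db.beginTransaction();
try {
    int i = 0;
    for (MyEntity entity : entities) {
        dao.insertOrReplace(entity);
        // clear the session cache periodically so it cannot grow unbounded
        if (++i % 500 == 0) {
            daoSession.clear();
        }
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}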
Solution 2
To prevent greendao from updating the entity and inserting it into its cache, you can replace the method private long executeInsert(T entity, SQLiteStatement stmt) in AbstractDao.java with this code:
/**
 * Insert an entity into the table associated with a concrete DAO.
 *
 * @param update if true, set the entity's key after the insert and attach
 *               the entity to the session cache (the original behaviour)
 * @return row ID of newly inserted entity
 */
public long insertOrReplace(T entity, boolean update) {
    return executeInsert(entity, statements.getInsertOrReplaceStatement(), update);
}

private long executeInsert(T entity, SQLiteStatement stmt) {
    return executeInsert(entity, stmt, true);
}

private long executeInsert(T entity, SQLiteStatement stmt, boolean update) {
    long rowId;
    if (db.isDbLockedByCurrentThread()) {
        synchronized (stmt) {
            bindValues(stmt, entity);
            rowId = stmt.executeInsert();
        }
    } else {
        // Do TX to acquire a connection before locking the stmt to avoid deadlocks
        db.beginTransaction();
        try {
            synchronized (stmt) {
                bindValues(stmt, entity);
                rowId = stmt.executeInsert();
            }
            db.setTransactionSuccessful();
        } finally {
            db.endTransaction();
        }
    }
    if (update) {
        updateKeyAfterInsertAndAttach(entity, rowId, true);
    }
    return rowId;
}
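With that modification in place, a bulk insert can pass update = false to skip the key update and the caching entirely. Roughly like this (entity names are again placeholders):

SQLiteDatabase db = dao.getDatabase();
db.beginTransaction();
try {
    for (MyEntity entity : entities) {
        // false: don't set the key afterwards and don't attach to the session
        dao.insertOrReplace(entity, false);
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}

Keep in mind that entities inserted this way won't have their row IDs set and won't be tracked by the session afterwards.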