I would like to use db4o as the basis for a custom cache. My program typically loads about 40 million objects into memory and works on them simultaneously. This obviously requires a lot of memory, so I thought I could offload some objects (those not currently in the cache) to a db4o database. My preliminary tests, however, show that db4o is slower than I would like: persisting about 1,000,000 objects took 17 minutes. That said, I was using the most basic configuration.
I was doing something like this:
using (var reader = new FileUnitReader(Settings, Dictionary, m_fileNameResolver, ObjectFactory.Resolve<DataValueConverter>(), ObjectFactory.Resolve<UnitFactory>()))
using (var db = Db4oEmbedded.OpenFile(Db4oEmbedded.NewConfiguration(), path))
{
    var timer = new Stopwatch();
    timer.Start();

    IUnit unit = reader.GetNextUnit();
    while (unit != null)
    {
        db.Store(unit);
        unit = reader.GetNextUnit();
    }

    timer.Stop();
    db.Close();
    var elapsed = timer.Elapsed;
}
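One direction I have been considering but have not yet benchmarked is tuning the embedded configuration and committing in explicit batches instead of relying on one implicit transaction. This is only a sketch: the configuration property names below are my reading of the db4o 8.x `IEmbeddedConfiguration` API and the batch size of 50,000 is an arbitrary guess, so both would need to be verified against the actual version and workload.

```csharp
// Hypothetical tuning sketch -- property names assumed from the db4o 8.x
// embedded configuration API; verify against your db4o version.
var config = Db4oEmbedded.NewConfiguration();
config.File.BlockSize = 8;                  // larger block size for a large file
config.Common.WeakReferences = false;       // skip weak-reference bookkeeping
config.Common.DetectSchemaChanges = false;  // skip schema checks on open

using (var db = Db4oEmbedded.OpenFile(config, path))
{
    var count = 0;
    IUnit unit = reader.GetNextUnit();
    while (unit != null)
    {
        db.Store(unit);
        // Commit periodically so the pending transaction stays small.
        if (++count % 50000 == 0)
            db.Commit();
        unit = reader.GetNextUnit();
    }
    db.Commit();
}
```

I do not know whether the commit interval or the configuration flags dominate here, which is part of what I am hoping someone can clarify.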
Can anyone offer tips on how to improve performance in this scenario?