Working with large collections in db4o (.NET)

I would like to use db4o as the basis for implementing a custom cache. Typically, my program loads about 40 million objects into memory and works on them simultaneously. Obviously this requires a lot of memory, so I thought about saving some of the objects (those not currently needed in the cache) to a db4o database. My preliminary tests show that db4o is a bit slower than I would like (persisting about 1,000,000 objects took 17 minutes). However, I only used the most basic configuration.

I was doing something like this:

using (var reader = new FileUnitReader(Settings, Dictionary, m_fileNameResolver, ObjectFactory.Resolve<DataValueConverter>(), ObjectFactory.Resolve<UnitFactory>()))
using (var db = Db4oEmbedded.OpenFile(Db4oEmbedded.NewConfiguration(), path))
{
    var timer = new Stopwatch();
    timer.Start();
    IUnit unit = reader.GetNextUnit();
    while (unit != null)
    {
        db.Store(unit);
        unit = reader.GetNextUnit();
    }
    timer.Stop();
    db.Close();

    var elapsed = timer.Elapsed;
}

Can anyone offer tips on how to improve performance in this scenario?

There are a couple of things you can tune here.

First, by default db4o analyzes and instantiates your objects with regular .NET reflection, which is slow when you store this many objects. Switching to the FastNetReflector speeds that part up noticeably. Configure it like this:

var config = Db4oEmbedded.NewConfiguration();
config.Common.ReflectWith(new FastNetReflector());

using(var container = Db4oEmbedded.OpenFile(config, fileName))
{
}

"", , :

var config = Db4oEmbedded.NewConfiguration();
config.File.Storage = new CachingStorage(new FileStorage(), 128, 1024 * 4);
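
(If I read the CachingStorage constructor right, the two numeric arguments are the number of cached pages and the page size in bytes, so this configures roughly 128 × 4 KB ≈ 512 KB of cache.)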

One more note: storing all 1,000,000 objects in a single db4o transaction is expensive. Commit in batches instead, for example every 100,000 objects (see the sketch below); just don't commit so often that the commit overhead itself dominates.
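
A minimal sketch of such batched commits, assuming config is the tuned configuration from above and that reader, path and IUnit come from the question's code; the 100,000 batch size is just the figure mentioned here:

using (var db = Db4oEmbedded.OpenFile(config, path))
{
    var stored = 0;
    for (IUnit unit = reader.GetNextUnit(); unit != null; unit = reader.GetNextUnit())
    {
        db.Store(unit);
        if (++stored % 100000 == 0)
        {
            db.Commit();  // flush the current batch instead of one huge commit at the end
        }
    }
    db.Commit();  // commit whatever remains in the last partial batch
}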

A couple of additions to the answer above:

First, get the extended container interface by calling .Ext() on the container returned by OpenFile().

Then call Purge() on each object right after storing it:

using (var db = Db4oEmbedded.OpenFile(Db4oEmbedded.NewConfiguration(), path).Ext())
{
    // ....
    db.Store(unit);
    db.Purge(unit);
    // ....
}

Purge tells db4o to drop its internal reference to the object, so the container no longer holds on to every unit you have stored.

Second, make sure you are on a recent db4o build (8.0 at the time of writing), which exposes the pluggable storage configuration.

On 8.0, as Gamlor already suggested, plug in a caching storage:

config.File.Storage = new CachingStorage(new FileStorage(), 128, 1024 * 4);

but give it a much larger cache, for example:

config.File.Storage = new CachingStorage(new FileStorage(), 1280, 1024 * 40);
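
With the page-count/page-size reading from above, that is roughly 1280 × 40 KB ≈ 50 MB of cache, which is small next to a working set of 40 million objects.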

Source: https://habr.com/ru/post/1754095/

