I use KiokuDB to save a couple of Moose objects and a couple of simple Perl structures (hashes and arrays).
I don't need any fancy searches, transactions, etc., just the ability to get (lookup) an object. In addition, once I have finished creating the database, it can be treated as read-only: no changes will ever be made.
The main (only?) reason I use KiokuDB is to save the object graph.
The largest object, which dominates the total size of the database, is a Moose object containing a relatively large array (let me call this object large_obj). I used to save large_obj on its own using Storable + PerlIO::gzip, or even JSON + PerlIO::gzip. That worked great, and I was very pleased with the results (gzip compressed the storage file to about 5% of its original size).
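For context, the old approach looked roughly like this (a minimal sketch; the plain hashref stands in for the real Moose object, and the filename is made up):

    use strict;
    use warnings;
    use Storable qw(nstore_fd fd_retrieve);
    use PerlIO::gzip;

    # Stand-in for large_obj; the real one is a Moose object
    # wrapping a large array.
    my $large_obj = { data => [ 1 .. 1_000_000 ] };

    # Write the Storable image through the :gzip layer.
    open my $out, '>:gzip', 'large_obj.sto.gz' or die "open: $!";
    nstore_fd( $large_obj, $out );
    close $out or die "close: $!";

    # Read it back; the layer decompresses transparently.
    open my $in, '<:gzip', 'large_obj.sto.gz' or die "open: $!";
    my $restored = fd_retrieve($in);
    close $in;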
There is another smaller Moose object, which is basically an array of 20-30k small Moose objects.
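To make the shape concrete, it looks roughly like this (class and attribute names are invented for illustration):

    package SmallItem;
    use Moose;
    has value => ( is => 'ro', isa => 'Str' );

    package ItemList;
    use Moose;
    has items => (
        is      => 'ro',
        isa     => 'ArrayRef[SmallItem]',
        default => sub { [] },
    );

    package main;
    # Roughly the shape of the smaller object: 20-30k tiny Moose objects.
    my $list = ItemList->new(
        items => [ map { SmallItem->new( value => "item $_" ) } 1 .. 25_000 ],
    );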
Now, after switching to KiokuDB, I first used the simple Hash backend and then dumped it to a file (using KiokuDB::Cmd) with PerlIO::gzip. This worked very well while large_obj was relatively small, but as soon as it got bigger, I started running into out-of-memory errors. I believe the Hash backend is not suitable for large objects.
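That setup was essentially the following (a sketch; the ID and data are placeholders), with the dump to disk done separately:

    use KiokuDB;

    # In-memory Hash backend (KiokuDB::Backend::Hash).
    my $dir = KiokuDB->connect('hash');

    my $scope = $dir->new_scope;   # KiokuDB needs a live-object scope

    # Store under an explicit ID so it is easy to look up later.
    $dir->store( my_graph => { data => [ 1 .. 1000 ] } );

    my $again = $dir->lookup('my_graph');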
Then I tried the recommended Berkeley DB backend, although it seems like overkill (as already mentioned, I really don't need all the features of a fancy DB). It is much slower than the original Storable + PerlIO::gzip solution, it takes up much more space, and it also runs out of memory on large objects! (I'm on an Ubuntu machine with 3 GB of RAM.)
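For reference, the BDB attempt looked roughly like this (a sketch assuming the stock DSN syntax from KiokuDB::Backend::BDB; the directory is a placeholder):

    use KiokuDB;

    # Berkeley DB backend.
    my $dir = KiokuDB->connect(
        'bdb:dir=/tmp/kioku-bdb',   # directory is a placeholder
        create => 1,
    );

    my $scope = $dir->new_scope;
    $dir->store( large_obj => { data => [ 1 .. 100_000 ] } );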
I also tried the Files backend, but it fails:
Too many open files at /usr/local/perls/perl-5.12.2/lib/site_perl/5.12.2/Directory/Transactional.pm line 130.
(in cleanup) Too many open files at /usr/local/perls/perl-5.12.2/lib/site_perl/5.12.2/Directory/Transactional.pm line 130.
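The Files backend stores one file per object, managed by Directory::Transactional, so with tens of thousands of small objects it presumably keeps enough handles open to hit the per-process file-descriptor limit (see ulimit -n). The connection itself was just (directory is a placeholder):

    use KiokuDB;

    # Files backend (KiokuDB::Backend::Files): one file per stored
    # object, handled transactionally by Directory::Transactional.
    my $dir = KiokuDB->connect(
        'files:dir=/tmp/kioku-files',   # directory is a placeholder
        create => 1,
    );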
So, what would you recommend? Is there a backend (or a different approach) that fits this use case?