How can I reduce MongoDB size by averaging old data

I have a MongoDB database that stores one document per measurement. Each document is as follows:

{
  timestamp : 123,
  value     : 123,
  meta1     : something,
  meta2     : something
}

I get measurements from several sources every second, so the db grows large quickly. I want to store recent data at the frequency at which it was read, but I would like to periodically average out the older data in order to save space and keep the db a little faster.

1. What is the best approach in mongo?

2. Is there a better db out there for this, given that the schema differs between measurements, so a fixed format will not work very well? RRD is also not an option, as I need dynamic query capabilities.

+3
3 answers

1. What is the best approach in mongo?
Use capped collections for cases such as logging. Another approach is to create a “background process” that moves the old data out of the collection.
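The “background process” idea can be sketched without a live database. A minimal in-memory illustration (the function and field names are my own, not from the answer): split documents at a cutoff timestamp, keep the recent ones in the main collection, and move the rest to an archive.

```javascript
// Split documents into recent ones (kept in the hot collection)
// and old ones (to be moved into an archive collection by the job).
function partitionByAge(docs, cutoff) {
  const keep = [];    // stays in the main collection
  const archive = []; // moved out by the background job
  for (const d of docs) {
    (d.timestamp >= cutoff ? keep : archive).push(d);
  }
  return { keep, archive };
}

// Example: with a cutoff of 250, two documents would be archived.
const docs = [
  { timestamp: 100, value: 1 },
  { timestamp: 200, value: 2 },
  { timestamp: 300, value: 3 },
];
const { keep, archive } = partitionByAge(docs, 250);
```

In a real job the `archive` batch would be inserted into a second collection (or a separate log db, as the update below suggests) and then removed from the main one.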

2. Is there a better db out there for this, given that the schema differs between measurements, so a fixed format will not work very well? RRD is also not an option, as I need dynamic query capabilities.
MongoDB fits well here.

Update: Another option is to store each data item twice: first in a capped collection (and use this collection for queries), and then create another collection (or even another log db) just to log your events.

+4

Downsample the old data in stages. For example, keep raw per-second readings for the last 3 days, then keep only 1-minute averages for 1 month, then only 15-minute averages, storing aggregates such as avg/min/max per interval, and delete the finer-grained readings as you aggregate them.
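The averaging step can be sketched as follows (my own code, not from the answer): group readings into fixed-width time buckets and emit one averaged document per bucket, keeping the sample count so the buckets can be re-averaged later without losing weighting.

```javascript
// Downsample per-second readings into fixed-width buckets by
// averaging `value`. bucketWidth is in the same units as `timestamp`.
function downsample(docs, bucketWidth) {
  const buckets = new Map();
  for (const d of docs) {
    const key = Math.floor(d.timestamp / bucketWidth) * bucketWidth;
    if (!buckets.has(key)) buckets.set(key, { sum: 0, count: 0 });
    const b = buckets.get(key);
    b.sum += d.value;
    b.count += 1;
  }
  return [...buckets.entries()].map(([timestamp, b]) => ({
    timestamp,
    value: b.sum / b.count,
    samples: b.count, // keep the count so later re-averaging stays weighted
  }));
}

// Three 1-second readings collapse into two 1-minute documents.
const perMinute = downsample([
  { timestamp: 0, value: 1 },
  { timestamp: 1, value: 3 },
  { timestamp: 60, value: 5 },
], 60);
// perMinute: [{timestamp: 0, value: 2, samples: 2}, {timestamp: 60, value: 5, samples: 1}]
```

The same grouping could be pushed into the database with MongoDB's aggregation framework, but an in-memory version keeps the logic visible.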

+3
  • Run a cron job that periodically removes documents older than some cutoff (your_time = now - some_time):

    db.docs.remove({ timestamp : {'$lte' : your_time}})

  • MongoDB is schemaless, so the fact that different measurements have different schemas is not a problem; mongodb handles this well.

+2

Source: https://habr.com/ru/post/1794669/

