Synchronizing MongoDB development and production databases

We have a dev server that holds a collection of objects. Accumulating these objects is an ongoing process, and all the tagging, verification, etc. happens on this local dev server. Once the objects are ready for production, they are added to the production database, which from then on uses them in its calculations.

I am looking for a way to simply add the delta (the new objects) to the production database, leaving all other collections, and the old objects in the same collection, as they are. Until now we used MySQL, where this process simply meant running a structure-and-data synchronization (we used Navicat for this). Now we are moving to MongoDB, so this process is a bit more complicated.

I have looked into this, and I believe the following solutions do not fit my needs:

  • Dumping the dev DB with mongodump and loading it into the production database with mongorestore
  • Running db.copyDatabase, which actually replaces the production DB with a copy of the dev DB

Both solutions are problematic because they effectively replace the production database, when all I want to do is update objects in an existing collection. In addition, the dev => production direction does not match a master-slave topology.

The best I could come up with was:

  • Copy the dev database to a "dev" database on the production instance.
  • Copy the collection from this dev DB into the actual production database, with a predicate so that only objects that do not already exist in the production database are added (see the sketch below).
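
For illustration, a minimal mongo-shell sketch of that predicate-based copy, assuming the staged copy from the first step already lives on the production instance. The database and collection names (nutrino_copy, nutrino, objects for the second; the rest are placeholders) are assumptions, not details from the question:

    // Run against the staged copy on the production instance, e.g.:
    //   mongo prodHost:27017/nutrino_copy copyDelta.js
    // "nutrino" and "objects" are assumed names.
    var prod = db.getSiblingDB("nutrino");
    // Insert only the documents whose _id is not already in production.
    db.objects.find().forEach(function (doc) {
        if (prod.objects.count({ _id: doc._id }) === 0) {
            prod.objects.insert(doc);
        }
    });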

I was wondering if anyone has a better solution?

If not, does anyone have a script that could execute this?

3 answers

You can use the mongoexport tool to export a single collection from your development database. Use it in conjunction with the --query option, where you can express a predicate, for example { ts: { $gt: <previous clone time> } }.

Then use mongoimport to import the delta file into the production database. Use --upsert and --upsertFields if the two databases can hold the same logical document under different _id values.
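
As a concrete illustration of that workflow, a hedged shell sketch. The host, database, and collection names (devHost, prodHost, mydb, objects) and the ts timestamp field are assumptions, not details from the question, and --upsert/--upsertFields are the flags of the older mongoimport (newer releases spell this --mode upsert):

    # Export only documents newer than the previous sync (extended JSON date):
    mongoexport --host devHost -d mydb -c objects \
        --query '{ "ts": { "$gt": { "$date": "2013-01-01T00:00:00Z" } } }' \
        -o delta.json

    # Import the delta into production, updating documents that already exist:
    mongoimport --host prodHost -d mydb -c objects \
        --upsert --upsertFields _id \
        --file delta.json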


@Orid, thank you for your answer, and sorry for the late reply.

After a lot of research and trial and error, I decided to use the solution outlined in the question (copy the test database to the production machine, then copy the collections one by one). This is because the data I use here is static and has no real reason to carry a timestamp. In addition, I decided to drop the "update only" requirement, so for now I am using mongorestore with --drop.

I am doing all this using this script:

  • Shell script file:

    # Remove the previous dump.
    rm -rf dump/
    # Copy the test database to this machine (see the JS file below).
    mongo copyTestDb.js
    # Dump each collection from the copy and restore it into production.
    for COLLECTION in <Collections>
    do
        mongodump -d nutrino_copy -c ${COLLECTION} -o dump
        mongorestore -d nutrino -c ${COLLECTION} --drop dump/nutrino_copy/${COLLECTION}.bson
    done
  • Js script file:

    db.copyDatabase("<dbName>","<dbName_Copy>","<testMachineUrl>")

Do you think I should use mongoimport instead of mongorestore?


Check out mongo-sync


This is a script I wrote for myself when I had to constantly copy my local MongoDB database to and from my production database for a project (I know, this is stupid).

After you put your database details in config.yml , you can start syncing with two simple commands:

    ./mongo-sync push    # Push DB to Remote
    ./mongo-sync pull    # Pull DB to Local

If you use it inside a project, it is recommended to add config.yml to your .gitignore


[mongo-sync demo GIF]


Source: https://habr.com/ru/post/1494773/

