I once had the same problem. My first approach was a set of JS files that I ran manually on each environment (the mongo shell can evaluate scripts). These files contained data, so I had to watch them carefully to avoid inserting duplicates or corrupting existing data.
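A minimal sketch of the "watch for duplicates" guard such a seed script needs. The collection name and key are hypothetical, and a tiny in-memory stand-in replaces the real collection so the logic runs anywhere; in the actual mongo shell you would call `db.settings.findOne` / `db.settings.insertOne` directly.

```javascript
// In-memory stand-in for a MongoDB collection (hypothetical name "settings");
// in the mongo shell you would use `db.settings` instead.
const docs = [];
const settings = {
  findOne: (q) => docs.find((d) => d.key === q.key) || null,
  insertOne: (d) => { docs.push(d); },
};

// Idempotent seed: only insert if the key is not already present,
// so re-running the script never creates duplicates.
function seed(doc) {
  if (settings.findOne({ key: doc.key }) === null) {
    settings.insertOne(doc);
  }
}

seed({ key: "appVersion", value: "1.0" });
seed({ key: "appVersion", value: "1.0" }); // second run is a no-op
console.log(docs.length); // 1
```

The guard makes the script safe to re-run, but every script still has to carry its own check, which is exactly the bookkeeping that migration tools take over.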
Then I discovered that migration tools are very useful for such tasks. This one is my favorite, but there are many other solutions.
Think of your changes as a stream of immutable events. For example, to delete a previously created index, you add a new change (migration) that drops it. You keep this stream as a set of files directly in your VCS, so setting up a new environment is trivial for newcomers to the project (run `mm migrate` and the database is up to date). This approach is also very handy with containers or virtual machines: you can usually hook a script into the container/VM lifecycle and populate the database automatically, making reboots and rebuilds painless.
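The immutable-event idea above can be sketched in a few lines. This is not any particular tool's API; the migration IDs, the `up` hooks, and the toy in-memory "database" are all illustrative. The key points are that changes are ordered and never edited, and a changelog records what has already been applied, so running the full set on a fresh environment is safe and repeatable.

```javascript
// Each change is an immutable entry; to undo change 001 you append 002,
// you never rewrite 001. (IDs and the index API here are made up.)
const migrations = [
  { id: "001-create-email-index", up: (db) => db.indexes.add("email") },
  { id: "002-drop-email-index",   up: (db) => db.indexes.remove("email") },
];

// Apply, in order, every migration not yet recorded in the changelog.
function migrate(db, applied) {
  for (const m of migrations) {
    if (!applied.has(m.id)) {
      m.up(db);
      applied.add(m.id);
    }
  }
}

// Toy in-memory stand-in for the database:
const db = {
  indexes: {
    set: new Set(),
    add(name) { this.set.add(name); },
    remove(name) { this.set.delete(name); },
  },
};
const applied = new Set(); // the changelog

migrate(db, applied); // fresh environment: creates the index, then drops it
migrate(db, applied); // second run finds nothing pending and does nothing
console.log(applied.size, db.indexes.set.size); // 2 0
```

Because the changelog makes `migrate` a no-op once everything is applied, it is safe to run it on every container or VM start.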