Automated database deployments with user-generated content (a la CMSes)

My team has been doing more and more CMS work over the past couple of years. We have also been moving more and more toward continuous integration. Marrying the two has proven difficult. To make matters worse, we build both LAMP and .NET sites, so our scripts need to work for both.

We have four environments for each site: local, integration, staging, and production. Content changes and file uploads happen regularly on the production site, while development obviously starts locally and moves up the chain.

What are some methods or tools I can implement on my build server to automatically push data and schema updates from the development environment to production without overwriting user-generated content? And conversely, how (and when) can I automatically pull user-generated data down to the development environments?

+6
2 answers

You will have 3 kinds of things in your database that you need to worry about.

1) Schema, which can be defined in DDL.
2) Static or lookup data, which can be defined in DML.
3) Dynamic (or user) data, which is also defined in DML.

Typically, changes to (1) and (2) should simply be promoted to production along with the code they are interdependent with. (3) should never be promoted up, but can be copied down to the development environments so that they are in sync with production as of a point in time.

Of course, it is much more complicated than that. To promote (1) you may need to convert the existing schema/DDL into specific ALTER statements, and that may also require data migration if data types or locations change. Promoting (2) requires synchronization with the code build, which can be tricky in complex environments.
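As a sketch of the distinction between the three kinds of changes, the following SQLite example (table and column names are invented, not from the original post) promotes a schema change as an ALTER statement and lookup data as idempotent DML, while leaving user rows untouched:

```python
import sqlite3

# Hypothetical production database that already holds user-generated rows.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE pages (id INTEGER PRIMARY KEY, title TEXT)")
db.execute("CREATE TABLE page_types (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("INSERT INTO pages (title) VALUES ('User-written page')")  # (3) user data

# (1) Schema change promoted as ALTER rather than a fresh CREATE,
# so existing user rows survive the deployment.
db.execute("ALTER TABLE pages ADD COLUMN page_type_id INTEGER")

# (2) Lookup data promoted as idempotent DML, keyed on id so re-runs are safe.
db.executemany(
    "INSERT OR REPLACE INTO page_types (id, name) VALUES (?, ?)",
    [(1, "article"), (2, "landing")],
)
db.commit()

# User data is untouched; schema and lookup rows are now current.
print(db.execute("SELECT title FROM pages").fetchone()[0])          # User-written page
print(db.execute("SELECT COUNT(*) FROM page_types").fetchone()[0])  # 2
```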

There are many tools for this, and if you need automation you will probably want advice from someone familiar with them.

I use a very simple scheme in which every schema change is reflected in the SQL build script and also appended to a SQL change script that contains all the SQL needed to perform any necessary transformations. This works well for me, but my setup is very simple (one person, one server) and therefore atypical.
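A minimal sketch of such a change-script scheme, assuming SQLite and an invented `schema_version` bookkeeping table (the answer does not specify one):

```python
import sqlite3

# Hypothetical change script: each entry is (version, sql). New schema changes
# are appended here at the time they are made, as the answer recommends.
CHANGE_SCRIPTS = [
    (1, "CREATE TABLE pages (id INTEGER PRIMARY KEY, title TEXT)"),
    (2, "ALTER TABLE pages ADD COLUMN slug TEXT"),
]

def migrate(db):
    # Track which change scripts have already been applied.
    db.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    current = db.execute("SELECT MAX(version) FROM schema_version").fetchone()[0] or 0
    for version, sql in CHANGE_SCRIPTS:
        if version > current:
            db.execute(sql)
            db.execute("INSERT INTO schema_version (version) VALUES (?)", (version,))
    db.commit()

db = sqlite3.connect(":memory:")
migrate(db)
migrate(db)  # re-running is a no-op, so a build server can call it on every deploy
print(db.execute("SELECT MAX(version) FROM schema_version").fetchone()[0])  # 2
```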

The key to success, however, is identifying the work a change requires at the time the change is made. The naive approach of simply altering the development database and then working out the fix during deployment is a recipe for disaster.

+1

For the first question, a priority system works best: your content is the "default" and user-generated content "overrides" it, with the two kept in separate places. That way you can deploy your material without any merge process. The data can live in a different directory, or be flagged differently in the database, whatever. The downside is that you cannot simply use the web server's basic file serving to deliver files, unless you perform the mapping when generating the links.

i.e. for /site/images/icon.png you can either intercept the request and serve /site/images/default/icon.png or /site/images/client/icon.png, or, when you render pages, do the check there and emit the correct URL so the server can serve the file directly.
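A minimal sketch of the directory-based variant, with invented `default/` and `client/` directory names standing in for the two locations:

```python
import tempfile
from pathlib import Path

# Hypothetical resolver for the default/override layout: deployable assets
# live under default/, client uploads under client/. Uploads win when present.
def resolve_asset(root: Path, relpath: str) -> Path:
    override = root / "client" / relpath
    return override if override.exists() else root / "default" / relpath

# Usage sketch: deployments may overwrite default/ freely; client/ is never touched.
root = Path(tempfile.mkdtemp())
(root / "default").mkdir()
(root / "client").mkdir()
(root / "default" / "icon.png").write_bytes(b"shipped icon")
print(resolve_asset(root, "icon.png").parent.name)  # default
(root / "client" / "icon.png").write_bytes(b"client upload")
print(resolve_asset(root, "icon.png").parent.name)  # client
```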

As for the second question, using customer data is more of a legal/IP problem: you need to talk to the client about whether they are a) willing and b) even able to share it with you. After that, the technology is simple.

0

Source: https://habr.com/ru/post/896743/

