I recently started a new job where most of the database work is done through stored procedures. We currently have a “deployment” folder in source control that contains folders of database change scripts (in fact, we have three folders: one for table modification scripts, one for views and functions, and one for sprocs), as well as a “Next” folder for the changes we are currently working on (it gets renamed when we deploy). We have three databases: a local copy on each developer's workstation, available only to that individual developer, a development database, and the live production database.
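Roughly, the layout in source control looks something like this (the folder names here are only illustrative, not our actual ones):

    Deployment/
        Tables/               (table modification scripts)
        ViewsAndFunctions/    (views and functions)
        Sprocs/               (stored procedures)
        Next/                 (changes in progress, renamed when deployed)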
In practice, this means we write and collect .SQL files and then run them by hand, pretty much daily: new scripts keep getting checked into source control, so almost every time we pull an update we have to check what has changed in those folders and run the new scripts against our local copy of the database, and then do the same on the dev and prod servers when we deploy. On top of that, the databases are named differently on each server to avoid accidentally running a script against the wrong environment (this seems strange to me; I would expect the server or instance to differ, but the database itself to have the same name everywhere), so we cannot include USE statements in the scripts. This process seems very inefficient.
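To make that last point concrete, here is a rough sketch of what running a single change script looks like today. I am assuming SQL Server and sqlcmd here, and the script, table, database, and server names are all made up for illustration; the key point is that because the script cannot contain a USE statement, the target database has to be supplied on the command line for each environment:

    -- AddCustomerEmail.sql (hypothetical change script; note there is no USE statement)
    ALTER TABLE dbo.Customer ADD Email NVARCHAR(256) NULL;
    GO

and then, from the command line, once per environment:

    sqlcmd -S localhost  -d OurApp_Local -i AddCustomerEmail.sql
    sqlcmd -S DEVSERVER  -d OurApp_Dev   -i AddCustomerEmail.sql
    sqlcmd -S PRODSERVER -d OurApp_Prod  -i AddCustomerEmail.sql

Multiply that by every script that has changed since the last update, and that is essentially our process.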
Are there any recommended best practices for this kind of thing that I could suggest we adopt instead?