Gigantic production database migration without downtime

We host a Rails application on AWS that uses MySQL on Amazon RDS. One table in the database is huge, and a migration we need to run against it takes several days because of the millions of rows. Specifically, we are changing a VARCHAR column to TEXT.
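For concreteness, this is the kind of statement a Rails `change_column` migration produces for such a change. A minimal sketch, noting that the `events`/`payload` names are invented here since the post does not name the table:

```ruby
# Hypothetical names: the real table and column are not given in the post.
table, column = "events", "payload"

# Roughly what Rails' `change_column :events, :payload, :text` compiles to.
# On older MySQL this ALTER copies the entire table into a new one row by
# row, which is why it can take days on a table with millions of rows.
alter_sql = "ALTER TABLE #{table} MODIFY #{column} TEXT"
puts alter_sql
```

The table-copy behaviour of ALTER is the root of the problem: the slowness is inherent to the statement, not to Rails.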

How can we migrate this large production database while users are still active?

One idea I have heard is to set up a copy of the database, run the migration there, and switch it over to become the main database when it is done. However, I am not sure how this approach accounts for user data entered during the migration.

Update: this may be relevant: Amazon RDS offers read replicas and Multi-AZ deployments, which seem to be intended for exactly this kind of thing. However, since this is our first time doing this, we would welcome guidance on any approach, whether one of those or something else.

1 answer

I do not know whether this is possible with your hosting options, but I would approach the problem as follows:

  • Copy the data to a new server (and record the old server's binary log position)
  • Enable replication from the old server to the new one (and wait for it to catch up)
  • Point the Rails application at the new server
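The steps above could be sketched roughly as the command sequence below, assuming classic self-managed MySQL replication (on RDS the CHANGE MASTER step is wrapped in stored procedures instead). Host names, database name, and the binlog coordinates are placeholders, not values from the question:

```ruby
# A rough runbook for the three steps, as the commands run at each stage.
runbook = [
  # 1. Dump the old server; --master-data=2 records the binlog position as
  #    a comment at the top of the dump, and --single-transaction keeps the
  #    dump consistent for InnoDB without locking the tables.
  "mysqldump -h old-db --master-data=2 --single-transaction app_production > dump.sql",
  # 2. Load the dump on the new server, then start replicating from the old
  #    one using the recorded coordinates (run in the mysql client; the
  #    file/position values here are placeholders):
  "CHANGE MASTER TO MASTER_HOST='old-db', MASTER_LOG_FILE='mysql-bin.000123', MASTER_LOG_POS=456;",
  "START SLAVE;",
  # 3. Watch Seconds_Behind_Master until it reaches 0, then repoint
  #    config/database.yml at the new server and restart the app.
  "SHOW SLAVE STATUS\\G",
]
runbook.each { |cmd| puts cmd }
```

Because replication keeps streaming writes from the old server until the cutover, user data entered during the long migration is not lost; only the final repoint needs a brief pause.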

Read-only mode is another potential option if “write downtime” is more acceptable than “full downtime” (this depends on the application).
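One simple way to implement such a read-only mode in a Rack/Rails stack is a small middleware that rejects writes while the cutover is in progress. A minimal sketch; the class name and 503 response are invented, and a real app might drive the flag from a feature-flag store instead:

```ruby
# Rejects non-read HTTP requests while read-only maintenance is enabled.
class ReadOnlyGuard
  READ_METHODS = %w[GET HEAD OPTIONS].freeze

  def initialize(app, enabled: true)
    @app = app
    @enabled = enabled
  end

  def call(env)
    if @enabled && !READ_METHODS.include?(env["REQUEST_METHOD"])
      # Writes are refused during the migration/cutover window.
      [503, { "Content-Type" => "text/plain" }, ["Read-only maintenance mode"]]
    else
      @app.call(env)
    end
  end
end
```

Reads keep working throughout, so users can browse but not submit changes until the new server takes over.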


Source: https://habr.com/ru/post/1498514/
