I have a 200 GB / 400M-row MySQL/InnoDB database - far beyond reason, as I have found out.
One of the nastier problems is restoring backups. mysqldump generates huge SQL files, and importing them back into a fresh database takes about a week (I have tried to make it faster: larger/smaller transactions, disabling keys during import, network compression, etc. So far, importing as MyISAM seems about 2x faster, but then I have no transactions).
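For reference, the kind of import-speedup wrapper I mean looks roughly like this - a sketch only, assuming a standard mysql client and a dump file called dump.sql (both names are placeholders):

```shell
# Disable integrity checks for the duration of the import, then restore them.
# foreign_key_checks / unique_checks / autocommit are standard MySQL session
# variables; turning them off avoids per-row checks and per-statement commits.
( echo "SET autocommit=0; SET unique_checks=0; SET foreign_key_checks=0;"
  cat dump.sql
  echo "SET foreign_key_checks=1; SET unique_checks=1; COMMIT;"
) | mysql -u user -p dbname
```

This helps the speed somewhat, but it does nothing about resumability, which is the real problem below.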
To make matters worse - and this is what I hope to get help with - a network connection that transfers >200 GB over a period of a week or so has a nontrivial chance of breaking, and the SQL import process cannot be resumed in any trivial way.
What would be the best way to handle this? Right now, if I notice a broken connection, I manually work out where it stopped by checking the highest primary key of the last imported table, and then have a perl one-liner that basically does this:
perl -nle 'BEGIN{open F, "prelude.txt"; @a=<F>; print @a; close F;}; print if $x; $x++ if /INSERT.*last-table-name.*highest-primary-key/'
This really is not the way to do it, so: what would be the best way?