Evening
I am in the middle of a long process of importing data from a broken, 15-year-old, read-only data format into MySQL, from which I want to build several smaller statistics tables.
The largest table I have built so far was (I think) 32 million rows, but I did not expect it to get that big or to strain MySQL so much.
The table will look like this:
    surname  name   year  rel   bco   bplace  rco   rplace
    Jones    David  1812  head  Lond  Soho    Shop  Shewsbury
So, small ints and varchars.
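In case it helps, this is roughly what I had in mind for the definition (the table name, column types and lengths below are just my guesses for illustration, not the final schema):

    CREATE TABLE records (
        surname VARCHAR(40),   -- e.g. 'Jones'
        name    VARCHAR(40),   -- e.g. 'David'
        year    SMALLINT,      -- e.g. 1812
        rel     VARCHAR(20),   -- e.g. 'head'
        bco     VARCHAR(20),   -- e.g. 'Lond'
        bplace  VARCHAR(40),   -- e.g. 'Soho'
        rco     VARCHAR(20),   -- e.g. 'Shop'
        rplace  VARCHAR(40)    -- e.g. 'Shewsbury'
    ) ENGINE=InnoDB;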
Can anyone offer tips on how to do this quickly? Will indexes on any of the columns help, or will they just slow down the queries?
Most of the values in each column are duplicated many times; some columns have no more than 100 distinct values.
The main columns I will query the table on are: surname (last name), name (first name), rco, and rplace.
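If indexes are worth having, I was picturing something like this ('records' and the index names are just placeholders from my sketch above, and I am not sure whether composite or single-column indexes would suit these lookups better):

    -- one composite index to cover surname + name lookups
    ALTER TABLE records ADD INDEX idx_surname_name (surname, name);

    -- another for the rco + rplace lookups
    ALTER TABLE records ADD INDEX idx_rco_rplace (rco, rplace);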