I have a task to import about 1 million orders. I iterate over the data, updating it to the values in the new database, and it works fine on my local machine with 8 GB of RAM.
However, when I run it on my AWS instance (a t2.medium), it works for the first 500 thousand rows, but towards the end it starts maxing out memory, around the point where it actually begins creating orders that don't yet exist in the new database. I am migrating the database from MySQL to Postgres.
Am I missing something obvious here?
require 'mysql2'
require 'active_record'

# Reuse a single connection to the legacy MySQL database.
def legacy_database
  @client ||= Mysql2::Client.new(Rails.configuration.database_configuration['legacy_production'])
end

desc "import legacy orders"
task orders: :environment do
  orders = legacy_database.query("SELECT * FROM oc_order")
  progressbar = ProgressBar.create(total: orders.count, format: "%E, \e[0;34m%t: |%B|\e[0m")

  orders.each do |order|
    # Only import orders in the statuses we care about.
    if [1, 2, 13, 14].include? order['order_status_id']
      payment_method = "wx"
      if order['paid_by'] == "Alipay"
        payment_method = "ap"
      elsif order['paid_by'] == "UnionPay"
        payment_method = "up"
      end

      # Map the legacy customer to the already-imported user, if present.
      user_id = User.where(import_id: order['customer_id']).first
      if user_id
        user_id = user_id.id
      end

      Order.create(
        import_id: order['order_id'],
        user_id: user_id,
        receiver_name: order['payment_firstname'],
        receiver_address: order['payment_address_1'],
        created_at: order['date_added'],
        updated_at: order['date_modified'],
        paid_by: payment_method,
        order_num: order['order_id']
      )
      progressbar.increment
    end
  end
end
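The only thing I can think of is that the plain query above buffers the whole result set in the Ruby process. Below is a minimal sketch of what I understand streaming would look like instead; stream: true and cache_rows: false are documented mysql2 options, but I'm only assuming this is where the memory is actually going, and I pull the row count with a separate COUNT(*) since a streamed result can only be iterated once.

# Sketch only: stream rows from MySQL one at a time instead of
# buffering the entire result set in memory.
total = legacy_database.query("SELECT COUNT(*) AS c FROM oc_order").first['c']
progressbar = ProgressBar.create(total: total, format: "%E, \e[0;34m%t: |%B|\e[0m")

legacy_database.query("SELECT * FROM oc_order", stream: true, cache_rows: false).each do |order|
  # ... same per-row import logic as in the task above ...
  progressbar.increment
end

My understanding is that with streaming each row is fetched lazily as the block runs, so only the current row (plus whatever ActiveRecord objects the iteration creates) is held at a time, but I haven't verified that this actually fixes the problem. Is this the right direction, or is the memory growth coming from somewhere else?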