My boss wants every merchant's orders to carry serial numbers starting at 1000. Right now I iterate over every merchant in Ruby and update their orders like this:
add_column :orders, :order_seq, :integer

Merchant.find_each do |merchant|
  order_seq = 999
  merchant.orders.order(:ordered_at).find_each do |order|
    order.update_column(:order_seq, order_seq += 1)
  end
end
I planned to run this during the migration to backfill all existing orders with consecutive numbers according to their ordered_at date. I tested it on a fork of the production database, and each order update takes about 80 ms on average. With millions of order records, that adds up to far too much downtime.
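For reference, the intended numbering can be pinned down in plain Ruby over in-memory data (hypothetical hashes rather than ActiveRecord records; this only illustrates the expected result, not the migration itself):

```ruby
# Assign order_seq per merchant, starting at 1000, in ordered_at order.
# Orders are plain hashes here purely for illustration.
def assign_order_seqs(orders)
  orders.group_by { |o| o[:merchant_id] }.each_value do |merchant_orders|
    merchant_orders.sort_by { |o| o[:ordered_at] }
                   .each_with_index do |order, i|
      order[:order_seq] = 1000 + i
    end
  end
  orders
end
```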
Is there a faster way to do this with native Postgres? It is a one-time migration that only needs to run once, and nothing else will be running at the same time.
Can't you just set order_seq to 999 + row_number(), partitioned by merchant_id, in a single SQL statement?
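That suggestion — 999 + row_number() partitioned by merchant_id — can be sketched as one statement over the whole table (untested sketch; it assumes order_seq should restart at 1000 for each merchant):

```sql
UPDATE orders
SET order_seq = o.seqnum + 999
FROM (
  SELECT id,
         row_number() OVER (PARTITION BY merchant_id
                            ORDER BY ordered_at) AS seqnum
  FROM orders
) o
WHERE orders.id = o.id;
```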
Answer:

Following @Gorden-Linoff's suggestion, I moved the numbering into SQL with row_number(). I kept a loop over merchants in batches rather than one statement for the whole table: each statement numbers one merchant_id's orders by ordered_at and matches the rows back by id.

Here is what I ended up with:
Merchant.active.find_each(batch_size: 100) do |merchant|
  statement = <<~SQL
    UPDATE orders
    SET order_seq = o.seqnum + 999
    FROM (
      SELECT id,
             row_number() OVER (ORDER BY ordered_at) AS seqnum
      FROM orders
      WHERE merchant_id = #{merchant.id}
    ) o
    WHERE orders.id = o.id
  SQL
  ActiveRecord::Base.connection.execute(statement)
end
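If the per-merchant subquery still has to scan too much of the table, an index on (merchant_id, ordered_at) may let Postgres read each merchant's orders pre-sorted instead of sorting them per statement. A hedged sketch (the index name is illustrative):

```sql
CREATE INDEX index_orders_on_merchant_id_and_ordered_at
  ON orders (merchant_id, ordered_at);
```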
Now roughly 10,000 orders update in about 200 ms. A migration that would have taken around 10 hours finishes in about 1 minute.