PostgreSQL 9.4 - FASTEST query to select and update one row in a large dataset (> 30M rows) under heavy write/read and lock contention

I want to select one row at RANDOM from a large dataset (> 30 million rows) that is under heavy concurrent writes/reads.

My problem: I cannot let PostgreSQL choose the row for me (which would be the cheapest/fastest query, a plain "LIMIT 1"), because it then behaves erratically and in an "obscure" way: see my original problem here: postgresql 9.4 - prevent the application from always selecting the last updated rows
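
For reference, a minimal sketch of the kind of "let PostgreSQL choose" query I mean (same table and columns as my real query below); with no ORDER BY, the planner is free to return whichever matching row it finds first, which is what leads to the erratic behaviour:

SELECT id
FROM   opportunities
WHERE  deal_id = #{@deal.id}
AND    opportunity_available
LIMIT  1        -- no ORDER BY, so the row returned is simply whatever the planner hits first
FOR    UPDATE;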

Here is my current query:

UPDATE opportunities s
SET    opportunity_available = false
FROM  (
   SELECT id
   FROM   opportunities
   WHERE  deal_id = #{@deal.id}
   AND    opportunity_available
   AND    pg_try_advisory_xact_lock(id)  -- skip rows whose advisory lock another transaction already holds
   LIMIT  1
   FOR    UPDATE
   ) sub
WHERE     s.id = sub.id
RETURNING s.prize_id, s.id;
-- inspired by https://stackoverflow.com/questions/33128531/put-pg-try-advisory-xact-lock-in-a-nested-subquery

- ( "FOR UPDATE" " ", , , ... postgresql 9.5, , 9.5 -)

Source: https://habr.com/ru/post/1611637/

