Does SQL keep a running count of many-to-many values, or is the count computed every time a new row is added?

I am using MySQL (MyISAM) 5.0.41 and I have this query:

SELECT `x`.`items`.id, `x`.`items`.name, COUNT(*) AS count
    FROM `x`.`items` INNER JOIN `x`.`user_items`
    ON `x`.`items`.id = `x`.`user_items`.item_id
    GROUP BY name HAVING count > 2 ORDER BY count DESC

I have about 36,000 users, 175,000 user_items and 60,000 items that are constantly being added. So this query is slowing down a bit ...

Is it better to:

  • add a count field to items and update it periodically (say, every time a user adds an item),
  • or keep running a query like this (which is slow)?

Or is there any SQL that populates the count field for me?

Thanks

+3
5 answers

Make sure there is an index on user_items.item_id (it is the join column); without one, the join has to scan user_items for every item. Also, grouping by name forces MySQL to build and sort a temporary table; group by items.id instead so the primary key can be used.
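
A minimal sketch of both suggestions, assuming no index exists on user_items.item_id yet (the index name is made up):

    -- add the missing index on the join column
    ALTER TABLE `x`.`user_items` ADD INDEX idx_item_id (item_id);

    -- group on the indexed id instead of the name
    SELECT i.id, i.name, COUNT(*) AS count
    FROM `x`.`items` i
    INNER JOIN `x`.`user_items` ui ON i.id = ui.item_id
    GROUP BY i.id, i.name
    HAVING count > 2
    ORDER BY count DESC;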

+2

I would do the following:

  • Add a ts DATETIME column to user_items recording when each row was added

  • Add a ts DATETIME column and a cnt counter column to users

  • Periodically update users with a query like this:

    INSERT
    INTO    users (id, ts, cnt)
    SELECT  *
    FROM    (
            SELECT  user_id, NOW() AS nts, COUNT(*) AS ncnt
            FROM    user_items ui
            WHERE   ui.ts <= NOW()
            GROUP BY
                    user_id
            ) q
    ON DUPLICATE KEY
    UPDATE  ts = nts,
            cnt = ncnt
    
  • Add an index on user_items (user_id, ts) so the delta lookup below is cheap

  • To get exact, up-to-the-moment counts, run:

    SELECT  u.id, u.cnt +
            (
            SELECT  COUNT(*)
            FROM    user_items ui
            WHERE   ui.ts > u.ts
                    AND ui.user_id = u.id
            ) AS cnt
    FROM    users u
    

This way the heavy counting runs only periodically, the live query only has to look at the user_items rows added since the last snapshot, and there are no concurrency problems keeping the counter consistent.
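
A sketch of the schema changes the steps above rely on; the column names follow the queries, while the index name and column types are assumptions:

    -- step 1: timestamp on user_items
    ALTER TABLE user_items ADD COLUMN ts DATETIME NOT NULL;

    -- step 2: snapshot timestamp and cached count on users
    ALTER TABLE users
        ADD COLUMN ts  DATETIME NULL,
        ADD COLUMN cnt INT NOT NULL DEFAULT 0;

    -- step 4: index so the "rows added since the snapshot" subquery stays cheap
    ALTER TABLE user_items ADD INDEX idx_user_ts (user_id, ts);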

+3

I would keep computing it with the query; it is always correct by definition, and with the right indexes it should stay fast enough at this size.

If you add a "count" column to "items" you are denormalizing the schema. It can be kept up to date (with triggers or in application code), but every insert or delete on user_items then also has to touch items, and the two can drift apart if an update is ever missed.

+1

If this feeds something like a "popular items" page, the numbers do not have to be live or exact.

Cache the result and refresh it on a schedule; between refreshes the data barely changes, so nobody will notice. Some other databases, e.g. PostgreSQL, give you more built-in help with keeping derived data like this around, but on MySQL with MyISAM (i.e. no transactions to lean on) a periodically rebuilt summary table is the simplest safe option.
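
A sketch of that kind of cache as a plain summary table; the name item_counts and the refresh-from-cron approach are just one way to do it:

    CREATE TABLE item_counts (
        item_id INT NOT NULL PRIMARY KEY,
        cnt     INT NOT NULL
    );

    -- refresh (run from cron every few minutes)
    DELETE FROM item_counts;
    INSERT INTO item_counts (item_id, cnt)
    SELECT item_id, COUNT(*)
    FROM user_items
    GROUP BY item_id;

    -- the page query then becomes a cheap join against the cache
    SELECT i.id, i.name, c.cnt
    FROM items i
    INNER JOIN item_counts c ON c.item_id = i.id
    WHERE c.cnt > 2
    ORDER BY c.cnt DESC;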

0

Do you really need all 36,000 users every time you run the query? If you are looking for the source of the performance problem, that may well be it.

Depending on your RDBMS, you could look at things like indexed or materialized views. Making the counter part of the table and trying to maintain it yourself is almost certainly a mistake, especially given the small size of your database.

0
