I understand that LIKE queries are slow since they cannot be indexed. However, I'm interested in learning about performance in this situation:
Let's say I have a table like:
user_id | message
-------------------
1       | foo bar baz
1       | bar buz qux
.       | .
.       | .
2       | bux bar foo
2       | bar
with, say, 1 million rows but 10,000 users, so each user has about 100 posts.
Obviously, a search like:
SELECT * FROM table WHERE message like '%ar%';
will be very slow. However, in my application I would only ever search within a single user's posts:
SELECT * FROM table WHERE message like '%ar%' AND user_id = 2;
where the user_id column will be indexed.
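One way to check this yourself is to ask the planner with EXPLAIN. A minimal sketch, assuming a table named posts (renamed from table, which is a reserved word) with the columns from the example and a plain B-tree index on user_id:

```sql
-- Hypothetical schema matching the question's example table.
CREATE TABLE posts (
    user_id integer NOT NULL,
    message text    NOT NULL
);

-- Plain B-tree index on the column used for equality lookup.
CREATE INDEX posts_user_id_idx ON posts (user_id);

-- Show how Postgres actually executes the query:
EXPLAIN ANALYZE
SELECT * FROM posts
WHERE message LIKE '%ar%' AND user_id = 2;
```

If the index is used as expected, the plan should show an Index Scan (or Bitmap Heap Scan) on posts_user_id_idx with the LIKE condition appearing as a Filter, meaning the pattern match runs only over the rows the index returned for that user, not over the whole table.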
Do I understand correctly that in this scenario Postgres will first use the index on user_id and then run the slow LIKE check only over that user's ~100 rows rather than the full table, so that those ~100 rows are what bounds my performance?
And also, that such a query will not get much slower with 10 or 100 million rows, as long as each user still has only ~100 messages?