A very large database, a very small part that is most retrieved in real time

I have an interesting database problem. I have a 150 GB database. My memory buffer is 8 GB.

Most of my data is rarely retrieved or mostly retrieved by backend processes. I would prefer to keep them, because some functions require them.

Some of them (namely, some tables and some identifiable parts of some tables) are very often used in the user's appeal

How can I make sure that the latter is always stored in memory? (there is more than enough space for them)

Additional Information: We are on Ruby on rails. The database is MYSQL, our tables are stored using INNODB. We collect data on 2 sections. Since we fine it, we save most of our data with JSON drops, and index only the primary keys

Update 2 The difficulty is that the data is actually used for both backend processes and user-defined functions. But they are addressed much less frequently over the past

Update 3 Some people comment on what 8Gb is a toy these days. I agree, but just increasing the db size is pure LAZINESS, if there is a more reasonable, effective solution

+3
5

MySQL Query Cache . MySQL (, -) SQL_NO_CACHE.

- , . , . , ( , , , )

UPDATE:

2:

MySQL, .. innodb.

0

. : (a) (b) .

  • , , , .

  • , , , .

150Gb , .

"" ETL-, , .

+3

, , , , . , , . .

+1

memcached! -, ActiveRecord. ngmoco , , , , .

, $cache.set/get/expire .

+1

, ?

-, 150 . 10 .

-, . ( ), . , , , ( , , 8 - 2 ).

, . , - , mysql , .

0
source

Source: https://habr.com/ru/post/1746432/


All Articles