How does caching affect memory consumption?

I have an application with a search function. The function looks up a search query in a giant object (a dictionary) that I cache for 24 hours. The object has about 50,000 keys and weighs about 10 MB.

When I look at the memory usage on my hosting, I notice that after a few requests, the memory usage goes from 50 MB to more than 450 MB, forcing my hosting provider to kill the application.

So I wonder what is going on here. In particular, how does the cache use memory for each request, and what can I do to fix this?
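For context, here is a minimal sketch of the pattern described above, assuming Django's cache framework is being used; `load_index()` is a hypothetical helper that builds the ~10 MB dictionary and is not from the original post:

```python
from django.core.cache import cache

CACHE_KEY = "search_index"
ONE_DAY = 60 * 60 * 24  # 24-hour timeout, as in the question

def get_search_index():
    index = cache.get(CACHE_KEY)
    if index is None:
        index = load_index()  # hypothetical: builds the 50,000-key dict
        cache.set(CACHE_KEY, index, ONE_DAY)
    return index

def search(query):
    # Every request that reads the cache loads (unpickles) the whole dict
    # into the worker's memory, so several workers or concurrent requests
    # can each end up holding their own copy of the ~10 MB object at once.
    index = get_search_index()
    return index.get(query)
```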

+6
2 answers

Django's FileBasedCache is known to have performance issues. You can get the big picture from the following links:

A smarter file-based cache for Django

Bug: file-based cache is not very efficient with a large number of cached files

Resolution of the bug (closed as wontfix):

I'm going to wontfix, on the grounds that the filesystem cache is intended as an easy way to test caching, not as a serious caching strategy. The default cache size and the cull strategy implemented by the file cache should make this obvious.
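For reference, this is roughly what the file-based backend looks like in settings.py, with the culling knobs the quote refers to spelled out; the LOCATION path and the option values shown are illustrative defaults, not something from the original post:

```python
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.filebased.FileBasedCache",
        "LOCATION": "/var/tmp/django_cache",   # example directory
        "TIMEOUT": 60 * 60 * 24,               # 24 hours, as in the question
        "OPTIONS": {
            "MAX_ENTRIES": 300,    # culling kicks in once this many files exist
            "CULL_FREQUENCY": 3,   # cull 1/3 of the entries when the limit is hit
        },
    }
}
```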

+1

Consider using a key-value store such as Memcached or Redis as your cache backend, since both support key expiration. Also consider a dedicated search engine such as ElasticSearch if more advanced search features are expected.
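A hedged sketch of what switching the backend might look like in settings.py, assuming a local memcached server and a recent Django version (older Django versions use the MemcachedCache backend instead of PyMemcacheCache); Redis works the same way through a backend such as django-redis:

```python
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.memcached.PyMemcacheCache",
        "LOCATION": "127.0.0.1:11211",
        "TIMEOUT": 60 * 60 * 24,  # keys expire after 24 hours
    }
}
```

With this setup the application code stays the same (`cache.set("search_index", index, 60 * 60 * 24)`), but the 10 MB object lives in the memcached/Redis server rather than in every worker process, and expiration is handled by the cache server itself (Redis EXPIRE / memcached TTL).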

How-tos and tools:

Install memcached for django project

http://code.google.com/p/memcached/wiki/NewStart

http://redis.io/commands/expire

https://github.com/bartTC/django-memcache-status

http://www.elasticsearch.org/guide/reference/index-modules/cache.html

0
