Loading the result of a MySQLdb database query into memory

Our application selects the correct database server from a pool of database servers, so each logical request is really two requests, which look like this:

  • Get the correct database server
  • Run the request

We do this so that we can take database servers online and offline, if necessary, as well as for load balancing.
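As a sketch, the two steps above might look like this (the pool and server APIs are invented for illustration):

```python
def run_request(pool, sql, params=()):
    """Run one logical request as the two physical steps described above."""
    # Step 1: ask the pool which database server should handle this request.
    server = pool.get_server()
    # Step 2: run the actual query against that server.
    return server.execute(sql, params)
```

It is step 1, the pool lookup, that the question proposes to cache.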

But it seems the first query could be cached in memory, so that the database is only actually queried every 5 or 10 minutes or so.

What is the best way to do this?

Thanks.

EDIT: this is for a Pylons web application.

+3
2 answers

You can cache the result in memory (a plain Python dict) and re-run the query only every N seconds. For example:

import time

cache = {}
lastTime = time.time()

def timedCacheDecorator(func):

    def wrap(*args, **kwargs):

        key = str(args)+str(kwargs)

        # cache for 5 seconds
        global lastTime
        if key not in cache or time.time() - lastTime > 5:
            lastTime = time.time()
            cache[key] = func(*args, **kwargs)

        return cache[key]

    return wrap


# lets test it

@timedCacheDecorator
def myquery():
    return time.time()

print(myquery())
time.sleep(1)
print(myquery())
time.sleep(5)
print(myquery())
time.sleep(1)
print(myquery())

Output:

1270441034.58
1270441034.58
1270441040.58
1270441040.58

Note that this example uses a single global timestamp, so every cached key expires at the same time; if different queries should expire independently, store a timestamp alongside each cached value.
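A per-key variant is a small change: keep a timestamp next to each cached value instead of one global `lastTime`. This is a sketch of that idea, not the code above:

```python
import time

def timed_cache(seconds=5):
    """Cache each (args, kwargs) combination independently for `seconds`."""
    def decorator(func):
        cache = {}  # key -> (value, timestamp)
        def wrap(*args, **kwargs):
            key = str(args) + str(kwargs)
            now = time.time()
            entry = cache.get(key)
            if entry is None or now - entry[1] > seconds:
                entry = (func(*args, **kwargs), now)
                cache[key] = entry
            return entry[0]
        return wrap
    return decorator
```

Each distinct argument combination now refreshes on its own schedule.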

+4

Since this is a Pylons application, the easiest way is to use the caching support that ships with Pylons (Beaker).

Import the cache decorator:

from pylons.decorators.cache import beaker_cache

and decorate the function whose result you want to cache:

@beaker_cache(expire = 300, type='memory')

The expire argument is the cache lifetime in seconds (here 300, i.e. 5 minutes).
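Put together, assuming the server lookup lives in its own function (the function names and body here are illustrative, not from the question), the Beaker-cached version might look like:

```python
from pylons.decorators.cache import beaker_cache

# Cache the pool lookup in memory for 5 minutes (hypothetical function).
@beaker_cache(expire=300, type='memory')
def get_database_server(pool_name):
    # ... ask the server pool which database server to use ...
    return query_pool_for_server(pool_name)
```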

+1

Source: https://habr.com/ru/post/1739811/
