Python decorator parameter using a variable from its wrapping function

I am looking for a way to create a decorator whose parameter actually uses the variables passed to the function it wraps.

For example, let's say that I have:

    @cache_decorator("my_key_{}".format(foo))
    def my_function(foo, bar):
        pass

    @cache_decorator("another_key_{}_{}".format(foo, bar))
    def another_function(user, foo, bar):
        pass

The goal is to write a caching wrapper. The decorator needs a cache key, but the key must include the variables passed to the function, and will be different for each function it wraps.

Ideally, the decorator checks the cache for the given key and, if the value is not found, executes the function to produce the value and caches it. So if the value is already in the cache, the code that creates it (i.e. my_function) is never executed; if it is not found, my_function runs, and its result is stored in the cache and returned.
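Note that the decoration-time expression "my_key_{}".format(foo) cannot work as written, because foo does not exist yet when the decorator line runs; the key has to be filled in at call time instead. A minimal sketch of that idea, assuming a plain dict as the cache backend (all names here are hypothetical):

```python
import functools

cache = {}  # stand-in for a real cache backend

def cache_decorator(key_template):
    """Hypothetical decorator: the template is filled in with the call's arguments."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Build the key now, when the arguments actually exist.
            key = key_template.format(*args)
            if key not in cache:
                cache[key] = func(*args, **kwargs)
            return cache[key]
        return wrapper
    return decorator

@cache_decorator("my_key_{}")
def my_function(foo, bar):
    return foo + bar
```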

Another alternative would be something block-like:

    def my_function(foo, bar):
        cache_value("my_key_{}".format(foo),
                    {block of code to generate value that is only called if necessary})

In Objective-C or JS this would be a block, so the value generation stays locally defined and mutable, but only runs when necessary. I'm too new to Python to fully understand how to do this with its closures.
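Python's closest analogue to an Objective-C block is a closure: an inner function that captures the enclosing scope's variables and executes only when called. A small illustration (the list is just instrumentation to show when the code runs):

```python
def my_function(foo, bar):
    calls = []  # records each time the "expensive" code actually runs

    def make_value():
        # This body closes over foo and bar; it runs only when invoked.
        calls.append(1)
        return foo * bar

    # At this point make_value is just a reference; nothing has executed yet.
    assert calls == []
    value = make_value()  # now the captured foo and bar are used
    assert calls == [1]
    return value
```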

Thanks!

Update
While the decorator solution below worked, I ended up going the block-like route because of additional metadata that needs to be attached to each cache entry to decide whether it is still valid. Having that metadata defined at the point of value generation (rather than inside the generic caching function) is easier to maintain. It looks like this:

    def my_function(foo, bar):
        def value_func():
            return code_to_generate_value_using_foo_bar
        return get_set_cache(key, value_func, ...)

    def get_set_cache(key, value_function, ...):
        value = cache.get(key)
        if value is None:
            value = value_function()
            cache.set(key, value)
        return value
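Filled out with a minimal in-process cache (the DictCache class and the key choice are stand-ins; the original elides the metadata arguments), the pattern runs like this:

```python
class DictCache:
    """Stand-in for a real cache backend exposing get/set."""
    def __init__(self):
        self._data = {}

    def get(self, key):
        return self._data.get(key)

    def set(self, key, value):
        self._data[key] = value

cache = DictCache()

def get_set_cache(key, value_function):
    value = cache.get(key)
    if value is None:
        value = value_function()  # the closure only runs on a cache miss
        cache.set(key, value)
    return value

def my_function(foo, bar):
    def value_func():
        return foo + bar  # stand-in for the expensive computation
    return get_set_cache("my_key_{}".format(foo), value_func)
```

Because the key here only includes foo, a second call with the same foo but a different bar is a cache hit and returns the stored value.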
3 answers

You can make your wrapper take a key-building function:

    @cache_w_keyfunc(lambda foo, bar: (bar,))
    def my_function(foo, bar):
        pass

    @cache_w_keyfunc(lambda user, foo, bar: (foo, bar))
    def another_function(user, foo, bar):
        pass

The key builder should return something hashable, such as a tuple of strings. If the values are not hashable (lists, for example), convert them to strings.

This key-building function takes the same arguments as the function itself and returns the key that will be used.

    import functools

    def cache_w_keyfunc(keyfunc):
        def real_decorator(func):
            func.cache = {}

            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                # Build the key from the wrapped function's name and the key builder:
                key = (func.__name__, keyfunc(*args, **kwargs))
                try:
                    return func.cache[key]
                except KeyError:
                    value = func(*args, **kwargs)
                    func.cache[key] = value
                    return value
            return wrapper
        return real_decorator
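To see the caching in action, here is a self-contained check of this approach (the decorator is repeated so the snippet runs standalone; the calls list is instrumentation showing the body runs only once per key):

```python
import functools

def cache_w_keyfunc(keyfunc):
    def real_decorator(func):
        func.cache = {}

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            key = (func.__name__, keyfunc(*args, **kwargs))
            try:
                return func.cache[key]
            except KeyError:
                value = func(*args, **kwargs)
                func.cache[key] = value
                return value
        return wrapper
    return real_decorator

calls = []

@cache_w_keyfunc(lambda foo, bar: (foo,))
def my_function(foo, bar):
    calls.append((foo, bar))  # record each real execution
    return foo + bar
```

Calling my_function(1, 2) and then my_function(1, 5) runs the body only once, because the key builder only looks at foo.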

When creating the decorator, you pass it two lists: the first contains the positions of the positional arguments, and the second contains the names of the keyword arguments to include in the key.

    import functools

    def cached(positions, names):
        def cached_decorator(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                keys = ([func.__name__]
                        + [str(kwargs.get(name)) for name in sorted(names)]
                        + [str(args[position]) for position in positions])
                cache_key = '_'.join(keys)
                cached_value = cache.get(cache_key)
                if cached_value is not None:
                    return cached_value
                value = func(*args, **kwargs)
                cache.set(cache_key, value)
                return value
            return wrapper
        return cached_decorator

and you would use it like this:

    # this will cache the function using the b and name parameters
    @cached([1], ["name"])
    def heavy_calc(a, b, c, name=None):
        something_really_slow()
        return answer

One problem is that you also have to serialize the function's return value when storing it and deserialize it when retrieving it from the cache. Another is that two different calls can produce the same key: heavy_calc("hello_there", "foo") and heavy_calc("hello", "there_foo") both join to the same string. The solution is to serialize the args and kwargs with json or msgpack so the keys are guaranteed to be unambiguous.
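The collision fix the answer suggests can be sketched like this: build the key from a JSON dump of the selected arguments instead of joining raw strings with underscores (make_cache_key is a hypothetical helper, not part of the answer's code):

```python
import json

def make_cache_key(func_name, args, kwargs, positions, names):
    """Build an unambiguous key by JSON-encoding the selected arguments."""
    parts = {
        "func": func_name,
        "args": [args[p] for p in positions],
        "kwargs": {n: kwargs.get(n) for n in sorted(names)},
    }
    # sort_keys makes the key deterministic across runs
    return json.dumps(parts, sort_keys=True)
```

With this scheme the two colliding calls above get distinct keys, because JSON preserves the argument boundaries that underscore-joining destroys.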

If you are using Python 3.2 or later and you do not need to choose which parameters the cache key is built from, you can use functools.lru_cache.
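functools.lru_cache keys on all of the (hashable) arguments automatically, so there is nothing to configure beyond the cache size:

```python
import functools

@functools.lru_cache(maxsize=128)
def heavy_calc(a, b, c):
    return a + b + c  # stand-in for a slow computation
```

A repeated call with the same arguments is served from the cache; heavy_calc.cache_info() reports the hit and miss counts.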


Have you seen dogpile.cache ?

It is a caching system that does exactly this.

Perhaps you can just use dogpile.cache as-is. If not, you can read its source to see how it works.

By the way, dogpile.cache handles all the small details that you would otherwise have to worry about:

  • generating and storing keys
  • serialization / deserialization
  • expiration and regeneration
  • handling cache hits and misses
  • etc.

Source: https://habr.com/ru/post/1502080/

