Thread-safe memoize decorator

I am trying to create a memoize decorator that works with multiple threads.

I realized that I need to share the cache between threads and lock access to it. This is how I start the threads:

    for i in range(5):
        thread = threading.Thread(target=self.worker, args=(self.call_queue,))
        thread.daemon = True
        thread.start()

where the worker is:

    def worker(self, call):
        func, args, kwargs = call.get()
        self.returns.put(func(*args, **kwargs))
        call.task_done()

The problem starts, of course, when I pass a function decorated with a memoize decorator (like this) to many threads at the same time.

How can I implement the memoization cache as an object shared among the threads?

1 answer

The easiest way is to use a single lock for the entire cache and require that every write to the cache acquire the lock first.

In the linked code example, around line 31, you would acquire the lock and check whether the result is still missing; if so, you go on to compute and cache it. Something like this:

    lock = threading.Lock()
    ...
    except KeyError:
        with lock:
            if key in self.cache:
                v = self.cache[key]
            else:
                v = self.cache[key] = f(*args, **kwargs), time.time()

The linked example stores a separate cache per function in a dictionary, so you would also need to store a separate lock per function.
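
For concreteness, here is a minimal self-contained sketch of a decorator built this way, with one lock per decorated function. The name memoize, the (value, timestamp) cache entries and the key construction are illustrative assumptions, not the code from the linked example:

    import functools
    import threading
    import time

    def memoize(f):
        cache = {}               # per-function cache: key -> (value, timestamp)
        lock = threading.Lock()  # one lock guarding this function's cache

        @functools.wraps(f)
        def wrapper(*args, **kwargs):
            key = (args, tuple(sorted(kwargs.items())))
            try:
                value, _ = cache[key]
                return value
            except KeyError:
                with lock:
                    # Re-check: another thread may have filled this entry
                    # while we were waiting for the lock.
                    if key in cache:
                        value, _ = cache[key]
                    else:
                        value = f(*args, **kwargs)
                        cache[key] = (value, time.time())
                    return value
        return wrapper

The second check inside the with lock: block is what stops two threads from computing the same value twice; the price is that all cache misses for that function are serialized.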

If you used this code in a highly contended environment, though, it would probably be unacceptably inefficient, since threads would have to wait on each other even when they are not computing the same value. You could probably improve this by storing a lock per key in the cache. You would also need a global lock around access to the lock store, otherwise there is a race condition when creating the per-key locks.
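
A sketch of that finer-grained variant, again with assumed names, where each key gets its own lock and a global lock only guards the creation of the per-key locks:

    import functools
    import threading
    import time

    def memoize(f):
        cache = {}                     # key -> (value, timestamp)
        key_locks = {}                 # key -> lock for computing that key
        locks_lock = threading.Lock()  # guards creation of per-key locks

        @functools.wraps(f)
        def wrapper(*args, **kwargs):
            key = (args, tuple(sorted(kwargs.items())))
            try:
                return cache[key][0]
            except KeyError:
                # Create (or fetch) the lock for this key under the global lock,
                # so two threads never create two different locks for one key.
                with locks_lock:
                    key_lock = key_locks.setdefault(key, threading.Lock())
                with key_lock:
                    # Re-check: the thread that held key_lock before us
                    # may already have stored the result.
                    if key not in cache:
                        cache[key] = (f(*args, **kwargs), time.time())
                    return cache[key][0]
        return wrapper

Threads working on different keys now only contend briefly on locks_lock; the potentially slow call to f is serialized per key only.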


Source: https://habr.com/ru/post/1442225/
