1 MB Memcache Limit on Google App Engine

How do you store an object larger than 1 MB in memcache? Is there a way to split it into chunks while keeping the data retrievable under a single key?

+3

5 answers

There are memcache methods set_multi and get_multi that take a dictionary and a prefix as arguments.

If you can split your data into a dictionary of chunks, you can use these. Essentially, the prefix becomes your new key name.

You will need to track the names of the chunks somehow. In addition, ANY of the chunks can be evicted from memcache at any time, so you will also need a way to recover from partial data; a sketch of this follows below.
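
A minimal sketch of that idea (the set_big/get_big helpers, the 'big:' key prefix, and the 'count' entry are my own illustration, not an official recipe):

import pickle

from google.appengine.api import memcache


CHUNK_SIZE = 900 * 1024  # stay safely below the 1 MB item limit


def set_big(key, value):
  data = pickle.dumps(value)
  # split the serialized value into a dictionary of numbered chunks.
  chunks = dict(('%d' % (pos // CHUNK_SIZE), data[pos:pos + CHUNK_SIZE])
                for pos in range(0, len(data), CHUNK_SIZE))
  chunks['count'] = len(chunks)  # so get_big knows how many pieces to fetch
  # set_multi prepends key_prefix to every dictionary key and returns
  # the list of keys it failed to store (empty list means full success).
  return memcache.set_multi(chunks, key_prefix='big:%s:' % key) == []


def get_big(key):
  count = memcache.get('big:%s:count' % key)
  if count is None:
    return None
  chunk_keys = ['%d' % i for i in range(count)]
  chunks = memcache.get_multi(chunk_keys, key_prefix='big:%s:' % key)
  if len(chunks) != count:
    return None  # some chunk was evicted; treat the value as a cache miss
  return pickle.loads(''.join(chunks[k] for k in chunk_keys))

Note the last check: if get_multi comes back with fewer chunks than expected, the whole value is treated as an ordinary cache miss.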

+2

Here is a module ("blobcache") for storing values larger than 1 MB in GAE memcache.

import pickle
import random
from google.appengine.api import memcache


MEMCACHE_MAX_ITEM_SIZE = 900 * 1024


def delete(key):
  chunk_keys = memcache.get(key)
  if chunk_keys is None:
    return False
  chunk_keys.append(key)
  memcache.delete_multi(chunk_keys)
  return True


def set(key, value):
  pickled_value = pickle.dumps(value)

  # delete previous entity with the given key
  # in order to conserve available memcache space.
  delete(key)

  pickled_value_size = len(pickled_value)
  chunk_keys = []
  for pos in range(0, pickled_value_size, MEMCACHE_MAX_ITEM_SIZE):
    # TODO: use memcache.set_multi() for speedup, but don't forget
    # about batch operation size limit (32Mb currently).
    chunk = pickled_value[pos:pos + MEMCACHE_MAX_ITEM_SIZE]

    # the pos is used for reliable distinction between chunk keys.
    # the random suffix is used as a counter-measure for distinction
    # between different values, which can be simultaneously written
    # under the same key.
    chunk_key = '%s%d%d' % (key, pos, random.getrandbits(31))

    is_success = memcache.set(chunk_key, chunk)
    if not is_success:
      return False
    chunk_keys.append(chunk_key)
  return memcache.set(key, chunk_keys)


def get(key):
  chunk_keys = memcache.get(key)
  if chunk_keys is None:
    return None
  chunks = []
  for chunk_key in chunk_keys:
    # TODO: use memcache.get_multi() for speedup.
    # Don't forget about the batch operation size limit (currently 32Mb).
    chunk = memcache.get(chunk_key)
    if chunk is None:
      return None
    chunks.append(chunk)
  pickled_value = ''.join(chunks)
  try:
    return pickle.loads(pickled_value)
  except Exception:
    return None
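
Usage mirrors the plain memcache API. For example, assuming the module above is saved as blobcache.py:

import blobcache

big_value = {'payload': 'x' * (5 * 1024 * 1024)}  # ~5 MB, well over 1 MB

if blobcache.set('my_key', big_value):
  assert blobcache.get('my_key') == big_value
blobcache.delete('my_key')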
+7

memcache has set_multi and get_multi methods that take a dictionary and a key prefix.

If you can split your data into a dictionary of chunks, you can use them; the prefix effectively becomes your key name.

You will need to track the chunk names somehow, and since any chunk can be evicted at any time, you also need a way to handle partially missing data.

If the data is too large even for that, look at the GAE blobstore or Google Storage instead.

0

By default, memcache values are limited to 1 MB. Batch operations can transfer up to 32 MB in total, but each individual value is still capped at 1 MB; these are limits Google places on the service. One workaround is to compress the data before putting it into memcache.

Try googling "python compress string" and consider compressing your string before caching it.

If compression alone does not get you under the limit, you will have to split the data across multiple memcache entries; see the sketch below.
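
A sketch of the compression approach with the standard zlib module (the set_compressed/get_compressed names are mine):

import pickle
import zlib

from google.appengine.api import memcache


MAX_ITEM_SIZE = 1024 * 1024  # the 1 MB memcache item limit


def set_compressed(key, value):
  data = zlib.compress(pickle.dumps(value, pickle.HIGHEST_PROTOCOL))
  if len(data) >= MAX_ITEM_SIZE:
    return False  # still too big even compressed; fall back to chunking
  return memcache.set(key, data)


def get_compressed(key):
  data = memcache.get(key)
  if data is None:
    return None
  return pickle.loads(zlib.decompress(data))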

0

A good workaround is to use layer_cache.py, a Python module written and used by Khan Academy (open source). It combines an in-memory (per-instance) cache with memcache, using memcache to keep the instance caches in sync across instances. Find the source here and read Ben Kamens's blog post here.
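
The general shape of that two-layer idea, reduced to a sketch (this is my simplification, not Khan Academy's actual layer_cache code, which also versions keys so that instance caches can be invalidated):

from google.appengine.api import memcache


_instance_cache = {}  # lives only as long as this App Engine instance


def cached(key, compute_func, time=3600):
  # layer 1: instance memory - fastest, but private to this instance.
  if key in _instance_cache:
    return _instance_cache[key]
  # layer 2: memcache - slower, but shared across all instances.
  value = memcache.get(key)
  if value is None:
    # cache miss on both layers: compute and store the value.
    value = compute_func()
    memcache.set(key, value, time=time)
  _instance_cache[key] = value
  return value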

0

Source: https://habr.com/ru/post/1793922/

