Sequentially putting entities to Google Cloud Datastore from Python is slow

I use Google Cloud Datastore through the Python client library in the Python 3 flexible App Engine environment. My Flask application creates an entity and then puts it to the datastore:

ds = datastore.Client()
ds.put(entity)

In my testing, each put call takes 0.5-1.5 seconds. This does not change if I make two calls in a row, one immediately after the other.
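For reference, this is roughly how the per-call latency can be measured. A stub client stands in for datastore.Client here so the sketch is self-contained; with a real client the call and timing are the same:

```python
import time

class StubClient:
    """Hypothetical stand-in for datastore.Client so the sketch runs offline."""
    def put(self, entity):
        time.sleep(0.01)  # pretend this is the network round trip

def timed_put(client, entity):
    # Time a single put call, as in the measurements above.
    start = time.perf_counter()
    client.put(entity)
    return time.perf_counter() - start

elapsed = timed_put(StubClient(), {"a": 1})
print(f"put took {elapsed:.3f}s")
```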

I am wondering if the complexity of my object is the problem. It is nested, something like:

object = {
    a: 1,
    ...,
    b: [
        {
            d: 2,
            ...,
            e: {
                h: 3
            }
        }
    ],
    c: [
        {
            f: 4,
            ...,
            g: {
                i: 5
            }
        }
    ]
}

which I create by nesting datastore.Entity objects, each initialized with something like:

entity = datastore.Entity(key=ds.key(KIND))
entity.update(object_dictionary)

Both lists contain 3-4 items. The JSON equivalent of the object is ~2-3 KB.
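As a sanity check on that size estimate, the payload can be measured by serializing a plain-dict version of the entity. The dict below is a made-up example with the same shape as the structure above, not the real data:

```python
import json

# Made-up dict mirroring the nested shape described above:
# two lists of 4 items, each item containing a nested sub-dict.
obj = {
    "a": 1,
    "b": [{"d": 2, "e": {"h": 3}} for _ in range(4)],
    "c": [{"f": 4, "g": {"i": 5}} for _ in range(4)],
}

size_bytes = len(json.dumps(obj).encode("utf-8"))
print(f"~{size_bytes} bytes as JSON")  # a few hundred bytes here; the real entity is ~2-3 KB
```

Either way, an entity of this size is tiny by Datastore standards, which suggests the latency is per-call overhead rather than payload size.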

Is this not a recommended practice? What should I do instead?

Additional Information:

I tried put_multi instead of put. It made no difference for a single entity: put simply calls put_multi under the hood, and put_multi creates a batch, adds the Entity to it, and commits the batch.
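That delegation can be sketched like this (a simplified model of the library's behavior, not its actual source):

```python
class SketchClient:
    """Simplified model of how put delegates to put_multi and a batch."""
    def __init__(self):
        self.committed = []

    def put(self, entity):
        # put is just put_multi with a one-element list...
        self.put_multi([entity])

    def put_multi(self, entities):
        # ...and put_multi wraps the entities in a single batch commit,
        # i.e. one round trip regardless of how many entities are passed.
        batch = list(entities)
        self.committed.append(batch)

client = SketchClient()
client.put({"a": 1})
client.put_multi([{"b": 2}, {"c": 3}])
print(client.committed)  # [[{'a': 1}], [{'b': 2}, {'c': 3}]]
```

This is why put_multi with one entity costs the same as put: both end up as one batch commit.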

"Name/ID" ( - ). , :

datastore.key(KIND)

KIND - . :

datastore.key(KIND, <some ID>)

, , . , (: id = 4669294231158784, id = 4686973524508672).

I am not 100% sure, but since I pass a key without an ID (a "partial" key), I suspect the library might first have to ask the server to allocate an ID (i.e. an extra round trip) before it can write the entity. Could that explain the latency?



Quoting the Cloud Datastore documentation:

Cloud Datastore supports batch versions of the operations which allow it to operate on multiple objects in a single Cloud Datastore call.

Such batch calls are faster than making separate calls for each individual entity because they incur the overhead for only one service call. If multiple entity groups are involved, the work for all the groups is performed in parallel server-side.

client.put_multi([task1, task2])
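The saving is easy to see with back-of-the-envelope arithmetic: if each service call costs a roughly fixed round trip (say 0.5 s, the low end measured in the question), n separate puts pay that cost n times, while one put_multi pays it once. The model below assumes the per-call overhead dominates, which matches the observation that entity size does not matter:

```python
ROUND_TRIP_S = 0.5  # assumed fixed per-call overhead, from the question's low end

def sequential_cost(n_entities, rtt=ROUND_TRIP_S):
    # n separate put calls -> n round trips
    return n_entities * rtt

def batched_cost(n_entities, rtt=ROUND_TRIP_S):
    # one put_multi call -> a single round trip (server work done in parallel)
    return rtt

print(sequential_cost(10))  # 5.0 seconds
print(batched_cost(10))     # 0.5 seconds
```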

, , "put".

"" , () . Datastore , . , , , () . , , , .

"" , - " ". , "" , .


Source: https://habr.com/ru/post/1693536/

