Google App Engine Application Delay

I improved my code and now the whole API is very fast. I also added memcache and get an excellent hit ratio. But sometimes I see unexplained delays.

I have attached the most significant appstats screenshot: more than 20 seconds pass before a 90 ms RPC even starts. How is this possible? Where should I look for the source of these delays?

I am really stuck because I don't understand what is going on between RPCs and I don't know what else I can do to get more information.

Just a thought: is every HTTP call handled by the same GAE instance? My instances took a long time to warm up, but I don't think that is related.

BTW: I am coding in Java.

[screenshot: appstats statistics]

+6
2 answers

Usually, an unaccounted-for "hole" in the middle of an appstats timeline is the execution of your own code.
Appstats records each RPC as it enters and exits; the gaps it cannot account for are your actual code running.

Do you have any log output for the time the application spent between these two calls?
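If not, a quick way to narrow it down is to bracket the code that runs between the RPCs with timestamped log statements. A minimal sketch, assuming a servlet doGet with some processing between two datastore calls (the class name and the commented-out steps are placeholders, not code from the question):

import java.io.IOException;
import java.util.logging.Logger;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class MyApiServlet extends HttpServlet {
    private static final Logger log = Logger.getLogger(MyApiServlet.class.getName());

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        long t0 = System.currentTimeMillis();

        // ... first RPC (e.g. a datastore query) ...
        log.info("after first RPC: " + (System.currentTimeMillis() - t0) + " ms");

        // ... your own processing between the RPCs ...
        log.info("before second RPC: " + (System.currentTimeMillis() - t0) + " ms");

        // ... second RPC ...
        log.info("done: " + (System.currentTimeMillis() - t0) + " ms");
    }
}

If the gap shows up between two of these log lines, the time is spent in your own code; if the request log shows a long pause before the first line even appears, the time was spent before your handler ran at all (for example, in a loading request).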

+2

A huge "inexplicable" latency is almost always caused by a warm-up (loading request) consuming resources. Check your application logs to see how much api_ms and cpu_ms the warm-up uses.

You can avoid warm-ups by increasing the maximum pending latency in the App Engine admin console. A higher pending latency means requests will wait longer in the queue rather than trigger a new instance. This can make individual requests a little slower, but it spares you the heavy cost of loading requests.

To help with warm-up requests, make sure your appengine-web.xml has:

<warmup-requests-enabled>true</warmup-requests-enabled> 

This will cause the App Engine scheduler to proactively start new instances when the current ones are getting overloaded (that is, it starts loading a new instance before a live request has to be routed to it).
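With warmup requests enabled, App Engine sends a GET request to /_ah/warmup on each new instance before routing user traffic to it. You can optionally map your own servlet to that path and use it to preload anything expensive. A minimal sketch (the class name and web.xml entries are illustrative, not from the original answer):

In web.xml:

 <servlet>
   <servlet-name>warmup</servlet-name>
   <servlet-class>com.company.WarmupServlet</servlet-class>
 </servlet>
 <servlet-mapping>
   <servlet-name>warmup</servlet-name>
   <url-pattern>/_ah/warmup</url-pattern>
 </servlet-mapping>

And the servlet itself:

 public class WarmupServlet extends javax.servlet.http.HttpServlet {
     @Override
     protected void doGet(javax.servlet.http.HttpServletRequest req,
                          javax.servlet.http.HttpServletResponse resp) throws java.io.IOException {
         // Do expensive one-time work here: prime memcache, warm up
         // datastore access, build in-memory lookup tables, etc.
         // This runs before the instance receives real user traffic.
     }
 }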

Then, in the affected slow servlets, make sure you add load-on-startup to your web.xml:

 <servlet>
   <servlet-name>my-servlet</servlet-name>
   <servlet-class>com.company.MyServlet</servlet-class>
   <load-on-startup>1</load-on-startup>
 </servlet>

load-on-startup simply ensures that your high-priority servlets are always ready to go as soon as the warm-up request finishes.
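One way to take advantage of this: with load-on-startup set, the container calls the servlet's init() while the instance starts up (during the warm-up or loading request) rather than on the first user request, so expensive one-time setup belongs there. A minimal sketch (MyServlet and the lookup-table field are illustrative):

 public class MyServlet extends javax.servlet.http.HttpServlet {
     private java.util.Map<String, String> lookupTable;

     @Override
     public void init() throws javax.servlet.ServletException {
         // Runs during instance startup thanks to load-on-startup,
         // so the first real request does not pay this cost.
         lookupTable = loadLookupTable();
     }

     private java.util.Map<String, String> loadLookupTable() {
         // Placeholder for expensive initialization (parsing config,
         // priming caches, building indexes, ...).
         return new java.util.HashMap<String, String>();
     }
 }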

+2
