I have an iOS mobile application that sends commands through a Comet server (APE) to a web application (JS). For each command, the web application replies with an "ack" message tagged with the command's identifier. I want to calculate the web application's average response time.
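For reference, this is roughly how the per-command timing could be recorded on the JS side. This is a minimal sketch with assumed names (`pendingCommands`, `onDispatch`, `onAck`) that are not part of APE's API:

```typescript
// Map from command identifier to the time it was dispatched (ms since epoch).
const pendingCommands = new Map<string, number>();

// Called when a command is sent to the web application.
function onDispatch(commandId: string): void {
  pendingCommands.set(commandId, Date.now());
}

// Called when the "ack" for a command arrives; returns the round-trip time in ms.
function onAck(commandId: string): number | undefined {
  const sentAt = pendingCommands.get(commandId);
  if (sentAt === undefined) return undefined; // unknown or duplicate ack
  pendingCommands.delete(commandId);
  return Date.now() - sentAt;
}
```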
The frequency of commands can vary from 5 per second down to one every couple of seconds (or even less often, depending on the user).
My naive solution is to record the timestamp of each dispatch and each receipt, and then average the differences. This is inefficient, because the algorithm (a basic for loop over all samples) blocks the application and delays data processing. Another option is to keep only the last ten timestamps, limiting the number of responses included in the calculation.
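One way to avoid looping over every stored difference is to update the average incrementally each time an ack arrives, or to keep an exponential moving average that favours recent samples. The sketch below is illustrative only; the class names and the smoothing factor `alpha` are assumptions, not part of any particular library:

```typescript
// Running mean updated in O(1) per sample, so no loop over stored timestamps
// is needed when a new response time arrives.
class RunningAverage {
  private count = 0;
  private mean = 0;

  add(sample: number): void {
    this.count += 1;
    this.mean += (sample - this.mean) / this.count; // incremental mean update
  }

  get average(): number {
    return this.mean;
  }
}

// Alternative: an exponential moving average that weights recent samples more
// heavily. alpha is a tunable smoothing factor (0.1 here is arbitrary).
class ExponentialMovingAverage {
  private ema: number | null = null;

  constructor(private readonly alpha = 0.1) {}

  add(sample: number): void {
    this.ema =
      this.ema === null ? sample : this.alpha * sample + (1 - this.alpha) * this.ema;
  }

  get average(): number | null {
    return this.ema;
  }
}
```

Either structure can be fed directly from the ack handler, so the cost per response stays constant regardless of how many commands have been sent.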
However, I am not happy with either approach, and I am looking for reference material that could point me toward a better way to solve this problem.