Firebug not only shows how long each individual request takes, it also breaks each request into phases. For small files (~20 KB), most of the time is spent waiting for a response (at least according to Firebug).
On Stack Overflow, for example, waiting for a response to / takes 255 ms, while receiving it takes 42 ms. On other sites I have seen figures like a 200 ms wait and a 1 ms transfer. What causes the waiting?
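To check this outside Firebug, here is a minimal sketch (my own, using only Python's standard library) that splits a request into roughly the same two phases: "waiting" until the status line and headers arrive, and "receiving" while the body is read. Note that this lumps DNS lookup and TCP connect into the waiting figure, unlike Firebug's finer breakdown, and the host name is just an example.

```python
import http.client
import time

def time_request(host, path="/"):
    """Return (waiting, receiving, body size) for a single HTTP GET."""
    conn = http.client.HTTPConnection(host, 80, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path)           # also does DNS + TCP connect lazily
    resp = conn.getresponse()           # returns once status line + headers are in
    waited = time.perf_counter() - start
    body = resp.read()                  # pulls the rest of the body off the socket
    received = time.perf_counter() - start - waited
    conn.close()
    return waited, received, len(body)

if __name__ == "__main__":
    waited, received, size = time_request("stackoverflow.com")
    print(f"waiting:   {waited * 1000:.0f} ms")
    print(f"receiving: {received * 1000:.0f} ms ({size} bytes)")
```

For small responses the receiving figure is usually tiny, which matches the pattern described above.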
Web sites usually consist of many files: an HTML document, CSS, JS, and some images. Take any of the demos here, the dojox gfx demos: most of the time is spent transferring tiny JS files. This whole model seems very inefficient to me.