Late to the party here, but considering gzip compression, you're mostly comparing a ~51k download from Google's CDN (197.14k of content becomes 51.30k on the wire) versus ~15.5k from your server (assuming your 60k file gzips at the same ratio as the full jQuery UI file, and that you have gzip compression enabled). This brings us into a tricky area:
- pre-existing copy in the cache
- latency
- transmission time
- number of requests
- correct cache headers
And the answer to your question is a big "it depends": try each of them and measure the result in the real world.
Pre-existing Caching
If the first-time visitor to your site has previously been on a site using jQuery UI from Google's CDN, and it's still in their cache, that wins hands down. Full stop. No need to think about it any further. Google uses appropriate caching headers, and the browser doesn't even have to send a request to the server, provided you refer to a fully-specified version of jQuery UI (not one of the "any version matching 1.8.x" URLs). If you ask for jQuery UI 1.8.16, Google returns a resource that can be cached for up to a year; but if you ask for jQuery UI 1.8.x (i.e., any dot rev of 1.8), that resource is only good for an hour.
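For illustration, the two kinds of URL look roughly like this (a sketch; double-check the exact paths against Google's hosted-libraries documentation):

```html
<!-- Fully-specified version: cacheable for up to a year -->
<script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.8.16/jquery-ui.min.js"></script>

<!-- Partially-specified "any 1.8.x" version: only cacheable for an hour -->
<script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.8/jquery-ui.min.js"></script>
```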
But suppose they haven't...
Latency and Transmission Time
Latency is the time it takes to set up the connection to the server, and transmission time is the time spent actually transferring the resource. Using my DSL connection (I'm not very close to my exchange, so I typically get about 4Mbit of download bandwidth; that is, okay, but nothing like what Londoners get, or those lucky FiOS people in the States), in repeated experiments loading Google's copy of jQuery UI I typically spend ~50ms waiting for the connection (latency), and then the data transfer takes 73ms (SSL would change this profile, but I'm assuming a non-SSL site here). Compare that with loading Google's copy of jQuery itself (89.52k gzipped to 31.74k), which has the same ~50ms latency followed by ~45ms of download. Note how the download time is proportional to the size of the resource (31.74k / 51.30k = 0.62, and sure enough, 73ms × 0.62 ≈ 45ms), but the latency is constant. So assuming your copy comes in at 15.5k, you could expect (for me) the usual 50ms of latency plus about 22ms of actual download. All other things being equal, by hosting your own 60k copy rather than using Google's 200k copy, you'd save me a whopping 52ms. Let's just say I wouldn't notice the difference.
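To make that arithmetic explicit, here's a back-of-the-envelope sketch of the model (purely illustrative; the 50ms latency and the 73ms-per-51.30k transfer rate are just my measurements from above, not universal constants):

```js
// Rough fetch-time model: constant latency plus transfer time
// proportional to the gzipped size of the resource.
const latencyMs = 50;         // measured connection setup time (my DSL line)
const msPerKB = 73 / 51.30;   // ~1.42ms per gzipped kB, from the jQuery UI download

function estimatedFetchMs(gzippedKB) {
  return latencyMs + gzippedKB * msPerKB;
}

console.log(estimatedFetchMs(51.30)); // Google's jQuery UI: ~123ms
console.log(estimatedFetchMs(31.74)); // Google's jQuery: ~95ms (50 + ~45)
console.log(estimatedFetchMs(15.5));  // your hypothetical 15.5k copy: ~72ms
```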
All things are not equal, though. Google's CDN is highly optimized, location-aware, and very fast. Compare, for example, downloading jQuery from Heroku.com. I chose them because they're smart people running a significant hosting business (currently on the AWS stack), so you'd expect them to have at least spent some time optimizing their delivery of static content; and it happens that they use a local copy of jQuery for their website, and they're in the US (you'll see why that matters in a moment). If I load jQuery from them (wow, they don't appear to have gzip enabled!), the latency is consistently in the 135ms range (with occasional outliers). That's 2.7 times the latency of Google's CDN (and my bandwidth from them is about half as fast as well; perhaps they only use AWS instances in the US, and since I'm in the UK, I'm further away from them).
The point being that latency may well wash out any benefit you get from the smaller file size.
Number of Requests
If you have any JavaScript files you're going to host locally, your users will still have to get those. Say you have 100k of your own script for your site. If you use Google's CDN, your users have to get 200k of jQuery UI from Google and 100k of your script from you. The browser may put those requests in parallel (barring your using `async` or `defer` on your `script` tags, the browser has to execute the scripts in strict document order, but that doesn't mean it can't download them in parallel). Or it may well not.
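For concreteness, this is roughly what the two-request setup looks like (the local filename is hypothetical):

```html
<!-- Two requests: jQuery UI from Google's CDN, your own script from your server.
     The browser can download these in parallel but must execute them in order. -->
<script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.8.16/jquery-ui.min.js"></script>
<script src="/js/my-site-script.js"></script>
```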
As we've already established, at these sizes the actual data transfer time doesn't matter all that much for non-mobile users, so you may find that taking your local jQuery UI file and combining it with your own script, so that only one download is required rather than two, is more efficient despite the goodness of Google's CDN.
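A minimal sketch of the combined approach, assuming a hypothetical build step that simply concatenates jQuery UI and your script (jQuery UI first) into one file:

```html
<!-- One request instead of two: jQuery UI + your script in a single file -->
<script src="/js/combined-1.js"></script>
```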
This is the old rule of thumb: "Only one HTML file, one CSS file, and one JavaScript file." Minimizing HTTP requests is a Good Thing™. Similarly, if you can use sprites rather than individual images for various things, that helps keep image requests down.
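For instance, a CSS sprite (a sketch with made-up filenames and offsets) serves several icons with a single image request:

```css
/* One image download (sprite.png) instead of three separate icon requests */
.icon      { width: 16px; height: 16px; background: url(/img/sprite.png) no-repeat; }
.icon-home { background-position:   0     0; }
.icon-user { background-position: -16px   0; }
.icon-mail { background-position: -32px   0; }
```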
Correct Cache Headers
If you're hosting your own scripts, you'll want to be sure they're cacheable, which means paying attention to the cache headers. Google's CDN basically doesn't trust HTTP/1.0 caches (it sets the `Expires` header to the current date/time), but does trust HTTP/1.1 caches, the vast majority, because it sends a `max-age` header (of a year for fully-specified resources). I'm guessing they have a reason for that; you might think about doing the same.
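Roughly, the pair of headers in question looks like this for a fully-specified resource (illustrative values, not a capture of Google's actual response; the `Expires` value is meant to be "now", and 31536000 seconds is one year):

```http
Expires: Tue, 01 Nov 2011 12:00:00 GMT
Cache-Control: public, max-age=31536000
```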
Since you'll sometimes want to change your scripts, you'll want to put a version number on them, e.g. "my-nifty-script-1.js", then "my-nifty-script-2.js", and so on. That way you can set long `max-age` headers, but know that when you update your script, your users will get the new one. (This applies to CSS files too.) Do not use the query string for versioning; put the number in the actual resource name.
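In markup terms, an update is just a new resource name (using the example filenames above):

```html
<!-- Old release: -->
<script src="/js/my-nifty-script-1.js"></script>

<!-- New release: a new resource name, so long-lived caches can't serve a stale copy -->
<script src="/js/my-nifty-script-2.js"></script>
```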
Since your HTML presumably changes regularly, you'll probably want short expirations on it, but of course that depends entirely on your content.
Conclusion
It depends. If you don't want to combine your script with your local copy of jQuery UI, you're probably better off using Google's CDN for jQuery UI. If you're happy to combine them, you'll want to do real-world experiments either way to make your own decision. It's entirely possible that other factors wash it all out and it doesn't really matter much. But if you haven't already, check out the Yahoo and Google speed tips pages: