Serve jQuery UI from the Google CDN or as a local copy?

While it’s better to serve jQuery from the Google CDN, jQuery UI is a different beast. My locally customized build weighs 60 KB, while the one on the Google CDN is ~200 KB.

  • Are there any numbers on how many sites use the CDN (read: how many users already have it in their cache)? How can I determine whether it is better to serve it locally?
+6
5 answers

Late to the party, but considering gzip compression, you're mostly comparing a ~51k download from the Google CDN (197.14k of content becomes 51.30k on the wire) versus ~15.5k from your server (assuming your 60k file gzips at the same ratio as the full jQuery UI file, and that you have gzip compression enabled). That takes us into tricky territory:

  • pre-existing cached copies
  • latency
  • transmission time
  • number of requests
  • correct cache headers

And the big answer to your question is: it depends. Try each of them and measure the result in the real world.

Pre-existing Caching

If a first-time visitor to your site has previously been on a site using jQuery UI from the Google CDN, and it is still in their cache, that wins hands down. Full stop. No need to think about it any further. Google uses appropriate caching headers, and the browser doesn’t even have to send a request to the server, provided you refer to a fully-specified version of jQuery UI (none of the “any version 1.8.x will do” URLs). If you ask for jQuery UI 1.8.16, Google returns a resource that can be cached for up to a year; but if you ask for jQuery UI 1.8.x (i.e. any dot-rev of 1.8), that resource is only good for an hour.

But suppose they don’t have it cached...

Latency and Transmission Time

Latency is the time it takes to set up the connection to the server; transmission time is the time it takes to actually transfer the resource. On my DSL connection (I’m not very close to my exchange, so I typically get about 4 Mbit of download bandwidth; okay, but nothing like what Londoners get, or those lucky FiOS people in the States), in repeated experiments loading Google’s copy of jQuery UI I typically spend ~50 ms waiting for the connection (latency) and then 73 ms transferring the data (SSL would change this profile, but I’m assuming a non-SSL site here). Compare this with downloading Google’s copy of jQuery itself (89.52k gzipped to 31.74k), which has the same ~50 ms of latency followed by ~45 ms of download. Note how the download time is proportional to the size of the resource (31.74k / 51.30k = 0.61871345, and sure enough, 73 ms x 0.61871345 = 45 ms), but the latency is constant. Assuming your copy comes in at 15.5k, you could expect (for me) 50 ms of latency plus about 22 ms of actual download. All other things being equal, by serving your own 60k copy rather than Google’s 200k one, you would save me a whopping 51 ms. Let’s just say I wouldn’t notice the difference.
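The arithmetic in that paragraph can be reproduced directly: transfer time scales linearly with on-the-wire size, while latency is a fixed per-request cost. A quick sketch using the answer's own measured numbers (the 15.5k local size is the answer's assumption, not a measurement):

```javascript
// Measured above: jQuery UI from Google is 51.30k gzipped and took ~73 ms to
// transfer; jQuery itself is 31.74k gzipped. Latency was ~50 ms in both cases.
const uiKb = 51.30, uiMs = 73;
const jqKb = 31.74;

// Transfer time is proportional to size, so we can predict jQuery's transfer:
const predictedMs = Math.round(uiMs * (jqKb / uiKb)); // 45, matching the measurement

// The hypothetical 15.5k gzipped local copy:
const localMs = Math.round(uiMs * (15.5 / uiKb));     // ~22 ms of transfer
const savedMs = Math.round(50 + uiMs) - (50 + localMs); // ~51 ms total saving

console.log(predictedMs, localMs, savedMs);
```

The point the model makes visible: the 50 ms latency term appears on both sides and cancels out, so the only saving is the difference in transfer time.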

But all other things aren’t equal. Google’s CDN is highly optimized, geo-located, and very fast. Compare, for example, downloading jQuery from Heroku.com. I chose them because they’re smart people running a significant hosting business (currently on the AWS stack), so you’d expect them to have at least spent some time optimizing their delivery of static content; and it happens that they use a local copy of jQuery on their site; and they’re in the US (you’ll see why that matters in a moment). If I download jQuery from them (amazingly, they don’t seem to be gzipping it!), the latency is consistently in the 135 ms range (with the occasional outlier). That’s 2.7 times the latency of Google’s CDN (and my bandwidth from them is about half the speed as well; perhaps they only use AWS instances in the US, and since I’m in the UK, I’m further away from them).

The point being that latency can easily wash out any benefit you get from a smaller file size.

Number of Requests

If you have any JavaScript files you’re going to host locally, your users will still have to fetch those. Say you have 100k of site-specific script. If you use the Google CDN, your users have to fetch 200k of jQuery UI from Google and 100k of your script from you. The browser may process those requests in parallel (barring use of async or defer on your script tags, the browser must execute the scripts in strict document order, but that doesn’t mean it can’t download them in parallel). Or it may well not.

As we’ve already established that for non-mobile users, at these sizes, the actual data transfer time doesn’t matter all that much, you may well find that taking your local jQuery UI file and combining it with your own script, requiring only one download rather than two, works out more efficient despite the goodness of the Google CDN.

Hence the old rule of thumb: “At most one HTML file, one CSS file, and one JavaScript file.” Minimizing HTTP requests is a Good Thing™. Similarly, if you can use sprites rather than individual images where possible, that helps keep image requests down.
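The trade-off between "one combined local file" and "CDN file plus your own script" can be sketched with the same back-of-envelope model as before. All the sizes and rates here are illustrative assumptions drawn from the numbers above (in particular, the guess that 100k of custom script gzips to ~31k):

```javascript
// Simple model: each fetch pays a fixed latency; transfer time is proportional
// to on-the-wire size. These figures echo the measurements quoted above.
const LATENCY_MS = 50;
const MS_PER_KB = 73 / 51.3; // ~73 ms to transfer 51.3k

const transferMs = (kb) => kb * MS_PER_KB;

// Option A: jQuery UI from the CDN (51.3k gzipped) in parallel with your own
// script (assume 100k gzips to ~31k). With parallel fetches, the slower
// transfer dominates.
const optionA = LATENCY_MS + Math.max(transferMs(51.3), transferMs(31));

// Option B: one combined local file (15.5k trimmed jQuery UI + 31k script),
// i.e. one request instead of two.
const optionB = LATENCY_MS + transferMs(15.5 + 31);

console.log(Math.round(optionA), Math.round(optionB));
```

Under these (crude) assumptions the combined file edges ahead, which is exactly the answer's point: at these sizes, request count and latency matter more than raw bytes. Real-world measurement can easily flip the result.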

Correct Cache Headers

If you’re hosting your own script, you’ll want to be absolutely sure it’s cacheable, which means paying attention to your cache headers. The Google CDN basically doesn’t trust HTTP/1.0 caches (it sets the Expires header to the current date/time), but does trust HTTP/1.1 caches (the vast majority) because it sends a max-age header (of a year, for fully-specified resources). I’m guessing they have a reason for that; you might consider doing the same.

Since you’ll sometimes want to change your scripts, you’ll want to put a version number on them, e.g. “my-nifty-script-1.js”, then “my-nifty-script-2.js”, etc. That way you can set long max-age headers, but know that when you update your script, your users will still get the new one. (This applies to CSS files too.) Do not use the query string for versioning; put the number actually in the resource name.
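That policy is mechanical enough to express in code. A minimal sketch (the function name and the filename pattern are mine, chosen to match the "my-nifty-script-2.js" example; adapt the regex to your own naming scheme):

```javascript
// Versioned assets can be cached for a year, because any change produces a
// new filename (and therefore a new URL); unversioned ones get a short
// lifetime so updates are picked up quickly.
const ONE_YEAR = 365 * 24 * 60 * 60; // seconds

function cacheControlFor(filename) {
  // Matches names ending in "-<number>.js" or "-<number>.min.css" etc.
  const versioned = /-\d+(\.min)?\.(js|css)$/.test(filename);
  return versioned
    ? `public, max-age=${ONE_YEAR}` // safe: an update means a new filename
    : 'public, max-age=3600';       // short lifetime for unversioned files
}

console.log(cacheControlFor('my-nifty-script-2.js')); // public, max-age=31536000
console.log(cacheControlFor('site.css'));             // public, max-age=3600
```

Whatever server or framework you use, the shape is the same: long max-age for immutable, versioned URLs; short max-age (or revalidation) for everything else.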

Since your HTML presumably changes regularly, you’ll probably want a short expiry on it, but of course that depends entirely on your content.

Conclusion

It depends. If you don’t want to combine your script with a local copy of jQuery UI, you’re probably better off using Google’s CDN for jQuery UI. If you’re happy to combine them, you’ll want to do real-world experiments either way to make your own decision. It’s entirely possible that other factors wash it all out and it doesn’t really matter. If you haven’t already, it’s worth checking out Yahoo’s and Google’s page-speed tips pages.

+9

jQuery UI on the Google CDN weighs 51 KB (minified and gzipped):

https://ajax.googleapis.com/ajax/libs/jqueryui/1.8.16/jquery-ui.min.js

HTML5 Boilerplate uses fallback loading for jQuery:

 <!-- Grab Google CDN jQuery, with a protocol-relative URL; fall back to local if necessary -->
 <script src="//ajax.googleapis.com/ajax/libs/jquery/1.5.1/jquery.js"></script>
 <script>window.jQuery || document.write('<script src="js/libs/jquery-1.5.1.min.js">\x3C/script>')</script>

You can apply the same pattern to jQuery UI:

 <script src="//ajax.googleapis.com/ajax/libs/jqueryui/1.8.16/jquery-ui.min.js"></script>
 <script>(window.jQuery && window.jQuery.ui) || document.write('<script src="js/jquery-ui-1.8.16.min.js">\x3C/script>')</script>

You load the CDN version and then check whether jQuery UI actually exists (you can’t guarantee 100% uptime from any CDN). If it doesn’t, you fall back to the local copy. That way, if it’s already in the cache, you’re good to go; and if it isn’t, and the CDN is unreachable for whatever reason, you’re still fine with your local copy. Fault tolerance.

+9

I think the size comparisons are missing the point of a CDN. By serving a copy of jQuery (or another library) from a publicly available, commonly used CDN, many users will have a cached copy of the library before they ever arrive at your site. When they do, the effective download size is 0 KB, compared to 60 KB from your server.

The Google CDN is the most widely used, so you’ll have the best chance of a cache hit if you reference it.

For numbers comparing the different CDNs, see this article.

For what it’s worth, the minified version of Google’s jQuery copy is much smaller than the size you quote.

+2

I would say it depends on how important the load on your server is. It makes little difference to users whether they download the file from your server or from Google’s; these days, 140 KB is a small enough amount of bandwidth to be easy to ignore on the user’s side.

Now, the real question is whether you have made changes to jQuery UI. If so, you must serve your own copy. If not, it’s fine to serve Google’s, because in the end you’re trying to reduce the load on your side.

And besides, caching happens not only in the user’s browser but also on the content-distribution nodes they access. So it’s safe to say Google’s copy is almost certainly cached somewhere along the way.

+1

With sizes this small, it’s the number of HTTP requests made by a first-time visitor to your site that matters.

If, for example, your site has script combining and minification set up, so that all the script for a first-time visitor is either a single request or included in the HTML itself, using your local copy is better: even a cached copy of jQuery UI isn’t faster than the whole site’s script arriving immediately (the cached copy may still trigger a request to check whether it has been modified).

If you don’t have a good combine-and-minify setup (so you’re going to serve jQuery UI separately anyway, whether from your site or somewhere else), use the external cache when you can.

+1

Source: https://habr.com/ru/post/899358/
