Is there a significant benefit to keeping as few script files as possible in assets?

Given that:

  • The asset pipeline is fairly complex (coming from ASP.NET MVC, this one is much more involved): How do I link a CoffeeScript file to a view?
  • All js files are downloaded on every page, and every jQuery "ready" handler in every file gets executed (illustrated in the sketch after this list)
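
To make that concern concrete, here is a small illustration; the file names, selectors and paths are invented for the example. With the default require_tree . manifest, every file ends up in the single bundle, so every ready handler runs on every page whether or not the markup it targets is present:

    // app/assets/javascripts/users.js (hypothetical file name)
    $(function() {
      // this ready handler runs on every page, including admin pages
      $("#signup_form").on("submit", function() { /* ... */ });
    });

    // app/assets/javascripts/admin_reports.js (hypothetical file name)
    $(function() {
      // and this one runs on every public page as well, exposing an admin path
      $("#report_table").load("/admin/reports/latest");
    });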

Does it make sense to break the application's javascript into "areas": one js file for the user/public area of the site, one file for the site administration area, and so on? Scripts coming from external sources (jquery, scripts from other sites in the organization) are not really at issue here.

The aspects I see:

  • Performance: downloading a single 80 KB file is much faster than downloading four 20 KB files because of TCP/IP overhead (source). In production all of the public js will be compiled into one file and served with every request, but I can see the "admin" code ending up in that same file.

  • Security: there may be things in the script files that I don't want to expose to unauthorized users. Of course, nothing in a script is truly protected, but it would be nice to minimize the exposure of things like paths to controller actions that perform database maintenance.

  • Simplicity in development: since the "ready" event handler fires from every file anyway, it makes sense to me to just keep it all in one file, so I don't have to open every file to see what runs where. The js for sub-areas (e.g. admin) would naturally live in a separate file that is not served with the public assets (one possible setup is sketched right after this list).
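
One way to set up such a split in the Rails asset pipeline, as a sketch only (the manifest names, the public/ and admin/ subdirectories, and the admin layout are assumptions, not something from the question), is to keep two Sprockets manifests and include the admin one only from the admin layout:

    // app/assets/javascripts/application.js -- public bundle (hypothetical layout)
    //= require jquery
    //= require jquery_ujs
    //= require_tree ./public

    // app/assets/javascripts/admin.js -- admin-only bundle
    //= require jquery
    //= require jquery_ujs
    //= require_tree ./admin

The admin layout would then call javascript_include_tag "admin", and admin.js would have to be added to config.assets.precompile (for example config.assets.precompile += %w( admin.js ) in the production config) so it is compiled as its own file.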

Related: Put javascript in a single .js file or split it into multiple .js files?

1 answer

Separating the admin javascript from the rest of the javascript sounds like a good idea.

That performance article is definitely right. I have not seen a situation where caching lots of javascript in a single file was not a win. Even when no javascript is shared between the areas, downloading one large file up front is faster than several separate downloads.

If you have many libraries plus page-specific javascript files, it may still make sense to cache all of the libraries together; just make sure you measure the cache hit rate for an average user while testing.

There are patterns you can use to minimize the downsides of one combined download. What has worked well for me is splitting sections into modules and only initializing those modules based on feature detection.

For example, if I have a UserInfo module:

    !function(ns) {
      ns.init = function() {
        ns.setup_login();
        ns.show_user_info();
      };

      ns.setup_login = function() { /* blah-blah */ };

      // ... etc.
    }.call(this, this.UserInfo = {});

And if I have html on the login page:

    <div id="user_login">
      <div class="user_info"></div>
      <div class="login_links"></div>
    </div>

I can write an initializer like:

    $(function() {
      $("#user_login").each(function() {
        UserInfo.init();
      });
    });

Without this pattern I would have to wire up setup_login and show_user_info separately on page load. This approach lets me initialize several different modules based on which features I find on the page, and if you group those modules according to their dependencies, that usually reduces it even further. (I could call User.init and then UserInfo.init, since I can assume that UserInfo depends on User.)
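
A rough sketch of that grouping under the same assumptions (Comments is a purely hypothetical second module, added only to show initializing several modules from one place):

    $(function() {
      // Initialize only the modules whose markup is present on this page,
      // grouped so that the base module is set up before the modules that depend on it.
      $("#user_login").each(function() {
        User.init();      // base module
        UserInfo.init();  // depends on User
      });

      $("#comments").each(function() {
        Comments.init();  // hypothetical second module, purely illustrative
      });
    });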
