Maximize Parallel Downloads for a Website

I have read articles about speeding up websites by serving static content from a cookieless domain. We have an ASP.NET site with links to images / CSS / JS, for example:

<script type="text/javascript" src="/js/something.js"></script>

I tested the static content filter from one of these articles, and it seems to work fine for cases like the one above. However, we also have many CSS files with rules such as:

background-image: url(/images/something.jpg)

The static content filter will not handle these cases. Since many of our image URLs live inside CSS files, is there a good way around this?

When the project runs on our local development machines, we obviously want all files served from localhost, so we cannot simply hardcode these locations.

Is there another solution or is there something we can change to make this work?

1 answer

You will need to modify your CSS files. You may have to create a "deploy" script that rewrites the files on the fly as they are copied from your development machine to the server, but there is no getting around the fact that the full path must be hard-coded into the CSS that the server delivers.

(Unless, of course, you load all your images with JavaScript and then adjust your styles from it as well, an approach that has its own problems.)
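The JavaScript workaround mentioned above could look roughly like this. It is only a sketch under stated assumptions: the helper name `staticUrl` and the host `http://static.example.com` are invented for illustration:

```javascript
// Runtime alternative: build absolute image URLs in JavaScript instead of
// hard-coding them in CSS, so the host can differ per environment.
function staticUrl(relativePath, hostname) {
  const host = hostname === "localhost"
    ? ""                              // development: serve from same origin
    : "http://static.example.com";    // production: hypothetical cookieless host
  return host + relativePath;
}

// In the browser, one would then assign the result to element styles, e.g.:
//   el.style.backgroundImage =
//     `url(${staticUrl("/images/something.jpg", location.hostname)})`;
```

The drawbacks the answer alludes to apply here: images set this way do not appear until the script runs, and every styled element needs script support instead of plain CSS.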


Source: https://habr.com/ru/post/1713395/

