What is the best way to efficiently cycle through a large number of fixed-position images in WebKit?

I am currently working on a small site for my family. One of the things I wanted to do was include a basic "making of" stop-motion video. I could have put it together and uploaded it to Vimeo or the like, but I thought it was a great opportunity to use nothing but HTML, CSS, and JavaScript.

I have everything styled, my JS works, etc., except that it performs terribly in Chrome and Safari. Interestingly, it works best in Firefox, and I'm not supporting IE yet. I was hoping for 8 to 12 frames per second, with music, which I haven't bothered attempting because of this poor performance. Currently I get about 3 fps in Firefox (workable, but not what I was hoping for), but in Chrome and Safari I get roughly 0.6795 fps.

When I run the Chrome profiler, I get the following (relevant) output:

    99.96%  99.96%  (program)
     0.03%   0.03%  (garbage collector)
     0.01%   0.01%  script.js:5: nextSlide

I have never used the profiler before, but I believe this tells me it isn't my JS that is hurting performance.

I have published a test page that demonstrates the performance difference, which you can visit in Chrome and Firefox.

I also found that this seems to be related to the particular images I'm using. Cycling through other, simpler images seems to work just fine in both Chrome and Firefox, though Chrome is still a bit hungrier than Firefox.

As further evidence for this conclusion, a version that is completely unacceptable quality-wise is demonstrated here, after running the images through convert -compress JPEG -quality 1 . It performs much better, but of course the quality is terrible.

I ran these test pages in Chrome ( 16.0.912.63 ), Safari ( 5.1.2 (6534.52.7) ), WebKit nightly ( Version 5.1.2 (6534.52.7, r102985) ), and Mobile Safari ( latest as of 2011/12/28 ), and only Mobile Safari performs comparably to Firefox. The desktop browsers were tested on a MacBook Pro:

    2.7 GHz Intel Core i7
    8 GB 1333 MHz DDR3

Interestingly, Mobile Safari on the iPad 2 also matches Firefox when rendering the test page. Although Mobile Safari is based on WebKit, in this case it behaves completely differently.

Lowering the setTimeout interval from 244 to 144 also does nothing. I arrived at 244 fairly arbitrarily at this point, as it became clear that the time between displayed frames bore almost no relation to the interval requested in the call. This makes me think each browser is already showing the slides as fast as it can.

So my question is: how can I make this work well in WebKit?

+6
8 answers

You can debug page performance in Chrome using the Timeline tab of the Chrome Developer Tools. The problem with your script is that your repaint cycle is just too expensive: it currently takes 1.35 s to repaint each frame.

[Screenshot: Chrome Developer Tools Timeline showing each frame taking 1.35 s to repaint]

The poor performance has nothing to do with the quality of the JPEG images (although image quality also affects page rendering time). The problem is that you are updating the z-index, which causes Chrome to repaint all the images, not just the next frame (you have an O(n) image slideshow!).

Browsers try to do the minimum possible work in response to a change; for example, changing an element's color will only trigger a repaint of that element.

Changing an element's z-index property is basically the same as removing the node from the tree and adding another node in its place. This results in layout and repaint of the element, its children, and possibly its siblings. I suspect that in Chrome the siblings are repainted as well, which would explain the terrible performance.

The way to fix this problem is to update the opacity property instead of z-index . Unlike z-index , opacity does not change the DOM tree; it only tells the renderer to ignore the element, which remains "physically" present in the DOM. This means that only one element is repainted, rather than all the siblings and children.

These simple CSS changes should do the trick:

    .making-of .slide#slide-top {
      opacity: 1;
      /* z-index: 5000; */
    }

    .making-of .slide {
      position: fixed;
      /* z-index: 4000; */
      opacity: 0;
      ...
    }

As a result, the repaint drops from 1.35 s to 1 ms:

[Screenshot: Timeline showing the repaint now taking about 1 ms]

EDIT:

Here is a jsfiddle using the opacity solution; I also added CSS3 transitions (just for fun!):

http://jsfiddle.net/KN7Y5/3/

Learn more about how browser rendering works:

http://www.html5rocks.com/en/tutorials/internals/howbrowserswork/

+12

I looked at the code on your site and found two things that are limiting your speed.

1) In your JavaScript, you have a timeout of about 1/4 second (244 milliseconds). This means your best possible frame rate is about 4 fps. This can be fixed simply by reducing the delay to match the frame rate you actually want. I see that your latest edit touches on this point, but I didn't want to leave it out, since it is ultimately crucial to achieving the higher frame rates you are after.
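The relationship between the timer delay and the achievable frame rate can be sketched as follows (a minimal illustration; the function names are mine, not from the original page):

```javascript
// With a fixed setTimeout delay between frames, the frame rate is capped
// at 1000 / delay, no matter how quickly the browser can repaint.
function maxFps(delayMs) {
  return 1000 / delayMs;
}

// Inverse: the delay needed to hit a target frame rate.
function delayForFps(targetFps) {
  return Math.round(1000 / targetFps);
}

console.log(maxFps(244).toFixed(1)); // "4.1" -- the cap with the original 244 ms delay
console.log(delayForFps(12));        // 83 -- delay needed for ~12 fps
```

Per-frame render cost only lowers the rate further, which is why shrinking the delay alone wasn't enough in WebKit.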

2) You are using z-index to control which image is visible. In general, z-index handling lets you order objects of different sizes and positions, so you can control which object shows wherever two or more overlap. In your case all the objects overlap completely, and the z-index approach would work fine except for one serious problem: browsers do not optimize z-index handling for this case, so they actually process every image on every frame. I verified this by creating a modified version of your demo with twice as many images: the FPS was cut almost in half (in other words, it took 4 times as long to display the entire set).

I hacked together an alternative approach that achieves a much higher FPS (60 or more) in both Chrome and Firefox. The gist is that I used the display property instead of manipulating z-index:

    .making-of .slide#other {
      display: none;
    }

    .making-of .slide#slide-top {
      display: inline;
    }

and JavaScript:

    function nextSlide() {
      ...
      topSlide.id = 'other';
      nextTopSlide.id = 'slide-top';
      ...
      setTimeout(nextSlide, 1);
      ...
    }

I also made some changes to the HTML, namely adding id="other" to the tag of each image.
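The id-swap logic can be sketched against plain objects (variable and function names here are illustrative, not from the actual site); in the real page these stand-ins would be the img elements:

```javascript
// Advance the slideshow by moving the 'slide-top' id to the next slide.
// With the CSS rules above, only the element with id 'slide-top' is
// displayed, so each step touches two elements instead of the whole stack.
function advance(slides, topIdx) {
  slides[topIdx].id = 'other';               // hide the current frame
  const next = (topIdx + 1) % slides.length; // wrap at the last frame
  slides[next].id = 'slide-top';             // show the next frame
  return next;
}

// Stand-ins for the real DOM elements:
const slides = [{ id: 'slide-top' }, { id: 'other' }, { id: 'other' }];
let top = 0;
top = advance(slides, top);
console.log(slides.map(s => s.id)); // [ 'other', 'slide-top', 'other' ]
```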


So why is WebKit so slow? As pointed out in other comments, the ultra-slow performance you are seeing in WebKit appears to be Mac-specific. My best guess is that the Mac version of WebKit does not actually use the turbo-charged libjpeg version (even though it is listed in the credits). In your test, JPEG decompression could very well be the gating factor if every image really is decompressed on every frame (as it probably is). Benchmarks of libjpeg-turbo show roughly a 5-fold improvement in decompression speed, which roughly matches the difference you see between Firefox and Chrome (3 fps versus 0.6795 fps).

For more on libjpeg-turbo and how this hypothesis explains some of your other results, see my other answer .

+5

In my experience the key is to keep as few images as possible in the DOM and in your JavaScript arrays, so don't load everything at once; keep it to a minimum. Also make sure you remove DOM elements that have already been shown and drop the JavaScript references to their image objects, so the garbage collector can reclaim them. This will improve performance.
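A minimal sketch of that idea (the helper name is mine, assumed for illustration): keep only a small window of frames "live" at any time, and release everything outside it.

```javascript
// Return the indices of frames that should stay loaded: the current
// frame plus `radius` frames on either side (clamped at the ends).
function liveWindow(total, current, radius) {
  const start = Math.max(0, current - radius);
  const end = Math.min(total - 1, current + radius);
  const keep = [];
  for (let i = start; i <= end; i++) keep.push(i);
  return keep;
}

// In the page, any frame outside this window would have its <img> element
// removed from the DOM and its array slot set to null.
console.log(liveWindow(150, 10, 2)); // [ 8, 9, 10, 11, 12 ]
console.log(liveWindow(150, 0, 2));  // [ 0, 1, 2 ]
```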

+1

Random guess: GPU acceleration. It depends on the device, and there is a big race among browsers on that front right now.

You can try a more recent Chrome, such as Canary ( http://tools.google.com/dlpage/chromesxs , now at 18.x), to get another data point.

about:version in Chrome will show you the WebKit version.

Also, have you tried existing slideshow solutions such as http://jquery.malsup.com/cycle/ ? I wonder whether the z-index juggling is the bottleneck here... perhaps keep only 1-2 images displayable at a time (everything else display: none). Again, just a guess.

+1

There has been some relatively recent work on the JPEG compression library used in many applications, including browsers such as Firefox and Chrome. The new library achieves a significant speed increase by using the special multimedia instructions available in modern processors. It may be that your version of Chrome simply does not use the new library.

Your question asks for a way to fix your images, but that shouldn't be necessary; after all, some other browsers work fine. So the fix should come from the browser (and browsers are improving constantly).

You said you improved Chrome's speed by drastically lowering the quality and complexity of your images. This can be explained by the fact that, for areas of very low detail, the JPEG decompression algorithm can skip most of the work it normally has to do. If an 8x8-pixel tile reduces to a single color, decompressing that tile is a very simple matter.
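The flat-tile shortcut can be illustrated with the transform JPEG uses internally (a toy 1-D DCT here, just to show the effect): a constant block has a single nonzero coefficient, so nearly all of the inverse-transform work can be skipped.

```javascript
// Naive 1-D DCT-II over an 8-sample row (JPEG applies this in 2-D
// to 8x8 tiles of pixels).
function dct1d(x) {
  const N = x.length;
  return x.map((_, k) => {
    let sum = 0;
    for (let n = 0; n < N; n++) {
      sum += x[n] * Math.cos((Math.PI / N) * (n + 0.5) * k);
    }
    return sum;
  });
}

// A perfectly flat row of pixels...
const flat = new Array(8).fill(128);
const coeffs = dct1d(flat);
// ...has all its energy in the DC term; every AC coefficient is ~0
// (abs() just cleans up floating-point noise for display).
console.log(coeffs.map(c => Math.abs(Math.round(c)))); // [ 1024, 0, 0, 0, 0, 0, 0, 0 ]
```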

This Wikipedia article has additional information and sources. It says Chrome has had the new library since version 11. You can enter chrome://credits in your location bar and check whether it mentions libjpeg-turbo . libjpeg is the original library; libjpeg-turbo is the optimized version.

Another possibility is that libjpeg-turbo is not supported by WebKit on the Mac (though I don't know that for sure). There is a clue as to why that might be the case here .

P.S. You might get better decompression speed by using a different format such as PNG (though your compression ratios will likely suffer). On the other hand, perhaps you should use HTML5 video , perhaps with the WebM format.

0

The best way to get better performance out of graphics is to compress them as far as you can while keeping the quality acceptable.

If you are using Linux, I have previously used the compression utility at http://linuxpoison.blogspot.com/2011/01/utility-to-optimize-compress-jpeg-files.html . It does not hurt quality the way the ImageMagick example did.

Also, http://trimage.org/ supports JPG and would be my first recommendation!

If you're on Windows, maybe something like this: http://www.trans4mind.com/personal_development/convertImage/index.html

I have not tested the Windows tool, and I'm not even sure it supports batch processing.

Hope this helps!

P.S. For PNG I sometimes use http://pmt.sourceforge.net/pngcrush/ , with or without http://trimage.org/ .

0

I tested it in Opera, and it ran damned slowly there too. I noticed that Opera queued 150+ images for download; maybe it's worth loading only ~20 at a time?
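Chunking the image list into batches, as suggested above, could look roughly like this (a sketch; the function name and URL pattern are assumed):

```javascript
// Split a list of image URLs into batches of at most `size` items, so
// each batch can be requested only after the previous one has loaded.
function batches(urls, size) {
  const out = [];
  for (let i = 0; i < urls.length; i += size) {
    out.push(urls.slice(i, i + size));
  }
  return out;
}

// 150 queued images in batches of 20 -> 8 batches, the last holding
// the remaining 10.
const urls = Array.from({ length: 150 }, (_, i) => `frame-${i}.jpg`);
const groups = batches(urls, 20);
console.log(groups.length, groups[7].length); // 8 10
```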

0

An alternative approach would be to produce this content as video. Video is ideal for this kind of thing, can easily carry audio and subtitles, and you can even access each pixel of each frame from JavaScript if you want to get fancy.

0

Source: https://habr.com/ru/post/903864/

