Is it possible to compress a JPEG file with zip libraries?

As far as I know, JPEG has a better compression ratio than other image formats, and since it is already so well compressed, zipping a JPEG file should not shrink it any further. Please help me understand this. I create my JPEGs as follows:

    // Find the GDI+ JPEG encoder.
    ImageCodecInfo[] codecs = ImageCodecInfo.GetImageEncoders();
    ImageCodecInfo ici = null;
    foreach (ImageCodecInfo codec in codecs)
    {
        if (codec.MimeType == "image/jpeg")
            ici = codec;
    }

    // Encode the captured bitmap as JPEG with an explicit quality setting.
    EncoderParameters ep = new EncoderParameters();
    ep.Param[0] = new EncoderParameter(System.Drawing.Imaging.Encoder.Quality, _quality);
    using (MemoryStream ms = new MemoryStream())
    {
        Bitmap capture = GetImage();
        capture.Save(ms, ici, ep);
    }

And I zipped them with SharpZipLib. On average each JPEG is 130 KB, and after zipping each file is compressed down to 70 KB. How is this possible? There are only two explanations I can imagine:

1- Zip libraries can compress a JPEG file with a significant compression ratio.

2- My JPEG files are not created correctly, and it is possible to create better JPEGs (compressed well enough that zip libraries can no longer shrink them).

Does anyone know which it is? If it is possible to create better JPEGs, please help me with that.

Edit:

This is my zip code for compressing the JPEGs:

    using System;
    using System.IO;
    using ICSharpCode.SharpZipLib.Zip;

    void AddNewEntry(MemoryStream stream, string pass, string zipFilePath, string entryName)
    {
        // Open the existing archive, or create it on first use.
        ZipFile zf = File.Exists(zipFilePath)
            ? new ZipFile(zipFilePath)
            : ZipFile.Create(zipFilePath);

        if (!String.IsNullOrEmpty(pass))
            zf.Password = pass;

        StaticDataSource sds = new StaticDataSource(stream);
        zf.BeginUpdate();
        zf.Add(sds, entryName);
        zf.CommitUpdate();
        zf.IsStreamOwner = true;
        zf.Close();
    }

    public class StaticDataSource : IStaticDataSource
    {
        public Stream Stream { get; set; }

        public StaticDataSource(Stream stream)
        {
            Stream = stream;
            Stream.Position = 0;
        }

        // SharpZipLib pulls the entry's bytes through this callback.
        public Stream GetSource()
        {
            Stream.Position = 0;
            return Stream;
        }
    }
5 answers

As most people have already noted, you cannot compress such already-compressed files much further. Some people work hard on JPEG recompression (recompression = partially decoding an already compressed file and then re-encoding that data with a custom, stronger model and entropy coder; recompression is usually bit-exact, i.e. the original file can be restored identically). Even with such state-of-the-art recompression techniques, I have only seen around a 25% improvement. PackJPG is one of them, and there are other such compressors you can look at. As you can see, even a top-ranked compressor barely reaches 25%, and it is very hard to do.

Taking these facts into consideration, ZIP (actually deflate) cannot significantly improve the compression: it is very old and inefficient compared with top-ranked modern compressors. I believe there are two possible explanations for what you observe:

  • You are accidentally adding some extra data to the JPEG stream (possibly appending something after the JPEG data).
  • .NET writes a lot of redundant data into the JFIF file, for example large EXIF data.

To diagnose this, you can use a JFIF dump tool to see what is inside the JFIF container; a minimal dumper is sketched below. Alternatively, you can try your JPEG files with PackJPG.
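For example, here is a minimal sketch of such a dump in plain .NET (no external libraries; the command-line path argument is a placeholder). It walks the JPEG marker structure and prints each segment's size, so oversized APPn metadata (EXIF, thumbnails) stands out immediately. It does not handle every corner of the spec (multi-scan files, markers inside the scan), but it is enough to see where the bytes go:

    using System;
    using System.IO;

    class JfifDump
    {
        static void Main(string[] args)
        {
            using var fs = File.OpenRead(args[0]);          // path to a .jpg file

            if (fs.ReadByte() != 0xFF || fs.ReadByte() != 0xD8)
                throw new InvalidDataException("Not a JPEG (missing SOI marker).");

            while (fs.Position < fs.Length)
            {
                if (fs.ReadByte() != 0xFF) continue;        // resync on the 0xFF marker prefix
                int marker;
                do { marker = fs.ReadByte(); }              // skip 0xFF fill bytes
                while (marker == 0xFF);

                if (marker < 0 || marker == 0xD9) break;    // EOF or EOI (end of image)
                if (marker == 0xDA)                         // SOS: entropy-coded scan follows
                {
                    Console.WriteLine($"SOS: ~{fs.Length - fs.Position:N0} bytes of compressed scan data");
                    break;
                }

                // Every other segment carries a 2-byte big-endian length
                // that includes the length field itself.
                int len = (fs.ReadByte() << 8) | fs.ReadByte();
                Console.WriteLine($"Marker 0xFF{marker:X2}: {len:N0} bytes");
                fs.Seek(len - 2, SeekOrigin.Current);
            }
        }
    }

If one APPn segment turns out to be tens of kilobytes, that segment, not the image data, is what ZIP is compressing.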


The JPEG compression algorithm has two stages: a “lossy” stage, where visual information that should be imperceptible to the human eye is removed, and a “lossless” stage, where the remaining data is compressed using a technique called Huffman coding. After Huffman coding, further lossless compression methods (such as ZIP) will not significantly reduce the size of the image file.

However, if you were to archive multiple copies of the same small image together, the ZIP ("DEFLATE") algorithm can recognize the repetition and exploit it to reduce the total file size to less than the sum of the individual files' sizes. This may be what you are seeing in your experiment.

Put very simply, lossless methods such as Huffman coding (part of JPEG) and DEFLATE (used in ZIP) try to detect repeated patterns in the source data and then represent those repeated patterns with shorter codes.

In short, you cannot really improve JPEG by adding another lossless compression step.
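If you want to verify this on your own files, here is a small self-contained sketch using the standard System.IO.Compression namespace (the command-line path argument is a placeholder). It deflates a JPEG in memory and reports the saving:

    using System;
    using System.IO;
    using System.IO.Compression;

    class DeflateCheck
    {
        static void Main(string[] args)
        {
            byte[] original = File.ReadAllBytes(args[0]);   // e.g. one of your JPEGs

            using var buffer = new MemoryStream();
            using (var deflate = new DeflateStream(buffer, CompressionLevel.Optimal, leaveOpen: true))
                deflate.Write(original, 0, original.Length);

            Console.WriteLine($"Original: {original.Length:N0} bytes");
            Console.WriteLine($"Deflated: {buffer.Length:N0} bytes");
            // For a well-formed JPEG the difference should be a few percent
            // at most; a 130 KB -> 70 KB drop points at uncompressed extra
            // data riding along in the file.
        }
    }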


No one has mentioned the fact that JPEG is just a container. There are many compression methods that can be used with this file format (JFIF, JPEG-2000, JPEG-LS, etc.), so compressing the file further can produce different results depending on its contents. In addition, some cameras store a huge amount of EXIF data (sometimes around 20 KB), and this may explain the difference you are seeing.
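If you suspect oversized EXIF data, you can measure it directly from C#. Here is a minimal sketch using System.Drawing, the same GDI+ API the question already uses (on modern .NET this requires the System.Drawing.Common package and Windows; the file path argument is a placeholder):

    using System;
    using System.Drawing;

    class ExifSize
    {
        static void Main(string[] args)
        {
            using var img = Image.FromFile(args[0]);   // path to a .jpg file
            long total = 0;
            foreach (var prop in img.PropertyItems)    // each EXIF/metadata tag
            {
                total += prop.Len;
                Console.WriteLine($"Tag 0x{prop.Id:X4}: {prop.Len:N0} bytes");
            }
            Console.WriteLine($"Total metadata payload: {total:N0} bytes");
        }
    }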


You can always try to compress something with zlib; you just are not always going to get a size reduction.
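To see both outcomes side by side, here is a small self-contained sketch (buffer sizes are arbitrary). It deflates 100 KB of random bytes, which behave much like JPEG scan data, and 100 KB of repetitive text. The first barely shrinks and can even grow slightly; the second collapses to almost nothing:

    using System;
    using System.IO;
    using System.IO.Compression;

    class Compressibility
    {
        static long Deflated(byte[] data)
        {
            using var ms = new MemoryStream();
            using (var ds = new DeflateStream(ms, CompressionLevel.Optimal, leaveOpen: true))
                ds.Write(data, 0, data.Length);
            return ms.Length;
        }

        static void Main()
        {
            var random = new byte[100_000];
            new Random(42).NextBytes(random);            // high-entropy, like JPEG scan data

            var repetitive = new byte[100_000];          // low-entropy filler
            for (int i = 0; i < repetitive.Length; i++)
                repetitive[i] = (byte)'a';

            Console.WriteLine($"Random:     100,000 -> {Deflated(random):N0} bytes");
            Console.WriteLine($"Repetitive: 100,000 -> {Deflated(repetitive):N0} bytes");
        }
    }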

Usually, compressing an entire JPEG file yields a small saving in bytes, because the compressor squeezes the JPEG header (including any text comments or EXIF data).

That would not fully account for the 40 KB of compression you are seeing unless you have a lot of header data, or your JPEG data somehow ends up with a lot of repeated values inside.


Zipping a JPEG reduces its size because:

  • the EXIF data is not compressed;
  • JPEG is optimized for photographs, not for flat, GIF-like graphics;
  • archiving produces a single data stream, which lets the compressor find patterns across multiple files and removes the requirement that each file occupy whole blocks on disk. The last point alone can save about 4 KB per compressed file.

The main problem with compressing already-compressed images is the extra work (both human and CPU) required to pack and unpack them, which may not be worth the effort (unless you have millions of images, or are building some automated image service).

The best approach is to minimize the size of the original file itself and forget about zip. There are many free libraries and applications to help with this. For example, ImageOptim combines several libraries in one (OptiPNG, PNGCrush, Zopfli, AdvPNG, Gifsicle, PNGOUT) to squeeze out every aggressive trick for minimizing size. Great for PNG; I have not tried it much with JPEG.
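For JPEG specifically, jpegtran (shipped with libjpeg/mozjpeg) is a commonly used tool in the same spirit: it losslessly rebuilds the Huffman tables and can strip metadata. A sketch of driving it from C# (assumes jpegtran is installed and on the PATH; the file names are placeholders):

    using System.Diagnostics;

    class Optimize
    {
        static void Main()
        {
            // -optimize : rebuild optimal Huffman tables (lossless)
            // -copy none: drop EXIF/comment metadata
            var psi = new ProcessStartInfo
            {
                FileName = "jpegtran",
                Arguments = "-optimize -copy none -outfile out.jpg in.jpg",
                UseShellExecute = false,
            };
            using var proc = Process.Start(psi);
            proc.WaitForExit();
        }
    }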

Remember, though, that with any compression there is always a point of diminishing returns. It is up to you to decide whether a few extra bytes really matter in the end.


Source: https://habr.com/ru/post/1389877/
