Is there a way to compress and save images faster on Android?

Situation

I need to show an animation of 200-350 frames in my application. The images have a resolution of 500x300. If the user wants to share the animation, I need to convert it to video. For the conversion, I use the following ffmpeg command:

ffmpeg -y -r 1 -i /sdcard/videokit/pic00%d.jpg -i /sdcard/videokit/in.mp3 -strict experimental -ar 44100 -ac 2 -ab 256k -b 2097152 -ar 22050 -vcodec mpeg4 -b 2097152 -s 320x240 /sdcard/videokit/out.mp4 

To convert images to video, ffmpeg needs actual files on disk; it cannot take a Bitmap or byte[] directly.

Problem

Compressing Bitmap objects into image files is time-consuming: converting 210 images takes about 1 minute on an average device (HTC One M7). Converting the image files to mp4 takes about 15 seconds on the same device. All together, the user has to wait about 1.5 minutes.

What I tried

  • I changed the compression format from PNG to JPEG: the result of 1.5 minutes is achieved with JPEG compression (quality = 80), while PNG takes 2-2.5 minutes. Success. (A minimal sketch of the save step follows this list.)
  • I tried to find a way to pass a byte[] or a Bitmap to ffmpeg - no luck.
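
For reference, here is a minimal sketch of the JPEG save step described above, one file per frame (the class and method names are mine):

    import android.graphics.Bitmap;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;

    final class FrameSaver {
        // Compress one animation frame to JPEG at quality 80, the setting used above.
        static void saveFrameAsJpeg(Bitmap frame, File outFile) throws IOException {
            try (FileOutputStream fos = new FileOutputStream(outFile)) {
                frame.compress(Bitmap.CompressFormat.JPEG, 80, fos);
            }
        }
    }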

Question

  • Is there a way (a library, even a native one) to speed up the save process?
  • Is there a way to pass byte[] or Bitmap objects (I mean a PNG file decompressed into an Android Bitmap object) into ffmpeg's video creation routine?
  • Is there any other working library that will create an mp4 (or any format supported by the major social networks) from byte[] or Bitmap objects in about 30 seconds (for 200 frames)?
3 answers

There are two steps that slow us down: compressing the images to PNG/JPG and writing them to disk. Both can be skipped if we encode with the ffmpeg libraries directly instead of calling the ffmpeg command. (There are other improvements, like GPU encoding and multithreading, but they are much more complex.)

Some approaches to the code:

  • Use pure C/C++ with the Android NDK. FFmpeg will happily work there. But I assume this is not an option here.
  • Write the JNI bindings from scratch. I don't have much experience with this; I only know that JNI can bind Java with C/C++ libs.
  • Use some Java wrapper. Fortunately, I found javacpp-presets. (There are others, but this one is good enough and up to date.)

This library includes a good example, ported from the famous dranger FFmpeg tutorial, although it is a decoding demo rather than an encoding one.

Instead, we can try to write a muxer following ffmpeg's muxing.c example.

    import java.io.*;

    import org.bytedeco.javacpp.*;
    import static org.bytedeco.javacpp.avcodec.*;
    import static org.bytedeco.javacpp.avformat.*;
    import static org.bytedeco.javacpp.avutil.*;
    import static org.bytedeco.javacpp.swscale.*;

    public class Muxer {

        public class OutputStream {
            public AVStream Stream;
            public AVCodecContext Ctx;
            public AVFrame Frame;
            public SwsContext SwsCtx;

            public void setStream(AVStream s) { this.Stream = s; }
            public AVStream getStream() { return this.Stream; }
            public void setCodecCtx(AVCodecContext c) { this.Ctx = c; }
            public AVCodecContext getCodecCtx() { return this.Ctx; }
            public void setFrame(AVFrame f) { this.Frame = f; }
            public AVFrame getFrame() { return this.Frame; }

            public OutputStream() {
                Stream = null;
                Ctx = null;
                Frame = null;
                SwsCtx = null;
            }
        }

        public static void main(String[] args) throws IOException {
            Muxer t = new Muxer();
            OutputStream VideoSt = t.new OutputStream();

            AVOutputFormat Fmt = null;
            AVFormatContext FmtCtx = new AVFormatContext(null);
            AVCodec VideoCodec = null;
            AVDictionary Opt = null;
            AVPacket Pkt = new AVPacket();

            String FilePath = "/path/xxx.mp4";

            avformat_alloc_output_context2(FmtCtx, null, null, FilePath);
            Fmt = FmtCtx.oformat();

            AVCodec codec = avcodec_find_encoder_by_name("libx264");
            av_format_set_video_codec(FmtCtx, codec);

            VideoCodec = avcodec_find_encoder(Fmt.video_codec());
            VideoSt.setStream(avformat_new_stream(FmtCtx, null));
            VideoSt.getStream().id(FmtCtx.nb_streams() - 1);
            VideoSt.setCodecCtx(avcodec_alloc_context3(VideoCodec));

            VideoSt.getCodecCtx().codec_id(Fmt.video_codec());
            VideoSt.getCodecCtx().bit_rate(5120000);
            VideoSt.getCodecCtx().width(1920);
            VideoSt.getCodecCtx().height(1080);

            AVRational fps = new AVRational();
            fps.den(25);
            fps.num(1);
            VideoSt.getStream().time_base(fps);
            VideoSt.getCodecCtx().time_base(fps);

            VideoSt.getCodecCtx().gop_size(10);
            VideoSt.getCodecCtx().max_b_frames(1);
            VideoSt.getCodecCtx().pix_fmt(AV_PIX_FMT_YUV420P);

            if ((FmtCtx.oformat().flags() & AVFMT_GLOBALHEADER) != 0)
                VideoSt.getCodecCtx().flags(VideoSt.getCodecCtx().flags() | AV_CODEC_FLAG_GLOBAL_HEADER);

            avcodec_open2(VideoSt.getCodecCtx(), VideoCodec, Opt);

            VideoSt.setFrame(av_frame_alloc());
            VideoSt.getFrame().format(VideoSt.getCodecCtx().pix_fmt());
            VideoSt.getFrame().width(1920);
            VideoSt.getFrame().height(1080);
            av_frame_get_buffer(VideoSt.getFrame(), 32);

            // should be at least long or even BigInteger:
            // it is an unsigned long in C
            int nextpts = 0;

            av_dump_format(FmtCtx, 0, FilePath, 1);
            avio_open(FmtCtx.pb(), FilePath, AVIO_FLAG_WRITE);
            avformat_write_header(FmtCtx, Opt);

            int[] got_output = { 0 };
            boolean still_has_input = true; // set to false when you run out of frames
            while (still_has_input) {
                // Convert or directly copy your byte[] into VideoSt.Frame here.
                // The AVFrame structure has two important data fields:
                // AVFrame.data (uint8_t*[]) and AVFrame.linesize (int[]).
                // data holds the pixel values and linesize is the size of each picture line.
                // For example, for packed RGB, linesize[0] equals `image_width * 3`
                // and data[0] points to the packed rgb bytes.
                // But I guess we'll need sws_scale() here to convert the pixel format
                // from RGB to yuv420p (or another yuv family format).

                Pkt = new AVPacket();
                av_init_packet(Pkt);

                VideoSt.getFrame().pts(nextpts++);
                avcodec_encode_video2(VideoSt.getCodecCtx(), Pkt, VideoSt.getFrame(), got_output);

                av_packet_rescale_ts(Pkt, VideoSt.getCodecCtx().time_base(), VideoSt.getStream().time_base());
                Pkt.stream_index(VideoSt.getStream().index());
                av_interleaved_write_frame(FmtCtx, Pkt);
                av_packet_unref(Pkt);
            }

            // get delayed frames
            for (got_output[0] = 1; got_output[0] != 0;) {
                Pkt = new AVPacket();
                av_init_packet(Pkt);

                avcodec_encode_video2(VideoSt.getCodecCtx(), Pkt, null, got_output);
                if (got_output[0] > 0) {
                    av_packet_rescale_ts(Pkt, VideoSt.getCodecCtx().time_base(), VideoSt.getStream().time_base());
                    Pkt.stream_index(VideoSt.getStream().index());
                    av_interleaved_write_frame(FmtCtx, Pkt);
                }
                av_packet_unref(Pkt);
            }

            // write the trailer (without it the mp4 is not playable)
            av_write_trailer(FmtCtx);

            // free C structs
            avcodec_free_context(VideoSt.getCodecCtx());
            av_frame_free(VideoSt.getFrame());
            avio_closep(FmtCtx.pb());
            avformat_free_context(FmtCtx);
        }
    }
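
The comment inside the encode loop mentions sws_scale(). Below is an untested sketch, using the same org.bytedeco.javacpp imports, of how the RGBA pixels of an Android Bitmap (ARGB_8888 stores its bytes in RGBA order) could be converted into the YUV420P frame allocated above; the helper name fillYuvFrame and its parameters are my own:

    // Fills `dst` (an allocated AV_PIX_FMT_YUV420P frame, as in the muxer above)
    // from packed RGBA bytes, e.g. obtained via Bitmap.copyPixelsToBuffer().
    public static void fillYuvFrame(AVFrame dst, byte[] rgba, int width, int height) {
        SwsContext sws = sws_getContext(
                width, height, AV_PIX_FMT_RGBA,                // source: Bitmap pixel layout
                dst.width(), dst.height(), AV_PIX_FMT_YUV420P, // destination: encoder input
                SWS_BILINEAR, null, null, (DoublePointer) null);
        BytePointer src = new BytePointer(rgba);
        PointerPointer srcData = new PointerPointer(src);               // single packed plane
        IntPointer srcStride = new IntPointer(new int[] { width * 4 }); // 4 bytes per pixel
        sws_scale(sws, srcData, srcStride, 0, height, dst.data(), dst.linesize());
        sws_freeContext(sws);
    }

In real code the SwsContext should be created once and reused for all frames rather than rebuilt per frame.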

To port the C code, you usually need to make a few changes (a small before/after example follows this list):

  • Basically, the job is to replace every access to a C struct member ( . and -> ) with a Java getter/setter.
  • There are also many C address-of operators ( & ); just delete them.
  • Change the C NULL macro and the C++ nullptr pointer to Java's null.
  • C code often treats int results as booleans in if, for, while conditions. In Java you need to compare them with 0 explicitly.
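
For instance, a line from muxing.c and its Java counterpart (the variable names are illustrative):

    /* C (muxing.c):
         c->bit_rate = 400000;
         if (avcodec_open2(c, codec, &opt) < 0) { ... }
    */
    // Java with javacpp-presets:
    c.bit_rate(400000);                    // struct member access becomes a setter call
    if (avcodec_open2(c, codec, opt) < 0)  // '&' dropped; int result compared with 0
        throw new IOException("could not open codec");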

If there are other API differences, checking the javacpp-presets docs will sort them out.

Please note that I have skipped all error handling here. It would be required in real development/production.


You can quickly convert a Bitmap (or byte[]) to YUV format using RenderScript (see fooobar.com/questions/386374 / ... ). You can then pass these YUV frames to the ffmpeg library (as halfelf suggests), or use the built-in hardware MediaCodec, which uses a dedicated encoder on most devices (though its compression parameters are less flexible than FFmpeg's all-software options).
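
A rough sketch of the MediaCodec route (not production code: it needs API 21+, error handling is omitted, and yuvFrames is assumed to already hold frames in the encoder's YUV 4:2:0 buffer layout):

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.media.MediaMuxer;

    public class HwEncoder {
        public static void encode(byte[][] yuvFrames, String outPath) throws Exception {
            final int W = 500, H = 300, FPS = 25;
            MediaFormat fmt = MediaFormat.createVideoFormat("video/avc", W, H);
            fmt.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
            fmt.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
            fmt.setInteger(MediaFormat.KEY_FRAME_RATE, FPS);
            fmt.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

            MediaCodec codec = MediaCodec.createEncoderByType("video/avc");
            codec.configure(fmt, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            codec.start();
            MediaMuxer muxer = new MediaMuxer(outPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int track = -1, frame = 0;
            boolean inputDone = false, outputDone = false;

            while (!outputDone) {
                if (!inputDone) {
                    int in = codec.dequeueInputBuffer(10_000);
                    if (in >= 0) {
                        long pts = frame * 1_000_000L / FPS; // microseconds
                        if (frame < yuvFrames.length) {
                            codec.getInputBuffer(in).put(yuvFrames[frame]);
                            codec.queueInputBuffer(in, 0, yuvFrames[frame].length, pts, 0);
                            frame++;
                        } else { // no more frames: signal end of stream
                            codec.queueInputBuffer(in, 0, 0, pts,
                                    MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                            inputDone = true;
                        }
                    }
                }
                int out = codec.dequeueOutputBuffer(info, 10_000);
                if (out == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                    track = muxer.addTrack(codec.getOutputFormat()); // arrives before any data
                    muxer.start();
                } else if (out >= 0) {
                    if (info.size > 0 && (info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0)
                        muxer.writeSampleData(track, codec.getOutputBuffer(out), info);
                    codec.releaseOutputBuffer(out, false);
                    outputDone = (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
                }
            }
            muxer.stop(); muxer.release();
            codec.stop(); codec.release();
        }
    }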


I don't really want to post this, but using PKZIP and its SDK may be a good solution. PKZIP compresses files by up to 95%, as they claim.

The Smartcrypt SDK is available for all major programming languages, including C++, Java, and C#, and can be used to encrypt both structured and unstructured data. Changes to existing applications typically amount to two or three lines of code.
