You can also use JavaCV (a Java wrapper around FFmpeg) to combine audio and video. You will need to run this task asynchronously or on a background thread, not on the main thread.
try {
    FrameGrabber videoGrabber = new FFmpegFrameGrabber(videoPath);
    videoGrabber.start();

    FrameRecorder mFrameRecorder = new FFmpegFrameRecorder(
            OutputPath,
            videoGrabber.getImageWidth(),
            videoGrabber.getImageHeight(),
            2);                                      // 2 audio channels
    mFrameRecorder.setVideoQuality(1);
    mFrameRecorder.setFormat("mp4");
    mFrameRecorder.setFrameRate(videoGrabber.getFrameRate());
    mFrameRecorder.start();

    // grabFrame() returns both video and audio frames from the input;
    // record each one (grabbing only once per loop iteration so no frames are skipped)
    Frame videoFrame;
    while ((videoFrame = videoGrabber.grabFrame()) != null) {
        mFrameRecorder.record(videoFrame);
    }

    mFrameRecorder.stop();
    mFrameRecorder.release();
    videoGrabber.stop();
} catch (FrameGrabber.Exception e) {
    e.printStackTrace();
} catch (FrameRecorder.Exception e) {
    e.printStackTrace();
}
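The snippet above re-muxes a single input file. If the audio comes from a separate file, a second FFmpegFrameGrabber can be used for it. Below is a minimal sketch of that idea; the paths (videoPath, audioPath, outputPath) and the class/method names are placeholders, and the per-iteration interleaving is only approximate (the recorder derives timestamps from the frame rate and sample rate, not from the loop order).

    import org.bytedeco.javacv.FFmpegFrameGrabber;
    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;
    import org.bytedeco.javacv.FrameGrabber;
    import org.bytedeco.javacv.FrameRecorder;

    public class MuxSketch {
        // Hypothetical helper: combine a video-only file and an audio-only file into one mp4
        public static void mux(String videoPath, String audioPath, String outputPath) {
            try {
                FFmpegFrameGrabber videoGrabber = new FFmpegFrameGrabber(videoPath);
                FFmpegFrameGrabber audioGrabber = new FFmpegFrameGrabber(audioPath);
                videoGrabber.start();
                audioGrabber.start();

                FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(
                        outputPath,
                        videoGrabber.getImageWidth(),
                        videoGrabber.getImageHeight(),
                        audioGrabber.getAudioChannels());
                recorder.setFormat("mp4");
                recorder.setFrameRate(videoGrabber.getFrameRate());
                recorder.setSampleRate(audioGrabber.getSampleRate());
                recorder.start();

                // Alternate between the two grabbers until both are exhausted
                Frame videoFrame, audioFrame;
                boolean haveVideo = true, haveAudio = true;
                while (haveVideo || haveAudio) {
                    if (haveVideo && (videoFrame = videoGrabber.grabFrame()) != null) {
                        recorder.record(videoFrame);
                    } else {
                        haveVideo = false;
                    }
                    if (haveAudio && (audioFrame = audioGrabber.grabFrame()) != null) {
                        recorder.record(audioFrame);
                    } else {
                        haveAudio = false;
                    }
                }

                recorder.stop();
                recorder.release();
                videoGrabber.stop();
                audioGrabber.stop();
            } catch (FrameGrabber.Exception e) {
                e.printStackTrace();
            } catch (FrameRecorder.Exception e) {
                e.printStackTrace();
            }
        }
    }

As with the original snippet, call this from an AsyncTask or a worker thread rather than the UI thread.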
In the app-level build.gradle, add these dependencies:
compile 'org.bytedeco:javacv:1.0'
compile group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '2.4.11-0.11', classifier: 'android-arm'
compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '2.6.1-0.11', classifier: 'android-arm'