I am trying to capture sequences of images at a fixed frame rate (ideally up to 30 fps) on an Android device with FULL camera2 capability (Galaxy S7), but I cannot (a) get a constant frame rate, or (b) even reach 20 fps (with JPEG encoding). I have already tried the suggestions from the question "Android camera2 capture is too slow".
The minimum frame duration for JPEG is 33.33 ms (for resolutions below 1920x1080) according to
characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputMinFrameDuration(ImageFormat.JPEG, size);
and the stall duration is 0 ms for every size (the same holds for YUV_420_888).
My capture builder looks like this:
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CONTROL_AE_MODE_OFF);
captureBuilder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, _exp_time);
captureBuilder.set(CaptureRequest.CONTROL_AE_LOCK, true);
captureBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, _iso_value);
captureBuilder.set(CaptureRequest.LENS_FOCUS_DISTANCE, _foc_dist);
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CONTROL_AF_MODE_OFF);
captureBuilder.set(CaptureRequest.CONTROL_AWB_MODE, _wb_value);
Focus distance is set to 0.0 (infinity), ISO to 100, and exposure time to 5 ms. White balance can be set to OFF, AUTO, or any other value; it has no effect on the timings below.
I start the capture session with:
session.setRepeatingRequest(_capReq.build(), captureListener, mBackgroundHandler);
Note: it makes no difference whether I use setRepeatingRequest or setRepeatingBurst.
In the preview (with only the TextureView surface attached) everything runs at 30 fps. However, as soon as I also attach the ImageReader (whose listener runs on a HandlerThread), which I create as shown below (no saving, just measuring the time between frames):
reader = ImageReader.newInstance(_img_width, _img_height, ImageFormat.JPEG, 2);
reader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
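For completeness, the reader's surface also has to be registered with the session and targeted by the request; a sketch of how that typically looks (names such as mTextureSurface, camera, and captureListener are placeholders for the question's own objects, not code from the question):

```java
// Sketch: both output surfaces must be passed to createCaptureSession,
// and the repeating request must target them.
Surface readerSurface = reader.getSurface();

captureBuilder.addTarget(mTextureSurface);   // preview surface (placeholder name)
captureBuilder.addTarget(readerSurface);     // JPEG stream

camera.createCaptureSession(
        Arrays.asList(mTextureSurface, readerSurface),
        new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(CameraCaptureSession session) {
                try {
                    session.setRepeatingRequest(captureBuilder.build(),
                            captureListener, mBackgroundHandler);
                } catch (CameraAccessException e) {
                    Log.e("Capture", "setRepeatingRequest failed", e);
                }
            }

            @Override
            public void onConfigureFailed(CameraCaptureSession session) {
                Log.e("Capture", "session configuration failed");
            }
        },
        mBackgroundHandler);
```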
with the following time-measurement code:
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader myreader) {
        Image image = myreader.acquireNextImage();
        if (image == null) {
            return;
        }
        long curr = image.getTimestamp();
        Log.d("curr - last_ts", "" + ((curr - last_ts) / 1000000) + " ms");
        last_ts = curr;
        image.close();
    }
};
then I repeatedly see time differences like this:
99 ms - 66 ms - 66 ms - 99 ms - 66 ms - 66 ms ...
I do not understand why frames take two or three times the frame duration that the stream configuration map advertises for JPEG. The exposure time (5 ms) is well below the 33 ms frame duration. Is there some additional internal processing that I am not aware of?
I tried the same with the YUV_420_888 format, which did give constant time differences of 33 ms. The problem there is that the phone does not have enough bandwidth to store the images quickly enough (I tried the method described in "How to save a YUV_420_888 image?"). If you know of any method to compress or encode these images fast enough, please let me know.
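One option worth trying (my own suggestion, not something from the question): repack each YUV_420_888 frame into NV21 and compress it with android.graphics.YuvImage, which does the JPEG encoding natively and may be fast enough. The toNv21 helper is hypothetical; the plane-to-NV21 copy is not shown here:

```java
// Sketch: JPEG-compress a YUV_420_888 Image via YuvImage. Assumes the frame
// has been repacked into an NV21 byte array (Y plane followed by interleaved
// V/U chroma); that conversion step is omitted.
byte[] nv21 = toNv21(image);  // hypothetical helper, not shown

YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21,
        image.getWidth(), image.getHeight(), null);

ByteArrayOutputStream out = new ByteArrayOutputStream();
yuv.compressToJpeg(new Rect(0, 0, image.getWidth(), image.getHeight()),
        90 /* quality */, out);
byte[] jpegBytes = out.toByteArray();
```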
Edit: From the documentation of getOutputStallDuration: "In other words, using a repeating YUV request will result in a steady frame rate (let's say it's 30 FPS). If a single JPEG request is submitted periodically, the frame rate will stay at 30 FPS (as long as we wait for the previous JPEG to return each time). If we try to submit a repeating YUV + JPEG request, then the frame rate will drop from 30 FPS." Does this mean that I need to periodically issue a single capture()?
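If that reading is correct, one way to test it (a sketch of my understanding, not verified on the S7; previewBuilder, camera, and captureListener are placeholders) would be to keep the repeating request on the YUV/preview stream only and interleave one-shot JPEG captures:

```java
// Sketch: the repeating request drives only the preview/YUV stream;
// a one-shot capture() targeting the JPEG ImageReader is submitted on demand,
// so the JPEG stream never gates the repeating request's frame rate.
session.setRepeatingRequest(previewBuilder.build(), captureListener, mBackgroundHandler);

// Later, whenever a JPEG frame is wanted:
CaptureRequest.Builder jpegBuilder =
        camera.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
jpegBuilder.addTarget(reader.getSurface());
session.capture(jpegBuilder.build(), captureListener, mBackgroundHandler);
```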
Edit 2: From https://developer.android.com/reference/android/hardware/camera2/CaptureRequest.html : "The necessary information for the application, given the model above, is provided via android.scaler.streamConfigurationMap using getOutputMinFrameDuration(int, Size). These are used to determine the maximum frame rate / minimum frame duration that is possible for a given stream configuration.
Specifically, the application can use the following rules to determine the minimum frame duration it can request from the camera device:
1. Let the set of currently configured input/output streams be called S.
2. Find the minimum frame durations for each stream in S, by looking it up in android.scaler.streamConfigurationMap using getOutputMinFrameDuration(int, Size) (with the appropriate size/format). Let this set of frame durations be called F.
3. For any given request R, the minimum frame duration allowed for R is the maximum of all values in F. Let the streams used in R be called S_r. If none of the streams in S_r have a stall duration (listed in getOutputStallDuration(int, Size) with the appropriate size/format), then the frame durations in F determine the steady-state frame rate that the application will get if it uses R as a repeating request."
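Paraphrasing that rule in plain Java (the durations below are illustrative, not measured on the S7): with F collected, the floor on the frame duration of a repeating request is simply the maximum entry of F.

```java
import java.util.Arrays;

public class FrameDurationRule {

    // Rule from the quoted docs: the minimum frame duration allowed for a
    // request R is the maximum of the per-stream minimum durations in F.
    public static long minAllowedFrameDuration(long[] perStreamMinDurationsNs) {
        return Arrays.stream(perStreamMinDurationsNs).max().orElse(0L);
    }

    public static void main(String[] args) {
        // Illustrative values in ns: preview and JPEG streams both
        // advertise 33.33 ms, so the allowed minimum is 33.33 ms (30 fps).
        long[] f = {33_333_333L, 33_333_333L};
        System.out.println(minAllowedFrameDuration(f) + " ns");
    }
}
```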