I am currently developing an iOS application that applies Core Image to the camera feed in order to take photos and videos, and I've run into a problem.
So far, I have been using AVCaptureVideoDataOutput to obtain the sample buffers and manipulate them with Core Image, then display a simple preview, as well as use it to capture photos and save them.
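For context, the working pipeline looks roughly like this (a minimal sketch; the filter choice and queue label here are placeholders, not my actual code):

```swift
import AVFoundation
import CoreImage

final class CameraPipeline: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    let videoDataOutput = AVCaptureVideoDataOutput()
    let ciContext = CIContext()

    func configure() {
        session.beginConfiguration()
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        videoDataOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "video.data.queue"))
        if session.canAddOutput(videoDataOutput) {
            session.addOutput(videoDataOutput)
        }
        session.commitConfiguration()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Apply a Core Image filter to each frame (CISepiaTone is a placeholder).
        let image = CIImage(cvPixelBuffer: pixelBuffer)
            .applyingFilter("CISepiaTone", parameters: [kCIInputIntensityKey: 0.8])
        // ...render `image` with ciContext into the preview, or save it when capturing a photo.
    }
}
```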
When I tried to record video by writing the sample buffers to a movie as I received them from AVCaptureVideoDataOutput, the result had a very slow frame rate (probably because of the other image processing that was going on).
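The recording attempt was along these lines: appending each buffer to an AVAssetWriter from the data-output delegate (a sketch with placeholder output settings):

```swift
import AVFoundation

var assetWriter: AVAssetWriter?
var writerInput: AVAssetWriterInput?

func startWriting(to url: URL) throws {
    let writer = try AVAssetWriter(outputURL: url, fileType: .mov)
    // Codec and dimensions are placeholders.
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 1920,
        AVVideoHeightKey: 1080
    ])
    input.expectsMediaDataInRealTime = true
    if writer.canAdd(input) { writer.add(input) }
    assetWriter = writer
    writerInput = input
}

// Called from captureOutput(_:didOutput:from:) for every frame.
func append(_ sampleBuffer: CMSampleBuffer) {
    guard let writer = assetWriter, let input = writerInput else { return }
    if writer.status == .unknown {
        writer.startWriting()
        writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
    }
    if input.isReadyForMoreMediaData {
        input.append(sampleBuffer)
    }
}
```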
So, I was wondering: is it possible to have an AVCaptureVideoDataOutput and an AVCaptureMovieFileOutput on the same AVCaptureSession simultaneously?
I gave this a quick try and found that when I added the extra output, my AVCaptureVideoDataOutput stopped receiving data.
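Concretely, the change was just adding a movie file output to the session that already has the data output attached (sketch):

```swift
import AVFoundation

// Sketch: adding AVCaptureMovieFileOutput to the session that already
// has the AVCaptureVideoDataOutput on it.
func addMovieOutput(to session: AVCaptureSession) -> AVCaptureMovieFileOutput? {
    let movieFileOutput = AVCaptureMovieFileOutput()
    session.beginConfiguration()
    defer { session.commitConfiguration() }
    guard session.canAddOutput(movieFileOutput) else { return nil }
    session.addOutput(movieFileOutput)
    // After this, captureOutput(_:didOutput:from:) is no longer called
    // on the video data output's delegate.
    return movieFileOutput
}
```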
If I can make this work, it would mean I could simply use the second output to record video at the full frame rate and post-process the video after the user has stopped recording.
Any help would be greatly appreciated.
ios avfoundation avcapturesession
Grinneh Nov 09 '18