AVCaptureSession and camera streams do not close [iOS]

Problem
Threads created by my AVCaptureSession do not go away after I stop and release the AVCaptureSession.

Symptoms
Normally my dispatch_queue, which receives frames from the camera, starts delivering frames instantly. But after opening and closing the ViewController (which starts / stops the AVCaptureSession) about four times, the dispatch_queue takes about ten seconds to start delivering frames.

Hypothesis

It seems that the queues associated with the AVCaptureSession are never torn down.

After closing the AVCaptureSession, I can see that these serial queues remain:

com.apple.coremedia.capturesource.connections(serial) 1 Pending Block
com.apple.coremedia.capturesession.connections(serial) 1 Pending Block
<AVCMNotificationDispatcher: 0x16bce00> serial queue(serial) 4 Pending Blocks
com.apple.avfoundation.videocapturedevice.observed_properties_queue(serial)
com.apple.tcc.cache_queue(serial) 1 Pending Block
com.apple.tcc.preflight.kTCCServiceCamera(serial) 1 Pending Block

And each time I open / close the ViewController that uses the AVCaptureSession, the same queues remain, but three of them accumulate more pending blocks:

<AVCMNotificationDispatcher: 0x17c441a0> serial queue (serial) 9 Pending Blocks
com.apple.avfoundation.videocapturedevice.observed_properties_queue(serial)
com.apple.tcc.preflight.kTCCServiceCamera(serial)  5 Pending Blocks

Setup code

The relevant files are VideoSource.h and VideoSource.mm.

In the ViewController:

self.videoSource = [[VideoSource alloc] init];
self.videoSource.delegate = self;
[self.videoSource setResolution:AVCaptureSessionPreset352x288]; // was 640
[self.videoSource startWithDevicePosition:AVCaptureDevicePositionFront];
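
The body of startWithDevicePosition: is not shown in the question; a minimal sketch of what such a method typically does (every detail beyond the header's declaration is an assumption):

```objectivec
// Hypothetical sketch of -startWithDevicePosition: -- the original body is not
// shown in the question, so the details below are assumptions.
- (BOOL)startWithDevicePosition:(AVCaptureDevicePosition)devicePosition {
    // Find the camera on the requested side of the device.
    AVCaptureDevice *device = nil;
    for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (d.position == devicePosition) { device = d; break; }
    }
    if (!device) return NO;

    // Wrap the device in an input and attach it to the session.
    NSError *error = nil;
    self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!self.deviceInput || ![self.captureSession canAddInput:self.deviceInput]) return NO;
    [self.captureSession addInput:self.deviceInput];

    // Attach the video data output and start streaming.
    [self addVideoDataOutput];
    [self.captureSession startRunning];
    return YES;
}
```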

The captureSession is started and stopped like this:

    [self.videoSource.captureSession startRunning];
    [self.videoSource.captureSession stopRunning];

The VideoSource is released when the ViewController is dismissed, so its dealloc should tear everything down.

VideoSource.mm

- (void)dealloc {
    NSLog(@"Cleaning Up Video Source");
    [_captureSession stopRunning];

    AVCaptureInput *input = [_captureSession.inputs objectAtIndex:0];
    [_captureSession removeInput:input];
    input = nil;

    AVCaptureVideoDataOutput *output = (AVCaptureVideoDataOutput *)[_captureSession.outputs objectAtIndex:0];
    [_captureSession removeOutput:output];
    output = nil;

    _captureSession = nil;
    _deviceInput = nil;
    _delegate = nil;

    // [super dealloc]; // not needed under ARC
}


- (void)addVideoDataOutput {
    // (1) Instantiate a new video data output object
    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];

    NSLog(@"Create Dispatch Queue");

    // (2) The sample buffer delegate requires a serial dispatch queue
    dispatch_queue_t queue = dispatch_queue_create("com.name.test", DISPATCH_QUEUE_SERIAL);
    [captureOutput setSampleBufferDelegate:self queue:queue];
    // dispatch_release(queue); // not needed under ARC

    // (3) Define the pixel format for the video data output
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *settings = @{key: value};

    NSLog(@"Set Video Settings");
    [captureOutput setVideoSettings:settings];

    NSLog(@"Always Discard Late Video Frames");
    [captureOutput setAlwaysDiscardsLateVideoFrames:YES];

    // (4) Configure the output port on the captureSession property
    [self.captureSession addOutput:captureOutput];
}
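
Once the output is added, frames arrive on that serial queue through the AVCaptureVideoDataOutputSampleBufferDelegate callback. A minimal illustrative handler (the processing body is a placeholder, not from the original question):

```objectivec
// Illustrative delegate callback: each BGRA frame is delivered here on the
// serial queue passed to setSampleBufferDelegate:queue:.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    // ... process the BGRA pixel data here ...
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}
```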

VideoSource.h

@interface VideoSource : NSObject

@property (nonatomic, strong) AVCaptureSession * captureSession;  
@property (nonatomic, strong) AVCaptureDeviceInput * deviceInput;
@property (nonatomic, weak) id<VideoSourceDelegate> delegate;

- (BOOL)startWithDevicePosition:(AVCaptureDevicePosition)devicePosition;
- (void) setResolution:(NSString*)resolution;

@end
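
The VideoSourceDelegate protocol itself is not shown in the question; a plausible minimal declaration might look like this (the method name is hypothetical):

```objectivec
// Hypothetical protocol declaration -- not shown in the original question.
@class VideoSource;

@protocol VideoSourceDelegate <NSObject>
- (void)videoSource:(VideoSource *)videoSource didCaptureFrame:(CVImageBufferRef)frame;
@end
```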

What am I missing, and how do I correctly shut down the VideoSource?


Solved!

The fix: dispatch startRunning and stopRunning onto the same serial dispatch_queue that is used as the SampleBuffer delegate queue of the captureOutput.

The new setup:

#import "VideoSource.h"

@interface VideoSource () <AVCaptureVideoDataOutputSampleBufferDelegate>

// Session management.
@property (nonatomic) dispatch_queue_t sessionQueue;
@property (nonatomic) AVCaptureSession *captureSession;
@property (nonatomic) AVCaptureDeviceInput *deviceInput;

/*@property (nonatomic, strong) AVCaptureSession * captureSession;
@property (nonatomic, strong) AVCaptureDeviceInput * deviceInput; */

@end

@implementation VideoSource


- (id)init {
    if (self = [super init]) {
        self.captureSession = [[AVCaptureSession alloc] init];
        self.sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
    }
    return self;
}

Use the sessionQueue in setSampleBufferDelegate:

[captureOutput setSampleBufferDelegate:self queue:self.sessionQueue];

And dispatch startRunning / stopRunning onto the SAME queue:

dispatch_async(self.sessionQueue, ^{
    [self.captureSession startRunning];
});

And, when it is time to tear down the captureSession:

-(void)closeCaptureSession {

     dispatch_async(self.sessionQueue, ^{

         if ([_captureSession isRunning]) [_captureSession stopRunning];

         // Remove all inputs
         for(AVCaptureInput *input1 in _captureSession.inputs) {
             [_captureSession removeInput:input1];
         }

         // Remove all outputs
         for(AVCaptureVideoDataOutput *output1 in _captureSession.outputs) {
             [output1 setSampleBufferDelegate:nil queue:NULL];
             [_captureSession removeOutput:output1];
         }

         // Set to Nil to make ARC job a little easier
         self.captureSession = nil;
         self.deviceInput = nil;
         self.delegate = nil;
         self.sessionQueue=nil;
     });

}
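Assuming closeCaptureSession is also declared in VideoSource.h, the ViewController can then drive the teardown from its view lifecycle; a sketch reusing the property names from the question:

```objectivec
// Sketch: tearing down the capture pipeline when the ViewController goes away.
// Assumes closeCaptureSession is exposed in VideoSource.h.
- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [self.videoSource closeCaptureSession];
    self.videoSource = nil;
}
```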