I have a crash when I try to take a picture with the front-facing camera, which only happens when the user has Picture in Picture active for a separate video application. Everything works fine if the user does not have Picture in Picture open. The crash occurs on this line:
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
with this error:
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: '*** -[AVCaptureStillImageOutput captureStillImageAsynchronouslyFromConnection:completionHandler:] - inconsistent state.'
I checked whether the phone can take pictures at all while Picture in Picture is active: the stock iOS camera app can take a picture (although it may use a different capture method). stillImageOutput and videoConnection both appear to be set up and are not nil.
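To be concrete, by "checked" I mean nothing more elaborate than logging the two objects right before the capture call, along the lines of:
NSLog(@"stillImageOutput = %@, videoConnection = %@", stillImageOutput, videoConnection);
Both log as real objects, not nil.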
Here is the code leading up to the crash, in case it helps:
avCaptureSession = [[AVCaptureSession alloc] init];
AVCaptureDevice* cameraDevice = [GS60_FriendFeed_ScreenshotSelfie_Preview_View frontFacingCameraIfAvailable];
avCaptureSession.sessionPreset = avCaptureSessionPresetString;
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:&error];
if (!input) {
    NSLog(@"ERROR: trying to open camera: %@", error);
}
[avCaptureSession addInput:input];
AVCaptureStillImageOutput* stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[avCaptureSession addOutput:stillImageOutput];
[avCaptureSession startRunning];
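In case the setup itself is suspect, here is a sketch of the same setup with canAddInput: / canAddOutput: guards; the guards are just an assumption on my part (a silently rejected input or output might leave the session in a bad state), and I have not verified that they change anything:
// Sketch only: same setup as above, with guard checks (assumption, not verified).
// Uses the same cameraDevice and avCaptureSessionPresetString as above.
avCaptureSession = [[AVCaptureSession alloc] init];
avCaptureSession.sessionPreset = avCaptureSessionPresetString;

NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:&error];
if (input && [avCaptureSession canAddInput:input]) {
    [avCaptureSession addInput:input];
} else {
    NSLog(@"ERROR: trying to open camera: %@", error);
}

AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
stillImageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
if ([avCaptureSession canAddOutput:stillImageOutput]) {
    [avCaptureSession addOutput:stillImageOutput];
} else {
    NSLog(@"ERROR: could not add the still image output to the session");
}

[avCaptureSession startRunning];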
and later
AVCaptureConnection* videoConnection = nil;
AVCaptureStillImageOutput* stillImageOutput = [[avCaptureSession outputs] objectAtIndex:0];
for (AVCaptureConnection* connection in stillImageOutput.connections) {
    for (AVCaptureInputPort *port in [connection inputPorts]) {
        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
            videoConnection = connection;
            break;
        }
    }
    if (videoConnection) { break; }
}
UIInterfaceOrientation orientation = [UIApplication sharedApplication].statusBarOrientation;
AVCaptureVideoOrientation avcaptureOrientation = AVCaptureVideoOrientationPortrait;
if (orientation == UIInterfaceOrientationUnknown) {
    avcaptureOrientation = AVCaptureVideoOrientationPortrait;
} else if (orientation == UIInterfaceOrientationPortrait) {
    avcaptureOrientation = AVCaptureVideoOrientationPortrait;
} else if (orientation == UIInterfaceOrientationPortraitUpsideDown) {
    avcaptureOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
} else if (orientation == UIInterfaceOrientationLandscapeLeft) {
    avcaptureOrientation = AVCaptureVideoOrientationLandscapeLeft;
} else if (orientation == UIInterfaceOrientationLandscapeRight) {
    avcaptureOrientation = AVCaptureVideoOrientationLandscapeRight;
}
[videoConnection setVideoOrientation:avcaptureOrientation];
[videoConnection setVideoMirrored:YES];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
...
I would prefer to be able to take the photo, but if that is not possible while Picture in Picture is open, knowing how to detect that we cannot take it would still be useful.
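To be concrete about the detection part: I am assuming (not verified) that the session reports Picture in Picture as an interruption, in which case something along these lines might be enough to guard the capture call; AVCaptureSession exposes isInterrupted and posts AVCaptureSessionWasInterruptedNotification, and AVCaptureConnection has isEnabled / isActive:
// Sketch, based on the assumption that Picture in Picture shows up as a session interruption.
[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionWasInterruptedNotification
                                                  object:avCaptureSession
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    NSLog(@"Capture session interrupted: %@", note.userInfo);
    // e.g. disable the shutter button here
}];

// ...and right before calling captureStillImageAsynchronouslyFromConnection:
if (avCaptureSession.isInterrupted || !videoConnection.isEnabled || !videoConnection.isActive) {
    NSLog(@"Cannot take a picture right now");
    return;
}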
Thanks for any help.