I am writing an application that performs some real-time processing on images received from an AVCaptureVideoDataOutput in an AVCaptureSession.
Currently, I can start a session, add the input and output, get the image data, convert it to a UIImage, and display it on the screen live.
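For context, here is a simplified sketch of what my conversion step looks like (the class and property names are illustrative, not my exact code; it assumes the session and data output are already configured):

```swift
import AVFoundation
import UIKit

// Sketch of my pipeline: the delegate receives each frame and I turn it
// into a UIImage for display.
class FrameConverter: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    var onFrame: ((UIImage) -> Void)?
    private let ciContext = CIContext()

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // Wrap the pixel buffer and render it to a CGImage.
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }

        // This is the step I'm unsure about: should I be passing an
        // orientation here, and if so, which one?
        let image = UIImage(cgImage: cgImage)

        DispatchQueue.main.async { [weak self] in
            self?.onFrame?(image)
        }
    }
}
```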
The main problem I am facing is that the orientation of the image is wrong: it is rotated and mirrored, and it also looks distorted. I did some research on this and found some related questions; I've tried the code that was suggested, but it does not fix the rotation problem.
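For reference, the suggested fix was along these lines (a sketch; `videoDataOutput` stands in for the output already attached to my session):

```swift
import AVFoundation

// The kind of fix the related questions suggest: forcing the connection's
// orientation before frames are delivered. In my case this alone did not
// fix the rotation.
let videoDataOutput = AVCaptureVideoDataOutput()

if let connection = videoDataOutput.connection(with: .video) {
    if connection.isVideoOrientationSupported {
        connection.videoOrientation = .portrait   // force portrait frames
    }
    if connection.isVideoMirroringSupported {
        connection.isVideoMirrored = false        // undo front-camera mirroring
    }
}
```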
I think the related questions assume the UIImage came from somewhere else (maybe a higher-level API that automatically attaches extra information, such as orientation, to the image), or maybe the difference is that I get mine from the video output.
I'm not really looking for code that fixes it (although an annotated example would be very useful), but rather a good explanation of how the lifecycle of an image obtained this way works: What is the recommended way to handle it so that it appears on screen in a way that makes sense for the phone's orientation? What is the orientation of the CGImageRef that gets returned? And so on.
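To make the question concrete, this is my current guess at the kind of mapping that might be needed, and it is exactly the part I would like explained (entirely hypothetical; the helper name and the cases are mine):

```swift
import UIKit

// My guess at a device-to-image orientation mapping (hypothetical; this is
// the part I would like explained). Raw frames from the back camera seem to
// arrive in landscape, so portrait apparently needs a 90-degree correction.
func imageOrientation(for deviceOrientation: UIDeviceOrientation) -> UIImage.Orientation {
    switch deviceOrientation {
    case .portrait:           return .right
    case .portraitUpsideDown: return .left
    case .landscapeLeft:      return .up
    case .landscapeRight:     return .down
    default:                  return .right   // fall back to portrait behavior
    }
}

// Applied when wrapping the CGImage:
// let image = UIImage(cgImage: cgImage,
//                     scale: 1.0,
//                     orientation: imageOrientation(for: UIDevice.current.orientation))
```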
I have a previous question that contains the code I use to configure the AVCaptureSession.
ios camera avcapturesession
SuitedSloth Nov 30 '12 at 7:27