Orientation of UIImage when the iPhone is on the Z-axis

I am using code similar to the code found in this blog post (http://blog.logichigh.com/2008/06/05/uiimage-fix/) to rotate images after taking them with the iPhone camera. I am capturing with AVFoundation.

I extracted the relevant code here:

    case UIImageOrientationUp: // EXIF = 1
        transform = CGAffineTransformIdentity;
        break;

    case UIImageOrientationUpMirrored: // EXIF = 2
        transform = CGAffineTransformMakeTranslation(imageSize.width, 0.0);
        transform = CGAffineTransformScale(transform, -1.0, 1.0);
        break;

    case UIImageOrientationDown: // EXIF = 3
        transform = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
        transform = CGAffineTransformRotate(transform, M_PI);
        break;

    case UIImageOrientationDownMirrored: // EXIF = 4
        transform = CGAffineTransformMakeTranslation(0.0, imageSize.height);
        transform = CGAffineTransformScale(transform, 1.0, -1.0);
        break;

    case UIImageOrientationLeftMirrored: // EXIF = 5
        boundHeight = bounds.size.height;
        bounds.size.height = bounds.size.width;
        bounds.size.width = boundHeight;
        transform = CGAffineTransformMakeTranslation(imageSize.height, imageSize.width);
        transform = CGAffineTransformScale(transform, -1.0, 1.0);
        transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
        break;

    case UIImageOrientationLeft: // EXIF = 6
        boundHeight = bounds.size.height;
        bounds.size.height = bounds.size.width;
        bounds.size.width = boundHeight;
        transform = CGAffineTransformMakeTranslation(0.0, imageSize.width);
        transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
        break;

    case UIImageOrientationRightMirrored: // EXIF = 7
        boundHeight = bounds.size.height;
        bounds.size.height = bounds.size.width;
        bounds.size.width = boundHeight;
        transform = CGAffineTransformMakeScale(-1.0, 1.0);
        transform = CGAffineTransformRotate(transform, M_PI / 2.0);
        break;

    case UIImageOrientationRight: // EXIF = 8
        boundHeight = bounds.size.height;
        bounds.size.height = bounds.size.width;
        bounds.size.width = boundHeight;
        transform = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
        transform = CGAffineTransformRotate(transform, M_PI / 2.0);
        break;

This works great when the phone is held along the X or Y axis.

However, when I hold the phone along the Z axis (lying flat, camera facing up or down), the resulting UIImage always reports EXIF = 2.

I know that I can use the accelerometer to find out when the device is lying flat on the Z axis, for example as in the sketch below. However, I cannot see how that helps me tell the resulting images apart after they are taken, since they all still have EXIF = 2.
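
For reference, here is a minimal sketch of what I mean by detecting the Z-axis case; it uses UIDeviceOrientation rather than the raw accelerometer, and assumes orientation notifications are already running:

    // Detect the face-up / face-down (Z-axis) case at capture time.
    [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];

    UIDeviceOrientation deviceOrientation = [[UIDevice currentDevice] orientation];
    // FaceUp / FaceDown are exactly the "on the Z axis" cases.
    if (UIDeviceOrientationIsFlat(deviceOrientation)) {
        // The photo is being taken with the phone lying flat, but this alone
        // cannot tell Landscape1 / Portrait / Landscape2 apart.
    }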

i.e. this lets me detect that a photograph was taken on the Z axis, but it does not let me distinguish the photographs from one another, e.g. Landscape1 (iPhone Home button on the left), Portrait, and Landscape2 (iPhone Home button on the right).

+6
2 answers

EXIF data only conveys the XY orientation. There is nothing in the EXIF data that tells you whether the camera was pointing up or down. You can capture the device's orientation at the moment of image capture:

 [[UIDevice currentDevice] orientation] 

Then you just need to track each image's up/down orientation separately from the EXIF data. If you save images in your application, a simple database table, or even a serialized NSDictionary with the image name as the key and the up/down orientation as the value, will do; see the sketch below.
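
A minimal sketch of the NSDictionary approach; the image name and plist path are purely illustrative:

    // Remember whether each image was captured face up or face down,
    // keyed by image name, since EXIF cannot carry this information.
    NSMutableDictionary *captureOrientations = [NSMutableDictionary dictionary];

    UIDeviceOrientation o = [[UIDevice currentDevice] orientation];
    if (UIDeviceOrientationIsFlat(o)) {
        NSString *upDown = (o == UIDeviceOrientationFaceUp) ? @"up" : @"down";
        [captureOrientations setObject:upDown forKey:@"IMG_0001.jpg"]; // hypothetical name
    }

    // Persist alongside the images (path is illustrative).
    [captureOrientations writeToFile:@"orientations.plist" atomically:YES];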

0

I had a similar problem. I was positioning views on screen depending on the orientation of the device. However, when the device was lying flat, visibly neither landscape nor portrait, the orientation from [[UIDevice currentDevice] orientation] was unreliable. I solved this with [[UIApplication sharedApplication] statusBarOrientation], which always returns the orientation in which the status bar is currently displayed on screen. I have more experience building iOS apps in Xamarin with C#, so forgive me if my Objective-C is a bit off.
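
In Objective-C that fallback might look roughly like this (a sketch; note that statusBarOrientation reflects the interface, not the physical device):

    // Fall back to the status bar orientation when the device is flat,
    // since UIInterfaceOrientation never reports a face-up/face-down state.
    UIDeviceOrientation deviceOrientation = [[UIDevice currentDevice] orientation];
    UIInterfaceOrientation interfaceOrientation =
        [[UIApplication sharedApplication] statusBarOrientation];

    if (UIDeviceOrientationIsFlat(deviceOrientation)) {
        BOOL landscape = UIInterfaceOrientationIsLandscape(interfaceOrientation);
        // Position the views for landscape or portrait using `landscape` here.
    }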

0
