I am using code similar to the code found on this blog post http://blog.logichigh.com/2008/06/05/uiimage-fix/ to rotate images after taking them with the iPhone camera. I am using AVFoundation.
I extracted the relevant code here:
    case UIImageOrientationUp: //EXIF = 1
        transform = CGAffineTransformIdentity;
        break;

    case UIImageOrientationUpMirrored: //EXIF = 2
        transform = CGAffineTransformMakeTranslation(imageSize.width, 0.0);
        transform = CGAffineTransformScale(transform, -1.0, 1.0);
        break;

    case UIImageOrientationDown: //EXIF = 3
        transform = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
        transform = CGAffineTransformRotate(transform, M_PI);
        break;

    case UIImageOrientationDownMirrored: //EXIF = 4
        transform = CGAffineTransformMakeTranslation(0.0, imageSize.height);
        transform = CGAffineTransformScale(transform, 1.0, -1.0);
        break;

    case UIImageOrientationLeftMirrored: //EXIF = 5
        boundHeight = bounds.size.height;
        bounds.size.height = bounds.size.width;
        bounds.size.width = boundHeight;
        transform = CGAffineTransformMakeTranslation(imageSize.height, imageSize.width);
        transform = CGAffineTransformScale(transform, -1.0, 1.0);
        transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
        break;

    case UIImageOrientationLeft: //EXIF = 6
        boundHeight = bounds.size.height;
        bounds.size.height = bounds.size.width;
        bounds.size.width = boundHeight;
        transform = CGAffineTransformMakeTranslation(0.0, imageSize.width);
        transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
        break;

    case UIImageOrientationRightMirrored: //EXIF = 7
        boundHeight = bounds.size.height;
        bounds.size.height = bounds.size.width;
        bounds.size.width = boundHeight;
        transform = CGAffineTransformMakeScale(-1.0, 1.0);
        transform = CGAffineTransformRotate(transform, M_PI / 2.0);
        break;

    case UIImageOrientationRight: //EXIF = 8
        boundHeight = bounds.size.height;
        bounds.size.height = bounds.size.width;
        bounds.size.width = boundHeight;
        transform = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
        transform = CGAffineTransformRotate(transform, M_PI / 2.0);
        break;
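For completeness, here is roughly how that transform is then applied (paraphrasing the blog post's scaleAndRotateImage: method, with its scaling step omitted for brevity); image, transform, bounds and imageSize come from the surrounding method:

    UIGraphicsBeginImageContext(bounds.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    UIImageOrientation orient = image.imageOrientation;
    if (orient == UIImageOrientationRight || orient == UIImageOrientationLeft) {
        // Core Graphics draws bottom-up, so the 90-degree-rotated cases
        // are flipped horizontally before concatenating the transform...
        CGContextScaleCTM(context, -1.0, 1.0);
        CGContextTranslateCTM(context, -imageSize.height, 0.0);
    } else {
        // ...and the upright cases are flipped vertically.
        CGContextScaleCTM(context, 1.0, -1.0);
        CGContextTranslateCTM(context, 0.0, -imageSize.height);
    }
    CGContextConcatCTM(context, transform);
    CGContextDrawImage(context, CGRectMake(0, 0, imageSize.width, imageSize.height), image.CGImage);
    UIImage *rotatedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();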
This works great when the phone is rotated about the X or Y axis.
However, when I hold the phone flat, rotated about the Z axis (e.g. camera pointing straight down), the UIImage always comes back with EXIF = 2.
I know that I can use the accelerometer to find out when the device is held flat on the Z axis. However, I don't see how that gets me any closer to telling the resulting images apart, since all of them still have EXIF = 2.
i.e. the accelerometer will let me identify the photographs that were taken on Z, but it will not let me distinguish between the photographs themselves, e.g. Landscape1 (iPhone Home button on the left), Portrait, and Landscape2 (iPhone Home button on the right).
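The workaround I have in mind is to keep my own running guess of the orientation from raw accelerometer data, and fall back to it whenever a photo comes back with EXIF = 2. Below is a minimal, untested sketch assuming CoreMotion (iOS 4+); OrientationTracker and capturedOrientation are names I made up, not part of any Apple API:

    #import <UIKit/UIKit.h>
    #import <CoreMotion/CoreMotion.h>
    #import <math.h>

    @interface OrientationTracker : NSObject
    @property (nonatomic, strong) CMMotionManager *motionManager;
    @property (nonatomic, assign) UIDeviceOrientation capturedOrientation;
    @end

    @implementation OrientationTracker

    - (id)init {
        if ((self = [super init])) {
            _capturedOrientation = UIDeviceOrientationPortrait;
            _motionManager = [[CMMotionManager alloc] init];
            _motionManager.accelerometerUpdateInterval = 0.1;
            [_motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                                 withHandler:^(CMAccelerometerData *data, NSError *error) {
                if (!data) return;
                double x = data.acceleration.x;
                double y = data.acceleration.y;
                // When the phone is almost perfectly flat, x and y are mostly
                // noise; keep the last reliable reading instead of guessing.
                if (sqrt(x * x + y * y) < 0.1) return;
                // Otherwise, whichever of x/y dominates tells us which edge of
                // the device points down, even with the camera facing the floor.
                if (fabs(x) > fabs(y)) {
                    self.capturedOrientation = (x > 0.0) ? UIDeviceOrientationLandscapeRight  // Home button on the left
                                                         : UIDeviceOrientationLandscapeLeft;  // Home button on the right
                } else {
                    self.capturedOrientation = (y > 0.0) ? UIDeviceOrientationPortraitUpsideDown
                                                         : UIDeviceOrientationPortrait;
                }
            }];
        }
        return self;
    }

    @end

The idea would be to read capturedOrientation at the moment AVFoundation hands me the photo, and use it to pick the transform in the switch above instead of trusting the EXIF value whenever it is 2.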