Missing image metadata when saving updated image in PhotoKit

I'm having a problem updating an image's metadata and saving it back to the Photos library. Everything works, except that after the edit the image's metadata is missing entries that were there before, and I don't get any errors while manipulating the image or executing the photo-library change block. On top of that, the dictionary being written back to the image looks, in the debugger, like the original plus my additions.

My questions:

  • What am I doing wrong, such that writing the existing properties back along with my additional data destroys what was already there?
  • Is there a better, more canonical way to do this? Most of the mechanics here seem intended to update only part of the image's metadata; what happens to the rest?

EDIT:

Before saving, all of the EXIF and TIFF values are present. This is the complete metadata after saving the photo with the code below:

    ["PixelHeight": 2448,
     "PixelWidth": 3264,
     "{Exif}": {ColorSpace = 1; PixelXDimension = 3264; PixelYDimension = 2448;},
     "Depth": 8,
     "ProfileName": sRGB IEC61966-2.1,
     "Orientation": 1,
     "{TIFF}": {Orientation = 1;},
     "ColorModel": RGB,
     "{JFIF}": {DensityUnit = 0; JFIFVersion = (1, 0, 1); XDensity = 72; YDensity = 72;}]
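(For reference, to compare metadata before and after a save, the full property dictionary can be dumped with ImageIO. This is just a diagnostic sketch; `dumpMetadata` and `url` are names of my own, not part of the workflow below.)

```swift
import Foundation
import ImageIO

// Print every metadata key ImageIO can see in an image file.
// `url` is a placeholder for wherever the JPEG lives.
func dumpMetadata(at url: URL) {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any] else {
        print("could not read image properties")
        return
    }
    for (key, value) in props.sorted(by: { $0.key < $1.key }) {
        print("\(key): \(value)")
    }
}
```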

The code is all Swift 3, tested on iOS 10.1.

The main workflow:

    // Get a mutable copy of the existing EXIF metadata
    let mutableMetaData = getMutableMetadataFrom(imageData: data)

    // Check to see if it has the {GPS} entry; if it does, just exit.
    if let _ = mutableMetaData[kCGImagePropertyGPSDictionary as String] {
        callback(imageAsset, true, nil)
        return
    }

    // Add the {GPS} tag to the existing metadata
    let clLocation = media.location!.asCLLocation()
    mutableMetaData[kCGImagePropertyGPSDictionary as String] = clLocation.asGPSMetaData()

    // Attach the new metadata to the existing image
    guard let newImageData = attach(metadata: mutableMetaData, toImageData: data) else {
        callback(imageAsset, false, nil)
        return
    }

    let editingOptions = PHContentEditingInputRequestOptions()
    imageAsset.requestContentEditingInput(with: editingOptions) { editingInput, info in
        guard let editingInput = editingInput else { return }

        let library = PHPhotoLibrary.shared()
        let output = PHContentEditingOutput(contentEditingInput: editingInput)
        output.adjustmentData = PHAdjustmentData(formatIdentifier: "Project",
                                                 formatVersion: "0.1",
                                                 data: "Location Adjustment".data(using: .utf8)!)
        do {
            try newImageData.write(to: output.renderedContentURL, options: [.atomic])
        } catch {
            callback(imageAsset, false, error)
            return
        }

        library.performChanges({
            let changeRequest = PHAssetChangeRequest(for: imageAsset)
            changeRequest.location = clLocation
            changeRequest.contentEditingOutput = output
        }, completionHandler: { success, error in
            ...

Helper methods for the workflow:

    func attach(metadata: NSDictionary, toImageData imageData: Data) -> Data? {
        guard let imageDataProvider = CGDataProvider(data: imageData as CFData),
            let cgImage = CGImage(jpegDataProviderSource: imageDataProvider,
                                  decode: nil,
                                  shouldInterpolate: true,
                                  intent: .defaultIntent),
            let newImageData = CFDataCreateMutable(nil, 0),
            let type = UTTypeCreatePreferredIdentifierForTag(kUTTagClassMIMEType,
                                                             "image/jpg" as CFString,
                                                             kUTTypeImage),
            let destination = CGImageDestinationCreateWithData(newImageData, type.takeRetainedValue(), 1, nil)
        else {
            return nil
        }

        CGImageDestinationAddImage(destination, cgImage, metadata as CFDictionary)
        CGImageDestinationFinalize(destination)

        guard let newProvider = CGDataProvider(data: newImageData),
            let newCGImage = CGImage(jpegDataProviderSource: newProvider,
                                     decode: nil,
                                     shouldInterpolate: false,
                                     intent: .defaultIntent)
        else {
            return nil
        }

        return UIImageJPEGRepresentation(UIImage(cgImage: newCGImage), 1.0)
    }

    func getMutableMetadataFrom(imageData data: Data) -> NSMutableDictionary {
        let imageSourceRef = CGImageSourceCreateWithData(data as CFData, nil)
        let currentProperties = CGImageSourceCopyPropertiesAtIndex(imageSourceRef!, 0, nil)
        return NSMutableDictionary(dictionary: currentProperties!)
    }

Also, asGPSMetaData is an extension on CLLocation ; it is essentially the Swift 3 version of this Gist.
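(The Gist itself isn't reproduced here, but a typical Swift 3 version of such an extension looks roughly like the following. This is a sketch of the usual `CLLocation` → `{GPS}` dictionary conversion, not necessarily the exact code I used; the key names are the standard ImageIO `kCGImagePropertyGPS*` constants.)

```swift
import CoreLocation
import ImageIO

extension CLLocation {
    // Build a {GPS} metadata dictionary from this location.
    // EXIF stores latitude/longitude as unsigned values plus an N/S or E/W reference.
    func asGPSMetaData() -> [String: Any] {
        let formatter = DateFormatter()
        formatter.timeZone = TimeZone(abbreviation: "UTC")
        formatter.dateFormat = "HH:mm:ss.SS"

        return [
            kCGImagePropertyGPSLatitude as String: abs(coordinate.latitude),
            kCGImagePropertyGPSLatitudeRef as String: coordinate.latitude >= 0 ? "N" : "S",
            kCGImagePropertyGPSLongitude as String: abs(coordinate.longitude),
            kCGImagePropertyGPSLongitudeRef as String: coordinate.longitude >= 0 ? "E" : "W",
            kCGImagePropertyGPSAltitude as String: abs(altitude),
            kCGImagePropertyGPSAltitudeRef as String: altitude < 0 ? 1 : 0,
            kCGImagePropertyGPSTimeStamp as String: formatter.string(from: timestamp)
        ]
    }
}
```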

1 answer

It turns out the problem wasn't the image or metadata manipulation with CoreGraphics at all; it was a couple of things I had missed:

  • Building a UIImage from Data that contains EXIF or GPS data strips that data... in fact, it strips most of the metadata except the basic JFIF set and the dimensions (when using JPEG). In retrospect this makes sense, since a UIImage's internal representation is just raw pixel data; however, I couldn't find this metadata behavior stated explicitly in the docs.
  • Given the previous point, the two main ways to get a Data object (an image with metadata) into the Photos library were either to write it to a temporary file and ingest it with PHAssetChangeRequest.creationRequestForAssetFromImage(atFileURL:) , or to use PHAssetCreationRequest.forAsset() to create a creation request and then use PHAssetCreationRequest.addResource(with:data:options:) to add the data as a photo. I chose the latter because it has fewer moving parts.
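(The first point above is easy to demonstrate. The sketch below, with a placeholder `url`, shows the lossy round-trip: the re-encoded JPEG keeps only the JFIF basics and the pixel dimensions, while EXIF and GPS entries disappear.)

```swift
import UIKit
import ImageIO

// `url` is a placeholder for a JPEG that carries full EXIF/GPS metadata.
let original = try Data(contentsOf: url)

// Decoding to UIImage keeps only the raw pixels...
let image = UIImage(data: original)!

// ...so re-encoding produces a JPEG without the original EXIF/GPS entries.
let rewritten = UIImageJPEGRepresentation(image, 1.0)!

// Inspecting both with CGImageSourceCopyPropertiesAtIndex shows the difference.
```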

So, I guess all of this replaces the nice, concise ALAssetsLibrary.writeImage(toSavedPhotosAlbum:metadata:completionBlock:) .

The final photo-library change block looked like this:

    var assetID: String?
    PHPhotoLibrary.shared().performChanges({
        let creationRequest = PHAssetCreationRequest.forAsset()
        creationRequest.addResource(with: .photo, data: imageWithMetaData as Data, options: nil)
        creationRequest.location = clLocation
        assetID = creationRequest.placeholderForCreatedAsset?.localIdentifier
    }) { success, error in
        ...

Source: https://habr.com/ru/post/1013199/

