Distorted video generated from a normal CGImageRef image after appliesPreferredTrackTransform is set

I'm trying to extract an image from a video and then use that image to create a still movie. The first step works fine, but the second step produces a distorted video once I set appliesPreferredTrackTransform = true.

[Image: normal image extracted from the video]
[Image: distorted video created from the image]

How did this happen? How can a normal image produce a distorted video? In addition, if I move the GenerateMovieFromImage.generateMovieWithImage call to position #2, the application crashes in CGContextDrawImage(context, CGRectMake(0, 0, frameSize.width, frameSize.height), image);

Here is what I did (Swift):

    var asset: AVAsset = AVAsset.assetWithURL(self.tmpMovieURL!) as AVAsset
    var imageGen: AVAssetImageGenerator = AVAssetImageGenerator(asset: asset)
    var time: CMTime = CMTimeMake(0, 60)
    imageGen.appliesPreferredTrackTransform = true
    imageGen.generateCGImagesAsynchronouslyForTimes([NSValue(CMTime: time)],
        completionHandler: { (requestTime, image, actualTime, result, error) -> Void in
            if result == AVAssetImageGeneratorResult.Succeeded {
                ALAssetsLibrary().writeImageToSavedPhotosAlbum(image, metadata: nil,
                    completionBlock: { (nsurl, error) in
                        // #2
                })
                GenerateMovieFromImage.generateMovieWithImage(image, completionBlock: { (genMovieURL) in
                    handler(genMovieURL)
                })
            }
    })

GenerateMovieFromImage.generateMovieWithImage is taken from this answer:

    + (void)generateMovieWithImage:(CGImageRef)image completionBlock:(GenerateMovieWithImageCompletionBlock)handler
    {
        NSLog(@"%@", image);

        NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:
                             [@"tmpgen" stringByAppendingPathExtension:@"mov"]];
        NSURL *videoUrl = [NSURL fileURLWithPath:path];
        if ([[NSFileManager defaultManager] fileExistsAtPath:path]) {
            NSError *error;
            if ([[NSFileManager defaultManager] removeItemAtPath:path error:&error] == NO) {
                NSLog(@"removeItemAtPath %@ error: %@", path, error);
            }
        }

        // TODO: the image needs to be rotated programmatically, not by hand
        int width  = (int)CGImageGetWidth(image);
        int height = (int)CGImageGetHeight(image);

        NSError *error = nil;
        AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:videoUrl
                                                               fileType:AVFileTypeQuickTimeMovie
                                                                  error:&error];
        NSParameterAssert(videoWriter);

        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                       AVVideoCodecH264, AVVideoCodecKey,
                                       [NSNumber numberWithInt:width], AVVideoWidthKey,
                                       [NSNumber numberWithInt:height], AVVideoHeightKey,
                                       nil];
        AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                             outputSettings:videoSettings]; // retain should be removed if ARC
        NSParameterAssert(writerInput);
        NSParameterAssert([videoWriter canAddInput:writerInput]);
        [videoWriter addInput:writerInput];

        AVAssetWriterInputPixelBufferAdaptor *adaptor =
            [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                             sourcePixelBufferAttributes:nil];

        // 2) Start a session:
        NSLog(@"start session");
        [videoWriter startWriting];
        [videoWriter startSessionAtSourceTime:kCMTimeZero]; // use kCMTimeZero if unsure

        dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
        [writerInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^{
            if ([writerInput isReadyForMoreMediaData]) {
                // 3) Write some samples:
                // Or you can use AVAssetWriterInputPixelBufferAdaptor.
                // That lets you feed the writer input data from a CVPixelBuffer
                // that's quite easy to create from a CGImage.
                CVPixelBufferRef sampleBuffer = [self newPixelBufferFromCGImage:image];
                if (sampleBuffer) {
                    CMTime frameTime = CMTimeMake(150, 30);
                    [adaptor appendPixelBuffer:sampleBuffer withPresentationTime:kCMTimeZero];
                    [adaptor appendPixelBuffer:sampleBuffer withPresentationTime:frameTime];
                    CFRelease(sampleBuffer);
                }
            }

            // 4) Finish the session:
            [writerInput markAsFinished];
            [videoWriter endSessionAtSourceTime:CMTimeMakeWithSeconds(5, 30.0)]; // optional; can call finishWriting without specifying an endTime
            // [videoWriter finishWriting]; // deprecated in iOS 6

            NSLog(@"to finish writing");
            [videoWriter finishWritingWithCompletionHandler:^{ // iOS 6.0+
                NSLog(@"%@", videoWriter);
                NSLog(@"finishWriting..");
                handler(videoUrl);

                ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
                [library writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:path]
                                            completionBlock:^(NSURL *assetURL, NSError *error) {
                    if (error != nil) {
                        NSLog(@"writeVideoAtPathToSavedPhotosAlbum error: %@", error);
                    }
                }];
            }];
        }];
    }

    + (CVPixelBufferRef)newPixelBufferFromCGImage:(CGImageRef)image
    {
        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                                 nil];
        CVPixelBufferRef pxbuffer = NULL;
        CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image));
        NSLog(@"width:%f", frameSize.width);
        NSLog(@"height:%f", frameSize.height);

        CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                              frameSize.width,
                                              frameSize.height,
                                              kCVPixelFormatType_32ARGB,
                                              (__bridge CFDictionaryRef)options,
                                              &pxbuffer);
        NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

        CVPixelBufferLockBaseAddress(pxbuffer, 0);
        void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
        NSParameterAssert(pxdata != NULL);

        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(pxdata,
                                                     frameSize.width,
                                                     frameSize.height,
                                                     8,
                                                     4 * frameSize.width,
                                                     rgbColorSpace,
                                                     (CGBitmapInfo)kCGImageAlphaNoneSkipFirst);
        NSParameterAssert(context);
        CGContextConcatCTM(context, CGAffineTransformIdentity);
        CGContextDrawImage(context, CGRectMake(0, 0, frameSize.width, frameSize.height), image);
        CGColorSpaceRelease(rgbColorSpace);
        CGContextRelease(context);

        CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
        return pxbuffer;
    }
3 answers

Try the following:

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey,
                             nil];
    AVAssetWriterInputPixelBufferAdaptor *adaptor =
        [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                         sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
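As a possible follow-up: once the adaptor has its own sourcePixelBufferAttributes, you can also pull buffers from the adaptor's pixel buffer pool instead of creating them yourself with CVPixelBufferCreate; buffers from the pool already have the pixel format and row layout the writer expects. A minimal sketch (the pool only becomes available after [videoWriter startWriting] has been called):

    // Take a buffer from the adaptor's pool instead of allocating one manually.
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                         adaptor.pixelBufferPool,
                                                         &pxbuffer);
    if (status == kCVReturnSuccess && pxbuffer != NULL) {
        // lock the buffer, draw the CGImage into it, append it via the adaptor,
        // then CFRelease(pxbuffer) just like in the question's code
    }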

I found your problem:

    (requestTime, image, actualTime, result, error) -> Void in
        if result == AVAssetImageGeneratorResult.Succeeded {
            let img: UIImage = UIImage(CGImage: image)!        // retain
            UIImageWriteToSavedPhotosAlbum(img, nil, nil, nil) // synchronous
            GenerateMovieFromImage.generateMovieWithImage(image, completionBlock: { (genMovieURL) in
                handler(genMovieURL)
            })
        }
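The key point is that the CGImageRef handed to the completion handler has to be kept alive for as long as it is used asynchronously; wrapping it in a UIImage, as above, is one way. If the surrounding code were Objective-C, the same idea could be sketched with explicit CF ownership (here handler just stands for whatever completion block you already have in scope):

    // Keep the extracted frame alive until the movie has been written.
    CGImageRef retainedImage = CGImageRetain(image);
    [GenerateMovieFromImage generateMovieWithImage:retainedImage
                                   completionBlock:^(NSURL *genMovieURL) {
        CGImageRelease(retainedImage); // balance the retain once writing is done
        handler(genMovieURL);
    }];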

I checked and everything works. If you still have problems, the issue is with your device.


It seems I have made some progress.

In the method + (CVPixelBufferRef)newPixelBufferFromCGImage:(CGImageRef)image, I changed:

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];

to:

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             [NSNumber numberWithInt:4 * frameSize.width], kCVPixelBufferBytesPerRowAlignmentKey,
                             nil];

The picture is no longer distorted, but it now appears to be scaled vertically to 0.5 of its normal height!

I am still working on a solution to this problem.
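One thing that may be worth checking at this point (an assumption on my part, not something stated above): CVPixelBufferCreate is free to pad each row, so the buffer's actual bytes per row can be larger than 4 * width. If the CGBitmapContext is created with a different bytesPerRow than the buffer really uses, the drawn image comes out skewed or squashed. A sketch of the drawing part of newPixelBufferFromCGImage: using the buffer's own row stride:

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pxbuffer); // may be larger than 4 * width

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata,
                                                 frameSize.width,
                                                 frameSize.height,
                                                 8,
                                                 bytesPerRow, // use the buffer's stride, not a hard-coded value
                                                 rgbColorSpace,
                                                 (CGBitmapInfo)kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, frameSize.width, frameSize.height), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);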


update:

I have completely solved this problem. It is caused by the track's transform attribute, preferredTransform:

    The transform specified in the track's storage container as the preferred
    transformation of the visual media data for display purposes.

This means that the actual orientation of the video data in the file may not be the same as the orientation you see during playback.

AVPlayer applies the transform stored in the file when it plays the video. However, generateCGImagesAsynchronouslyForTimes ignores this attribute and gives you, for example, a landscape image. Therefore, you need to set the transform back on the new video file so it matches the original.

Just add:

    writerInput.transform = CGAffineTransformMakeRotation(M_PI_2);

before:

    AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                         outputSettings:videoSettings]; // retain should be removed if ARC

and it will solve the problem.
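If you would rather not hard-code the 90° rotation, one possible variant (assuming you still have the source AVAsset around) is to copy the original track's preferredTransform onto the writer input:

    // asset is assumed to be the original AVAsset the frame was extracted from
    AVAssetTrack *sourceTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    if (sourceTrack) {
        // re-apply the source track's display transform so the new movie
        // plays back with the same orientation as the original video
        writerInput.transform = sourceTrack.preferredTransform;
    }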


I also ran into this error when I tried to create a video from an array of images and a music file. It is caused by the frame size of the video, so check the frame size you use when composing the video. For reference: http://size43.com/jqueryVideoTool.html
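One common interpretation of this (my assumption, not spelled out above): H.264 encodes in 16 x 16 macroblocks, so frame dimensions that are not multiples of 16 can sometimes produce distorted or padded output. A small check before building videoSettings could look like:

    // round the output dimensions down to a multiple of 16 before building videoSettings
    int width  = (int)CGImageGetWidth(image)  & ~15;
    int height = (int)CGImageGetHeight(image) & ~15;
    NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                     AVVideoWidthKey  : @(width),
                                     AVVideoHeightKey : @(height) };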

