iOS: resize / crop recorded video

What would be the best way to resize user-recorded video on the iPhone?

Currently, I have the ability to receive video in two ways:

1) Get the file URL from UIImagePickerController after the video was recorded

2) Get the frames of the video as they are captured, using AVCaptureSession

For 1) I will need something that can read .mov h.264 files and spit out individual frames. Is there such a thing?

For 2) I was thinking about getting a UIImage for each frame, resizing the image and then recompiling the video with something like AVAssetWriter. But this seems like a very intensive operation, and I wonder if there is a better way to approach this problem.

Is the general idea of resizing video to resize each individual frame and then recompile the video? Or is there a way to directly resize the entire video?

I just need the user to record a video, and then resize this video to 320x320, without fancy editing.

Thanks for any help you can provide.

Edit: Maybe “resize” is the wrong word. I just need to crop the video from 480x360 down to 320x320. Also, this cropping is not required in real time; I can do it once the video has been recorded.

+3
4 answers

Alternative 2 will not work in real time.

For 1, use AVAssetExportSession and set its preset to AVAssetExportPreset640x480.

There are also AVAssetExportPresetLowQuality, AVAssetExportPresetMediumQuality and AVAssetExportPresetHighestQuality presets.

Note that the recorded video is 16:9 (or 4:3), so an export preset alone will not give you an exact 320x320 output; if you need that exact size, you will still have to crop the result yourself.
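
A minimal sketch of the export (here videoURL is the movie URL you get back from UIImagePickerController and outputURL is a writable destination of your choosing; both are placeholders):

AVAsset *asset = [AVAsset assetWithURL:videoURL];   // videoURL: the recorded .mov (placeholder)
AVAssetExportSession *session =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPreset640x480];
session.outputURL = outputURL;                      // outputURL: where to write the result (placeholder)
session.outputFileType = AVFileTypeQuickTimeMovie;

[session exportAsynchronouslyWithCompletionHandler:^{
    if (session.status == AVAssetExportSessionStatusCompleted) {
        // The resized movie is now at outputURL.
    } else {
        NSLog(@"Export failed: %@", session.error);
    }
}];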

+3

If you want full control over the output size, you can do it yourself with AVFoundation, using an AVMutableVideoComposition together with an AVMutableComposition. Set the renderSize of the video composition to the output dimensions you want. Then add an AVMutableVideoCompositionLayerInstruction whose transform scales and positions the source track within that render area, and export the result.
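
A rough sketch of that approach, assuming a 480x360 source cropped to a centered 320x320 square (asset and outputURL are placeholders you supply):

AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];

// Copy the source track into a mutable composition.
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                          ofTrack:videoTrack
                           atTime:kCMTimeZero
                            error:nil];

// The render size is the output size: everything outside it is cropped away.
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = CGSizeMake(320, 320);
videoComposition.frameDuration = CMTimeMake(1, 30);

// Shift the 480x360 frame left by 80 and up by 20 so the 320x320 crop is centered.
AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionTrack];
[layerInstruction setTransform:CGAffineTransformMakeTranslation(-80, -20) atTime:kCMTimeZero];

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
instruction.layerInstructions = @[layerInstruction];
videoComposition.instructions = @[instruction];

// Export the composition with the video composition applied.
AVAssetExportSession *export =
    [[AVAssetExportSession alloc] initWithAsset:composition
                                     presetName:AVAssetExportPresetHighestQuality];
export.videoComposition = videoComposition;
export.outputURL = outputURL;
export.outputFileType = AVFileTypeQuickTimeMovie;
[export exportAsynchronouslyWithCompletionHandler:^{ /* check export.status */ }];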

+3

Another option is to re-encode the video with AVAssetReader and AVAssetWriter. Example:

// Writer settings: H.264 at 480x480; aspect-fill scaling crops the source to fill the square.
NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                          AVVideoCodecH264, AVVideoCodecKey,
                          [NSNumber numberWithInt:480], AVVideoWidthKey,
                          [NSNumber numberWithInt:480], AVVideoHeightKey,
                          AVVideoScalingModeResizeAspectFill, AVVideoScalingModeKey,
                          nil];

NSLog(@"%@", [assetVideoTrack mediaType]);
assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:[assetVideoTrack mediaType]
                                                           outputSettings:settings];
[assetWriter addInput:assetWriterVideoInput];

See Apple's programming guide and the AVAssetReader documentation for details.
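
A rough sketch of the rest of that pipeline, reusing the settings dictionary above (swap 480 for 320 if you want a 320x320 output); asset and outputURL are placeholders, and error handling is omitted for brevity:

AVAssetTrack *assetVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];

// Reader: decode the source track into raw pixel buffers.
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:asset error:nil];
AVAssetReaderTrackOutput *readerOutput =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:assetVideoTrack
        outputSettings:@{ (id)kCVPixelBufferPixelFormatTypeKey :
                          @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) }];
[assetReader addOutput:readerOutput];

// Writer: re-encode at the new size using the output settings shown above.
AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outputURL
                                                      fileType:AVFileTypeQuickTimeMovie
                                                         error:nil];
AVAssetWriterInput *assetWriterVideoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:settings];
assetWriterVideoInput.expectsMediaDataInRealTime = NO;
[assetWriter addInput:assetWriterVideoInput];

[assetReader startReading];
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];

// Pull samples from the reader and feed them to the writer until the track runs out.
dispatch_queue_t queue = dispatch_queue_create("video.rewrite", NULL);
[assetWriterVideoInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
    while (assetWriterVideoInput.isReadyForMoreMediaData) {
        CMSampleBufferRef sampleBuffer = [readerOutput copyNextSampleBuffer];
        if (sampleBuffer) {
            [assetWriterVideoInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        } else {
            [assetWriterVideoInput markAsFinished];
            [assetWriter finishWritingWithCompletionHandler:^{ /* done */ }];
            break;
        }
    }
}];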

+2

Here is a solution in Swift 2. It also takes the video track's orientation into account:

extension AVAsset {

  private var g_naturalSize: CGSize {
    return tracksWithMediaType(AVMediaTypeVideo).first?.naturalSize ?? .zero
  }

  var g_correctSize: CGSize {
    return g_isPortrait ? CGSize(width: g_naturalSize.height, height: g_naturalSize.width) : g_naturalSize
  }

  var g_isPortrait: Bool {
    let portraits: [UIInterfaceOrientation] = [.Portrait, .PortraitUpsideDown]
    return portraits.contains(g_orientation)
  }

  // Same as UIImageOrientation
  var g_orientation: UIInterfaceOrientation {
    guard let transform = tracksWithMediaType(AVMediaTypeVideo).first?.preferredTransform else {
      return .Portrait
    }

    switch (transform.tx, transform.ty) {
    case (0, 0):
      return .LandscapeRight
    case (g_naturalSize.width, g_naturalSize.height):
      return .LandscapeLeft
    case (0, g_naturalSize.width):
      return .PortraitUpsideDown
    default:
      return .Portrait
    }
  }
}

func transform(avAsset: AVAsset, scaleFactor: CGFloat) -> CGAffineTransform {
    let offset: CGPoint
    let angle: Double

    switch avAsset.g_orientation {
    case .LandscapeLeft:
      offset = CGPoint(x: avAsset.g_correctSize.width, y: avAsset.g_correctSize.height)
      angle = M_PI
    case .LandscapeRight:
      offset = CGPoint.zero
      angle = 0
    case .PortraitUpsideDown:
      offset = CGPoint(x: 0, y: avAsset.g_correctSize.height)
      angle = -M_PI_2
    default:
      offset = CGPoint(x: avAsset.g_correctSize.width, y: 0)
      angle = M_PI_2
    }

    let scale = CGAffineTransformMakeScale(scaleFactor, scaleFactor)
    let translation = CGAffineTransformTranslate(scale, offset.x, offset.y)
    let rotation = CGAffineTransformRotate(translation, CGFloat(angle))

    return rotation
}

Then apply it to the layer instruction:

let layer = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
layer.setTransform(transform(avAsset, scaleFactor: 0.8), atTime: kCMTimeZero)

0

Source: https://habr.com/ru/post/1792105/

