Applying CIFilter to a video file and saving it

Is there any quick, easy way to apply a CIFilter to a video? Before asking, I looked at GPUImage. It looks like very powerful magic code, but it's really overkill for what I'm trying to do.

Essentially, I would like to:

  • Take a video file, say, saved at /tmp/myVideoFile.mp4
  • Apply a CIFilter to this video file
  • Save the video file to another (or the same) location, say /tmp/anotherVideoFile.mp4

I was able to apply a CIFilter to a video during playback very easily and quickly using AVPlayerItemVideoOutput:

let player = AVPlayer(playerItem: AVPlayerItem(asset: video))
let output = AVPlayerItemVideoOutput(pixelBufferAttributes: nil)
player.currentItem?.add(output)
player.play()

let displayLink = CADisplayLink(target: self, selector: #selector(displayLinkDidRefresh(_:)))
displayLink.add(to: .main, forMode: .commonModes)

@objc func displayLinkDidRefresh(_ link: CADisplayLink) {
    let itemTime = output.itemTime(forHostTime: CACurrentMediaTime())
    if output.hasNewPixelBuffer(forItemTime: itemTime) {
        if let pixelBuffer = output.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil) {
            let image = CIImage(cvPixelBuffer: pixelBuffer)
            // apply filters to image
            // display image
        }
    }
}
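The two trailing comments above leave the filtering and display steps open. A minimal sketch of one way to fill them in, assuming a `UIImageView` named `imageView` and a sepia filter chosen purely for illustration:

```swift
import AVFoundation
import CoreImage
import UIKit

// Create the CIContext once and reuse it; contexts are expensive to build.
let context = CIContext()

func render(_ pixelBuffer: CVPixelBuffer, into imageView: UIImageView) {
    // Apply filters to image:
    let image = CIImage(cvPixelBuffer: pixelBuffer)
        .applyingFilter("CISepiaTone", withInputParameters: [kCIInputIntensityKey: 0.8])

    // Display image:
    if let cgImage = context.createCGImage(image, from: image.extent) {
        imageView.image = UIImage(cgImage: cgImage)
    }
}
```

In a real app you would more likely render into a Metal- or GL-backed view than round-trip through `UIImage`, but this shows the shape of the per-frame work.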

However, that only applies the filter to the video while it's playing back through AVPlayer. I want to apply the filter to the video file itself and save the result, and I can't figure out how to do that.

Essentially, I'm looking for something like this (pseudocode):

var newVideo = AVMutableAsset() // We'll just pretend like this is a thing

let originalVideo = AVAsset(url: URL(fileURLWithPath: "/example/location.mp4"))
originalVideo.getAllFrames { (pixelBuffer: CVPixelBuffer) -> Void in
    let image = CIImage(cvPixelBuffer: pixelBuffer)
        .applyingFilter("Filter", withInputParameters: [:])

    newVideo.addFrame(image)
}

newVideo.exportTo(url: URL(fileURLWithPath: "/this/isAnother/example.mp4"))

Is there any way to do this (preferably without GPUImage, which I only found for iOS 7)? Essentially, I want to take an AVAsset, apply a CIFilter to its frames, and save the result as a new asset.


In iOS 9 / OS X 10.11 / tvOS there's a convenient method for applying CIFilters to video. It works with an AVVideoComposition, so you can use it both for playback and for file-to-file import/export. See AVVideoComposition.init(asset:applyingCIFiltersWithHandler:) for the method docs.

There's an example in Apple's Core Image documentation:

let filter = CIFilter(name: "CIGaussianBlur")!
let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in

    // Clamp to avoid blurring transparent pixels at the image edges
    let source = request.sourceImage.clampingToExtent()
    filter.setValue(source, forKey: kCIInputImageKey)

    // Vary filter parameters based on video timing
    let seconds = CMTimeGetSeconds(request.compositionTime)
    filter.setValue(seconds * 10.0, forKey: kCIInputRadiusKey)

    // Crop the blurred output to the bounds of the original image
    let output = filter.outputImage!.cropping(to: request.sourceImage.extent)

    // Provide the filter output to the composition
    request.finish(with: output, context: nil)
})

That's the hard part: setting up the composition. Once you've done that, you can either play the result with AVPlayer or write it to a file with AVAssetExportSession. Here's an example of the latter:
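For the playback case, a minimal sketch: the same composition is simply attached to an AVPlayerItem (here `asset` and `composition` are the objects built in the snippet above).

```swift
import AVFoundation

// Play the filtered video: the filter handler runs per frame during playback.
let playerItem = AVPlayerItem(asset: asset)
playerItem.videoComposition = composition
let player = AVPlayer(playerItem: playerItem)
player.play()
```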

let export = AVAssetExportSession(asset: asset, presetName: AVAssetExportPreset1920x1080)!
export.outputFileType = AVFileTypeQuickTimeMovie
export.outputURL = outURL
export.videoComposition = composition

export.exportAsynchronously(completionHandler: /*...*/)

There's more about this in the WWDC15 session on Core Image, starting around the 20-minute mark.

If you need a solution that also works on earlier OS versions, it gets more complicated.

Aside: think about how far back you really need to support. As of mid-2016, 87% of devices were running iOS 9.0 or later, and 97% were on iOS 8.0 or later. Putting a lot of effort into supporting a small slice of your potential customer base may not be worth the cost.

Either way, you'll be getting CVPixelBuffers for the source frames, creating CIImages from them, applying filters, and rendering the results back out to new CVPixelBuffers.

An AVVideoComposition with a custom compositor is the more flexible route, because a composition works for both playback and export; the alternative is to read and write frames yourself with AVAssetReader and AVAssetWriter.
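For the reader/writer route, here is a minimal sketch of the shape of that loop. It assumes a single video track, ignores audio, and busy-waits instead of using `requestMediaDataWhenReady(on:using:)`; error handling is omitted throughout, so treat it as an outline rather than production code.

```swift
import AVFoundation
import CoreImage

// Read each frame, filter its pixel buffer through Core Image, append the result.
func filterVideo(asset: AVAsset, to outputURL: URL, filterName: String) throws {
    let track = asset.tracks(withMediaType: AVMediaTypeVideo)[0]

    let reader = try AVAssetReader(asset: asset)
    let readerOutput = AVAssetReaderTrackOutput(track: track, outputSettings:
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    reader.add(readerOutput)

    let writer = try AVAssetWriter(outputURL: outputURL, fileType: AVFileTypeQuickTimeMovie)
    let writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: track.naturalSize.width,
        AVVideoHeightKey: track.naturalSize.height])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: writerInput, sourcePixelBufferAttributes: nil)
    writer.add(writerInput)

    let context = CIContext()
    reader.startReading()
    writer.startWriting()
    writer.startSession(atSourceTime: kCMTimeZero)

    while let sample = readerOutput.copyNextSampleBuffer(),
          let source = CMSampleBufferGetImageBuffer(sample) {
        let time = CMSampleBufferGetPresentationTimeStamp(sample)
        let filtered = CIImage(cvPixelBuffer: source)
            .applyingFilter(filterName, withInputParameters: [:])

        // The adaptor's pixel buffer pool is available once writing has started.
        var target: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(nil, adaptor.pixelBufferPool!, &target)
        context.render(filtered, to: target!)

        while !writerInput.isReadyForMoreMediaData { /* spin; use requestMediaDataWhenReady in real code */ }
        adaptor.append(target!, withPresentationTime: time)
    }

    writerInput.markAsFinished()
    writer.finishWriting {}
}
```

Apple's AVFoundation sample code shows the reading/writing half of this; the Core Image filtering in the middle is the part you add.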


Source: https://habr.com/ru/post/1652275/
