I am trying to get to the bottom of this problem with no luck. I have a very simple Swift command-line application that takes one argument: the path of the image to load. It crops the image to 200x200 and filters the cropped fragment with the CISepiaTone filter.

It works great, but the whole process takes 600 ms on my MacBook Air. When I instead resize (rather than crop) the input image to the same size (200x200), it takes 150 ms.

Why? In both cases I am filtering an image of size 200x200. I am using this specific image for testing (5966x3978).
UPDATE:
This is the specific line of code that takes about four times as long when dealing with the cropped image:

```swift
var ciImage: CIImage = CIImage(cgImage: cgImage)
```
END OF UPDATE
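For reference, the timings above are simple wall-clock measurements around individual statements. A minimal sketch of one way to take such a per-line measurement (the `measureMillis` helper is illustrative, not part of the original code):

```swift
import Foundation

// Illustrative timing helper: runs a closure once and reports the
// elapsed wall-clock time in milliseconds alongside the result.
func measureMillis<T>(_ work: () -> T) -> (result: T, millis: Double) {
    let start = CFAbsoluteTimeGetCurrent()
    let result = work()
    let millis = (CFAbsoluteTimeGetCurrent() - start) * 1000.0
    return (result, millis)
}

// Usage against the line in question (assumes `cgImage` from the snippets below):
// let (ciImage, ms) = measureMillis { CIImage(cgImage: cgImage) }
// print("CIImage(cgImage:) took \(ms) ms")
```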
Code for cropping (200x200):
```swift
import AppKit
import CoreImage

// parse args and get image path
let args: [String] = CommandLine.arguments
let inputFile: String = args[args.count - 1]
let inputURL: URL = URL(fileURLWithPath: inputFile)

// load the image from path into NSImage
// and convert NSImage into CGImage
guard let nsImage = NSImage(contentsOf: inputURL),
      var cgImage = nsImage.cgImage(forProposedRect: nil, context: nil, hints: nil) else {
    exit(EXIT_FAILURE)
}

// CROP THE IMAGE TO 200x200
// THIS IS THE ONLY BLOCK OF CODE THAT IS DIFFERENT
// IN THOSE TWO EXAMPLES
let rect = CGRect(x: 0, y: 0, width: 200, height: 200)
if let croppedImage = cgImage.cropping(to: rect) {
    cgImage = croppedImage
} else {
    exit(EXIT_FAILURE)
}
// END CROPPING

// convert CGImage to CIImage
let ciImage: CIImage = CIImage(cgImage: cgImage)

// initiate SepiaTone
guard let sepiaFilter = CIFilter(name: "CISepiaTone") else {
    exit(EXIT_FAILURE)
}
sepiaFilter.setValue(ciImage, forKey: kCIInputImageKey)
sepiaFilter.setValue(0.5, forKey: kCIInputIntensityKey)
guard let result = sepiaFilter.outputImage else {
    exit(EXIT_FAILURE)
}

// perform filtering in a GPU context
let context: CIContext = CIContext()
guard let output = context.createCGImage(result, from: ciImage.extent) else {
    exit(EXIT_FAILURE)
}
```
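One experiment that might narrow this down (not part of the original code): Apple's documentation for `cropping(to:)` notes that the resulting image retains a reference to the original image, so the cropped `CGImage` may still be backed by the full 5966x3978 pixel buffer. Cropping by drawing into a fresh 200x200 bitmap context instead forces a small, self-owned backing store; a sketch, with `cropByDrawing` being an illustrative name:

```swift
import CoreGraphics

// Illustrative experiment: crop by drawing the wanted region into a fresh
// bitmap context, so the resulting CGImage owns its own small backing
// buffer instead of referencing the original image's pixel data.
func cropByDrawing(_ image: CGImage, to rect: CGRect) -> CGImage? {
    guard let ctx = CGContext(data: nil,
                              width: Int(rect.width),
                              height: Int(rect.height),
                              bitsPerComponent: image.bitsPerComponent,
                              bytesPerRow: 0, // let CoreGraphics choose the stride
                              space: image.colorSpace ?? CGColorSpaceCreateDeviceRGB(),
                              bitmapInfo: image.bitmapInfo.rawValue) else {
        return nil
    }
    // cropping(to:) measures `rect` from the image's top-left corner, while
    // CGContext draws with a bottom-left origin, so the y offset is flipped.
    ctx.draw(image, in: CGRect(x: -rect.minX,
                               y: rect.maxY - CGFloat(image.height),
                               width: CGFloat(image.width),
                               height: CGFloat(image.height)))
    return ctx.makeImage()
}
```

If `CIImage(cgImage:)` is fast on an image produced this way, the extra 450 ms presumably comes from the shared backing store, not from the crop itself.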
Code for resizing (200x200):
```swift
import AppKit
import CoreImage

// parse args and get image path
let args: [String] = CommandLine.arguments
let inputFile: String = args[args.count - 1]
let inputURL: URL = URL(fileURLWithPath: inputFile)

// load the image from path into NSImage
// and convert NSImage into CGImage
guard let nsImage = NSImage(contentsOf: inputURL),
      var cgImage = nsImage.cgImage(forProposedRect: nil, context: nil, hints: nil) else {
    exit(EXIT_FAILURE)
}

// RESIZE THE IMAGE TO 200x200
// THIS IS THE ONLY BLOCK OF CODE THAT IS DIFFERENT
// IN THOSE TWO EXAMPLES
guard let cgContext = CGContext(data: nil,
                                width: 200,
                                height: 200,
                                bitsPerComponent: cgImage.bitsPerComponent,
                                bytesPerRow: cgImage.bytesPerRow,
                                space: cgImage.colorSpace ?? CGColorSpaceCreateDeviceRGB(),
                                bitmapInfo: cgImage.bitmapInfo.rawValue) else {
    exit(EXIT_FAILURE)
}
cgContext.draw(cgImage, in: CGRect(x: 0, y: 0, width: 200, height: 200))
if let resizeOutput = cgContext.makeImage() {
    cgImage = resizeOutput
}
// END RESIZING

// convert CGImage to CIImage
let ciImage: CIImage = CIImage(cgImage: cgImage)

// initiate SepiaTone
guard let sepiaFilter = CIFilter(name: "CISepiaTone") else {
    exit(EXIT_FAILURE)
}
sepiaFilter.setValue(ciImage, forKey: kCIInputImageKey)
sepiaFilter.setValue(0.5, forKey: kCIInputIntensityKey)
guard let result = sepiaFilter.outputImage else {
    exit(EXIT_FAILURE)
}

// perform filtering in a GPU context
let context: CIContext = CIContext()
guard let output = context.createCGImage(result, from: ciImage.extent) else {
    exit(EXIT_FAILURE)
}
```