A fast and efficient Swift 3 implementation for iOS 9/10. I tried every image filtering method I could find for processing 100 images at a time (loading them with an AlamofireImage ImageFilter), and this method came out FAR better than any other I tried (for my use case) in terms of memory and speed. Here it is as a UIImage extension:
import UIKit

extension UIImage {
    func convertToGrayscale() -> UIImage? {
        UIGraphicsBeginImageContextWithOptions(self.size, false, self.scale)
        defer { UIGraphicsEndImageContext() }
        let imageRect = CGRect(x: 0.0, y: 0.0, width: self.size.width, height: self.size.height)
        guard let context = UIGraphicsGetCurrentContext() else { return nil }

        // Draw a white background
        context.setFillColor(UIColor.white.cgColor)
        context.fill(imageRect)

        // Optional: a colorDodge pass at ~0.7 alpha to lighten / add contrast (see the note below)
        self.draw(in: imageRect, blendMode: .colorDodge, alpha: 0.7)

        // Draw the image in luminosity mode over the white background to get the grayscale result
        self.draw(in: imageRect, blendMode: .luminosity, alpha: 1.0)

        return UIGraphicsGetImageFromCurrentImageContext()
    }
}
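If you just want to call it directly (outside of AlamofireImage), it's as simple as this (the asset name here is only a placeholder):

let grayImage = UIImage(named: "photo")?.convertToGrayscale()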
Regarding colorDodge: I initially had trouble getting my images light enough to match the grayscale produced by CIFilter("CIPhotoEffectTonal") - my results were too dark. I got a decent match by adding a CGBlendMode.colorDodge pass at ~0.7 alpha, which seems to increase the overall contrast.
Other blend modes may work as well, but you will probably want to apply them before luminosity, which is the blend that actually produces the grayscale effect. I found this page a very useful reference for the different CGBlendModes.
Performance gains achieved: I needed to process 100 thumbnails downloaded from a server (using AlamofireImage to load, cache, and apply the filter). My app started to crash once the combined size of the images exceeded the image cache, so I experimented with other methods.
The CoreImage-based CIFilter approach was the first I tried, and it was not memory-efficient enough for the number of images I was processing.
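For reference, this is roughly what that CIFilter attempt looked like - a minimal sketch, not my exact code, and the helper name tonalFiltered is just illustrative:

import UIKit
import CoreImage

func tonalFiltered(_ image: UIImage, using context: CIContext) -> UIImage? {
    guard let input = CIImage(image: image),
        let filter = CIFilter(name: "CIPhotoEffectTonal") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    guard let output = filter.outputImage,
        let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
}

// A default CIContext (create it once and reuse it across images)
let context = CIContext()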
I also tried applying the CIFilter on the GPU using EAGLContext(api: .openGLES3), which was actually even more memory-intensive - I was getting memory warnings at 450+ MB of usage while loading 200+ images.
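That variant only differed in how the CIContext was built - again just a sketch (EAGLContext needs import OpenGLES, and the whole EAGL route has since been deprecated by Apple):

import CoreImage
import OpenGLES

// GPU-backed context to pass into the filter helper above
if let eaglContext = EAGLContext(api: .openGLES3) {
    let gpuContext = CIContext(eaglContext: eaglContext)
    // ... run the CIFilter using gpuContext instead of the default context
}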
I also tried drawing into a grayscale bitmap context, i.e. CGContext(data: nil, width: width, height: height, bitsPerComponent: 8, bytesPerRow: 0, space: colorSpace, bitmapInfo: CGImageAlphaInfo.none.rawValue). That worked OK, except I couldn't get a high enough resolution for a modern Retina screen - the images came out very grainy even after I added context.scaleBy(x: scaleFactor, y: scaleFactor).
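Here's roughly what that bitmap-context attempt looked like - a sketch with a hypothetical method name, included only for comparison:

import UIKit

extension UIImage {
    // Render into an 8-bit grayscale bitmap context (this is the version that came out grainy for me)
    func grayscaleViaBitmapContext() -> UIImage? {
        guard let cgImage = self.cgImage else { return nil }
        let width = Int(self.size.width * self.scale)
        let height = Int(self.size.height * self.scale)
        let colorSpace = CGColorSpaceCreateDeviceGray()
        guard let context = CGContext(data: nil, width: width, height: height,
                                      bitsPerComponent: 8, bytesPerRow: 0,
                                      space: colorSpace,
                                      bitmapInfo: CGImageAlphaInfo.none.rawValue) else { return nil }
        context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
        guard let grayCGImage = context.makeImage() else { return nil }
        return UIImage(cgImage: grayCGImage, scale: self.scale, orientation: self.imageOrientation)
    }
}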
So, of everything I tried, this method (drawing into a UIGraphics image context) is VASTLY more efficient in speed and memory when applied as an AlamofireImage filter. I see under 200 MB of memory while processing my 200+ images, and they load essentially instantly rather than the roughly 35 seconds the OpenGL/EAGL approach needed. I know these aren't very scientific measurements; I'll profile it properly if anyone is really curious :)
And finally, if you need to pass this grayscale filter to AlamofireImage, here's how to do it (note that you must import AlamofireImage into your class to use ImageFilter):
public struct GrayScaleFilter: ImageFilter {
    public init() {}

    public var filter: (UIImage) -> UIImage {
        return { image in
            return image.convertToGrayscale() ?? image
        }
    }
}
To use it, create the filter and pass it to af_setImage like this:
let filter = GrayScaleFilter()
imageView.af_setImage(withURL: url, filter: filter)