As others have suggested, you'll want to offload this work from the CPU to the GPU in order to get any kind of decent processing performance on these mobile devices.
To this end, I created an open source framework for iOS called GPUImage, which makes it relatively easy to do this kind of accelerated image processing. It requires OpenGL ES 2.0 support, but every iOS device sold in the last couple of years has this (statistics put it at roughly 97% of iOS devices in the field).
As part of this framework, one of the filters I've bundled is a pixellation filter. The SimpleVideoFilter sample application shows how to use it, with a slider that controls the pixel width in the processed image.

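To give an idea of how such a camera-to-filter pipeline is put together, here is a minimal sketch in Objective-C. The class name, the GPUImageView outlet, and the slider action are illustrative assumptions, not code copied from the sample application:

#import <UIKit/UIKit.h>
#import "GPUImage.h"

@interface PixellateViewController : UIViewController
// Onscreen render target, e.g. placed in a storyboard (hypothetical outlet name).
@property (nonatomic, strong) IBOutlet GPUImageView *filterView;
@end

@implementation PixellateViewController
{
    GPUImageVideoCamera *videoCamera;
    GPUImagePixellateFilter *pixellateFilter;
}

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Capture 640x480 frames from the back camera.
    videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                                      cameraPosition:AVCaptureDevicePositionBack];
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

    // Pixellate each frame on the GPU; the property is a fraction of the image width.
    pixellateFilter = [[GPUImagePixellateFilter alloc] init];
    pixellateFilter.fractionalWidthOfAPixel = 0.05;

    // Chain camera -> filter -> onscreen view, then start capturing.
    [videoCamera addTarget:pixellateFilter];
    [pixellateFilter addTarget:self.filterView];
    [videoCamera startCameraCapture];
}

// Hypothetical slider action, analogous to the sample app's pixel-width slider.
- (IBAction)pixelWidthChanged:(UISlider *)sender
{
    pixellateFilter.fractionalWidthOfAPixel = sender.value;
}

@end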
This filter is the result of a fragment shader with the following GLSL code:
varying highp vec2 textureCoordinate;

uniform sampler2D inputImageTexture;
uniform highp float fractionalWidthOfPixel;

void main()
{
    highp vec2 sampleDivisor = vec2(fractionalWidthOfPixel);

    highp vec2 samplePos = textureCoordinate - mod(textureCoordinate, sampleDivisor);
    gl_FragColor = texture2D(inputImageTexture, samplePos);
}
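The shader snaps each texture coordinate down to the corner of its containing block, so every fragment in a block samples the same source texel. If you want to experiment with shader code like this without subclassing, GPUImage can also build a filter directly from fragment shader source. A minimal sketch, assuming kPixellateShaderString is an NSString holding the GLSL above:

// Construct a generic filter from raw fragment shader source.
GPUImageFilter *customPixellate =
    [[GPUImageFilter alloc] initWithFragmentShaderFromString:kPixellateShaderString];

// Set the shader uniform by name; 0.05 means each output "pixel"
// spans 5% of the image width.
[customPixellate setFloat:0.05 forUniformName:@"fractionalWidthOfPixel"];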
In my tests, GPU-based filters like this run 6X-24X faster than equivalent CPU-bound processing routines for images and video on iOS. The above framework is easy to incorporate into an application, and the source code is freely available for you to customize as you see fit.