On desktop OS X this is possible using a CIFilter on a CALayer. However, according to the CALayer documentation:
While the CALayer class exposes this property, Core Image is not available in iOS. Currently the filters available for this property are undefined.
Your best bet is to implement this with an OpenGL fragment shader. The hard part is getting access to the content of the view behind the region you want to transform, in real time.
I have yet to see anything that works on live (scrolling) content on iOS. All the similar animations, such as Mail.app's trash-can animation, page curls, etc., work on static content (for example, the view is rendered to an image once, and that image is then transformed).
As far as I can tell, you would have to:
- Render the view to an image at some interval
- Transform the image using your OpenGL shader
- Draw the OpenGL output on top of the actual content
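As a minimal sketch of the shader in the second step, here is an OpenGL ES 2.0 fragment shader that samples the view snapshot as a texture and applies a simple transform (desaturation, as a stand-in for whatever effect you want; the uniform and varying names are illustrative, not from any particular framework):

```glsl
// OpenGL ES 2.0 fragment shader (names are illustrative).
precision mediump float;

varying vec2 v_texCoord;        // interpolated texture coordinate from the vertex shader
uniform sampler2D u_snapshot;   // texture holding the rendered view snapshot

void main() {
    vec4 color = texture2D(u_snapshot, v_texCoord);
    // Example transform: desaturate using standard luminance weights.
    float gray = dot(color.rgb, vec3(0.299, 0.587, 0.114));
    gl_FragColor = vec4(vec3(gray), color.a);
}
```

You would upload each snapshot of the view region as `u_snapshot` and draw a quad covering the affected area, replacing the desaturation with your actual effect.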
Because you would need to poll the view, render it to an image, and display it in an OpenGL view covering part of the screen, I would expect performance to be less than optimal.