Using Core Image in 3D

I have a working Core Video setup (a frame grabbed from a USB camera via QTKit), and the current frame is rendered as a texture on an arbitrary plane in 3D space in an NSOpenGLView subclass. So far so good, but I would like to apply some Core Image filter to this frame. I now have the basic code set up, and it renders my unprocessed video frame as before, but the final processed output CIImage is rendered as a screen-aligned quad into the view. It feels like an image blitted over my 3D rendering, which is exactly what I do not want!
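For reference, this is roughly how the CVOpenGLTextureRef ends up on the plane today (a sketch; the variable name currentFrame is illustrative, since I have not included my drawing code):

    // Core Video hands back a GL texture; on macOS its target is
    // typically GL_TEXTURE_RECTANGLE_ARB rather than GL_TEXTURE_2D.
    GLenum target = CVOpenGLTextureGetTarget(currentFrame);
    GLuint name   = CVOpenGLTextureGetName(currentFrame);

    glEnable(target);
    glBindTexture(target, name);
    // ... draw the textured plane with the usual modelview/projection ...
    glDisable(target);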

I am looking for a way to process my video frame (a CVOpenGLTextureRef) with Core Image and simply render the resulting image on my plane in 3D. Do I have to use offscreen rendering (save the viewport, set up a new viewport plus new model, view and projection matrices, and render into an FBO), or is there an easier way?
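Here is a minimal sketch of the offscreen approach I have in mind, assuming a CIContext created from the view's CGL context (e.g. with +contextWithCGLContext:pixelFormat:colorSpace:options:); the instance variables _fbo, _planeTexture, _filter, _ciContext, _width and _height are illustrative:

    // Run the frame through Core Image into an FBO-attached texture,
    // then sample that texture on the 3D plane as usual.
    - (void)renderFilteredFrame:(CVOpenGLTextureRef)frame
    {
        if (_fbo == 0) {
            // One-time setup: an empty RGBA texture plus an FBO targeting it.
            glGenTextures(1, &_planeTexture);
            glBindTexture(GL_TEXTURE_2D, _planeTexture);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, _width, _height, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, NULL);
            glGenFramebuffersEXT(1, &_fbo);
        }
        glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, _fbo);
        glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                                  GL_TEXTURE_2D, _planeTexture, 0);

        // Save the on-screen viewport and point GL at the offscreen target.
        glPushAttrib(GL_VIEWPORT_BIT);
        glViewport(0, 0, _width, _height);

        // Wrap the camera frame in a CIImage and apply the filter.
        CIImage *input = [CIImage imageWithCVImageBuffer:frame];
        [_filter setValue:input forKey:kCIInputImageKey];
        CIImage *output = [_filter valueForKey:kCIOutputImageKey];

        // Because the CIContext shares the view's GL context, this draw
        // lands in the currently bound FBO, i.e. in _planeTexture.
        [_ciContext drawImage:output
                       inRect:CGRectMake(0, 0, _width, _height)
                     fromRect:[output extent]];

        glPopAttrib();
        glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
        // _planeTexture now holds the filtered frame; bind it as a normal
        // GL_TEXTURE_2D when drawing the plane in the scene.
    }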

Thanks in advance!

1 answer

Try GPUImage! It is easy to use, and faster at processing than Core Image. It uses predefined or custom shaders (GLSL).
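A minimal sketch of the classic GPUImage chain (camera to filter to view), using the iOS-flavored class names from Brad Larson's framework; on the Mac the camera class is GPUImageAVCamera instead, and for the 3D-plane case you would route the filtered output to a texture rather than to an on-screen view:

    #import <GPUImage/GPUImage.h>

    // Capture from the camera, filter on the GPU, and display the result.
    GPUImageVideoCamera *camera =
        [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                            cameraPosition:AVCaptureDevicePositionBack];
    GPUImageSepiaFilter *sepia = [[GPUImageSepiaFilter alloc] init];
    GPUImageView *preview = [[GPUImageView alloc] initWithFrame:self.view.bounds];
    [self.view addSubview:preview];

    [camera addTarget:sepia];
    [sepia addTarget:preview];
    [camera startCameraCapture];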

