I am trying to add a B&W filter to the ARSCNView camera feed and then render colored AR objects on top of it.

I'm almost there with the following code, added at the top of - (void)renderer:(id<SCNSceneRenderer>)aRenderer updateAtTime:(NSTimeInterval)time:
CVPixelBufferRef bg = self.sceneView.session.currentFrame.capturedImage;
if (bg) {
    CVPixelBufferLockBaseAddress(bg, 0);
    // Plane 1 of the bi-planar YCbCr buffer holds interleaved CbCr bytes;
    // 128 is the chroma midpoint, so this turns every pixel gray.
    uint8_t *chroma = CVPixelBufferGetBaseAddressOfPlane(bg, 1);
    if (chroma) {
        size_t height = CVPixelBufferGetHeightOfPlane(bg, 1);
        size_t stride = CVPixelBufferGetBytesPerRowOfPlane(bg, 1);
        memset(chroma, 128, stride * height);
    }
    CVPixelBufferUnlockBaseAddress(bg, 0);
}
This runs very fast on mobile devices, but here's the thing: occasionally a full-color frame is displayed. I checked that my filtering code executes on every frame, so I suspect it simply runs too late: by then the SceneKit pipeline has already consumed the camera image.
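To make the chroma trick itself concrete, here is a small C model of the bi-planar 4:2:0 YCbCr layout that ARKit's capturedImage uses (plane 0 = luma, plane 1 = interleaved Cb/Cr). The names and the tiny buffer sizes are illustrative only, not CoreVideo API; the point is that writing 128 over the CbCr plane decodes to pure gray while preserving luma:

```c
#include <stdint.h>
#include <string.h>

/* Toy stand-in for the two planes of a 4x4 bi-planar YCbCr buffer.
   In the real code these come from CVPixelBufferGetBaseAddressOfPlane. */
#define W 4
#define H 4
static uint8_t yPlane[W * H];                       /* plane 0: Y        */
static uint8_t cbcrPlane[(W / 2) * (H / 2) * 2];    /* plane 1: CbCr     */

/* The memset from the question: 128 is the zero point of Cb and Cr. */
static void neutralize_chroma(void) {
    memset(cbcrPlane, 128, sizeof cbcrPlane);
}

/* Full-range BT.601 YCbCr -> RGB for a single pixel. With cb == cr == 128
   all three chroma terms vanish and R = G = B = Y. */
static void ycbcr_to_rgb(uint8_t y, uint8_t cb, uint8_t cr,
                         int *r, int *g, int *b) {
    *r = (int)(y + 1.402 * (cr - 128));
    *g = (int)(y - 0.344 * (cb - 128) - 0.714 * (cr - 128));
    *b = (int)(y + 1.772 * (cb - 128));
}
```

After neutralize_chroma(), converting any pixel back to RGB gives equal R, G, and B components, i.e. grayscale with the original brightness, which is why the memset approach looks correct whenever it wins the race with the renderer.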
Running the code earlier would help, but updateAtTime is the earliest point where you can add custom frame-by-frame code.
Receiving frame notifications could also help, but ARKit does not seem to expose its AVCaptureSession.
A custom Metal pipeline (grab the ARKit camera texture, convert it to RGB, filter it, and hand the result back to SceneKit) would probably work, but it feels like overkill for this.
Maybe there is a simpler way.
How can I get a reliably B&W background?