I am trying to process images in real time on an iPhone 6 at 240 frames per second. The problem is that when I shoot video at that rate, I can't process the frames fast enough, since I need to iterate over every pixel to get the average value. Reducing the image resolution would easily solve this, but I cannot figure out how to do it. The available AVCaptureDeviceFormat options include one at 192x144 px, but only at 30 frames per second. All of the 240 fps formats have much larger resolutions. This is how I capture the data:
- (void)startDetection
{
    const int FRAMES_PER_SECOND = 240;
    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPresetLow;
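For reference, this is a minimal sketch of how one might pick the frame rate and format on the device itself rather than through the session preset: walk device.formats, keep the lowest-resolution format whose frame-rate range reaches 240 fps, and lock the frame duration to 1/240 s. Here "device" is assumed to be the back-camera AVCaptureDevice that feeds the session above, and the helper name is illustrative.

#import <AVFoundation/AVFoundation.h>

static void selectSmallest240FPSFormat(AVCaptureDevice *device)
{
    AVCaptureDeviceFormat *bestFormat = nil;
    int64_t bestArea = INT64_MAX;

    for (AVCaptureDeviceFormat *format in device.formats) {
        CMVideoDimensions dims =
            CMVideoFormatDescriptionGetDimensions(format.formatDescription);
        int64_t area = (int64_t)dims.width * dims.height;

        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
            // Keep the smallest format that can still reach 240 fps.
            if (range.maxFrameRate >= 240.0 && area < bestArea) {
                bestFormat = format;
                bestArea = area;
            }
        }
    }

    if (bestFormat == nil) {
        return; // this device offers no 240 fps format at all
    }

    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        // Setting activeFormat makes the device format take priority
        // over the session preset.
        device.activeFormat = bestFormat;
        device.activeVideoMinFrameDuration = CMTimeMake(1, 240);
        device.activeVideoMaxFrameDuration = CMTimeMake(1, 240);
        [device unlockForConfiguration];
    }
}

On the iPhone 6 the smallest 240 fps format this finds will still be fairly large, which is why the averaging itself also has to be cheap.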
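The per-pixel loop mentioned above is the other half of the problem: even at full 240 fps resolution, the average does not have to be computed pixel by pixel in Objective-C. Below is a sketch, assuming the video output is configured for kCVPixelFormatType_32BGRA, that averages one channel of a CVPixelBuffer in vectorised passes with Accelerate; the function name and the channel parameter are illustrative, not part of the original code.

#import <Accelerate/Accelerate.h>
#import <CoreVideo/CoreVideo.h>
#include <stdlib.h>

static float averageOfChannel(CVPixelBufferRef pixelBuffer, int channel)
{
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    const uint8_t *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    size_t stride = CVPixelBufferGetBytesPerRow(pixelBuffer);

    vDSP_Length count = width * height;
    float *floats = malloc(count * sizeof(float));

    // Convert every 4th byte (one BGRA channel) to float, row by row,
    // because bytesPerRow may be padded beyond width * 4.
    for (size_t row = 0; row < height; row++) {
        vDSP_vfltu8(base + row * stride + channel, 4,
                    floats + row * width, 1, width);
    }

    float mean = 0.0f;
    vDSP_meanv(floats, 1, &mean, count);

    free(floats);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    return mean;
}

Called from captureOutput:didOutputSampleBuffer:fromConnection: this keeps the per-frame work to two passes over the buffer, which is usually fast enough to keep up with 240 fps.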