How to run a CIFilter on the CPU instead of the GPU?

Does anyone know how to tell Core Image to process a CIImage through a CIFilter using the CPU instead of the GPU? I need to process very large images, and I get strange results when using the GPU. I don't care how long the CPU takes; that will be fine.

3 answers

The kCIContextUseSoftwareRenderer option key is what you need:

+ (CIContext *)coreContextFor:(NSGraphicsContext *)context forceSoftware:(BOOL)forceSoftware
{
    //CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    NSDictionary *contextOptions = [NSDictionary dictionaryWithObjectsAndKeys:
                                    (id)colorSpace, kCIContextWorkingColorSpace,
                                    (id)colorSpace, kCIContextOutputColorSpace,
                                    // YES forces Core Image to render on the CPU
                                    [NSNumber numberWithBool:forceSoftware], kCIContextUseSoftwareRenderer,
                                    nil];
    CIContext *result = [CIContext contextWithCGContext:(CGContextRef)[context graphicsPort] options:contextOptions];
    CGColorSpaceRelease(colorSpace);
    return result;
}
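
If you don't need to tie the context to an existing NSGraphicsContext, the same option works when creating a standalone CIContext. A minimal sketch in Swift; the filter name, parameters, and test image are just placeholders:

import CoreImage

// Create a CPU-backed context directly; no CGContext required.
let context = CIContext(options: [.useSoftwareRenderer: true])

// Placeholder filter chain; any CIFilter works the same way.
let input = CIImage(color: .red).cropped(to: CGRect(x: 0, y: 0, width: 512, height: 512))
let filter = CIFilter(name: "CISepiaTone")!
filter.setValue(input, forKey: kCIInputImageKey)
filter.setValue(0.8, forKey: kCIInputIntensityKey)

// Rendering happens on the CPU because of the option above.
let output = context.createCGImage(filter.outputImage!, from: input.extent)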

Keep in mind that software rendering (CPU) is much slower than the GPU path, so expect long render times. The usual cause of strange results on very large images is that Core Image on the GPU is bound by the hardware's maximum texture size; the CPU renderer is not subject to that limit, so when an image is too big for the GPU, forcing the CPU path is the reliable choice.
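
If you only want the CPU fallback when the image is actually oversized, you can gate the option on the image dimensions. A minimal sketch, assuming a fixed maxGPUDimension placeholder; real limits vary by device (often 8192 or 16384 pixels per side):

import CoreImage

// maxGPUDimension is an assumed placeholder; check your hardware's actual limit.
func makeContext(for image: CIImage, maxGPUDimension: CGFloat = 8192) -> CIContext {
    let tooLarge = image.extent.width > maxGPUDimension ||
                   image.extent.height > maxGPUDimension
    // Fall back to the software (CPU) renderer for oversized images.
    return CIContext(options: [.useSoftwareRenderer: tooLarge])
}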


Another option is to bypass CIFilter/CIKernel entirely: render the image's RGB data into an unsigned char[] buffer, process the pixels yourself on the CPU, and create a new CIImage from the result.
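
A minimal sketch of that route in Swift; the invert loop is a stand-in for whatever per-pixel work you actually need, and the code assumes the image's extent has a zero origin and finite size:

import CoreImage
import CoreGraphics

func processOnCPU(_ image: CIImage) -> CIImage {
    let width = Int(image.extent.width)
    let height = Int(image.extent.height)
    let bytesPerRow = width * 4
    var pixels = [UInt8](repeating: 0, count: bytesPerRow * height)

    // Render the CIImage into a raw RGBA byte buffer using a software context.
    let context = CIContext(options: [.useSoftwareRenderer: true])
    context.render(image,
                   toBitmap: &pixels,
                   rowBytes: bytesPerRow,
                   bounds: image.extent,
                   format: .RGBA8,
                   colorSpace: CGColorSpaceCreateDeviceRGB())

    // CPU-side processing: invert RGB, leave alpha untouched.
    for i in stride(from: 0, to: pixels.count, by: 4) {
        pixels[i]     = 255 - pixels[i]
        pixels[i + 1] = 255 - pixels[i + 1]
        pixels[i + 2] = 255 - pixels[i + 2]
    }

    // Wrap the processed bytes in a new CIImage.
    let data = Data(pixels)
    return CIImage(bitmapData: data,
                   bytesPerRow: bytesPerRow,
                   size: CGSize(width: width, height: height),
                   format: .RGBA8,
                   colorSpace: CGColorSpaceCreateDeviceRGB())
}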


Swift:

import CoreImage

func coreContextFor(context: CGContext, forceSoftware force: Bool) -> CIContext {
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let options: [CIContextOption: Any] = [
        .workingColorSpace: colorSpace,
        .outputColorSpace: colorSpace,
        // true forces Core Image to render on the CPU
        .useSoftwareRenderer: NSNumber(value: force)
    ]
    return CIContext(cgContext: context, options: options)
}
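
Hypothetical usage on macOS, assuming you are inside a draw(_:) pass where a current NSGraphicsContext exists; the image here is a placeholder:

import AppKit
import CoreImage

if let cg = NSGraphicsContext.current?.cgContext {
    // forceSoftware: true pins all Core Image work to the CPU.
    let ciContext = coreContextFor(context: cg, forceSoftware: true)
    let image = CIImage(color: .black).cropped(to: CGRect(x: 0, y: 0, width: 100, height: 100))
    ciContext.draw(image, in: CGRect(x: 0, y: 0, width: 100, height: 100),
                   from: image.extent)
}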

Source: https://habr.com/ru/post/1768802/
