I'm going crazy here.
I have a function that should return a floating point number:
- (float)getHue:(UIColor *)original
{
    NSLog(@"getHue");

    // Pull the RGB components out of the color
    const CGFloat *componentColors = CGColorGetComponents(original.CGColor);
    float red = componentColors[0];
    float green = componentColors[1];
    float blue = componentColors[2];

    float h = 0.0f;
    float maxChannel = fmax(red, fmax(green, blue));
    float minChannel = fmin(red, fmin(green, blue));

    // Standard RGB-to-hue conversion, with hue normalized to [0, 1)
    if (maxChannel == minChannel)
        h = 0.0f;
    else if (maxChannel == red)
        h = 0.166667f * (green - blue) / (maxChannel - minChannel) + 0.000000f;
    else if (maxChannel == green)
        h = 0.166667f * (blue - red) / (maxChannel - minChannel) + 0.333333f;
    else if (maxChannel == blue)
        h = 0.166667f * (red - green) / (maxChannel - minChannel) + 0.666667f;
    else
        h = 0.0f;

    if (h < 0.0f)
        h += 1.0f;

    NSLog(@"getHue results: %f", h);
    return h;
}
The NSLog inside the method prints the correct value (e.g. 0.005), but the value actually returned by the function comes back as NULL.
I have tried retrieving the value in different ways, and none of them work.
float originalHue = [self getHue:original];
leads to a build error: "incompatible types during initialization".
float *originalHue = [self getHue:original];
compiles, but the returned value is always zero.
I have tried other approaches as well, but I never get the correct value back.
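For reference, this is roughly the pattern I'm trying to get working. The class name (ColorHelper), the test method, and the color I pass in are just placeholders to keep the example short, not my real code:

#import <UIKit/UIKit.h>

@interface ColorHelper : NSObject
- (float)getHue:(UIColor *)original;   // declared up front so the compiler knows it returns float
- (void)test;
@end

@implementation ColorHelper

- (float)getHue:(UIColor *)original
{
    // same body as the method above, omitted here; dummy value just to keep this sketch compilable
    return 0.005f;
}

- (void)test
{
    // this is the assignment I expect to work
    float originalHue = [self getHue:[UIColor colorWithRed:0.2f green:0.4f blue:0.6f alpha:1.0f]];
    NSLog(@"originalHue: %f", originalHue);
}

@end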
Any thoughts?
Andre