Pixel Color Matching

For image-scanning purposes, I need a pixel (which I can get from a UIImage) to match a preset color within a certain tolerance.

Say the preset is pink. When I scan the image for pink pixels, I want the function to return a measure of how closely each pixel's RGB value matches my preset RGB value. That way all (well, most) pink pixels become "visible" to me, not just exact matches.

Anyone familiar with this approach? How would you do something like this?

Thanks in advance.

UPDATE: thank you all for your answers. I accepted the answer from Damien Pollet because it helped me the most, and I came to the conclusion that calculating the vector difference between the two RGB colors is ideal for me (at the moment). This may require some tweaking over time, but for now I use the following (in Objective-C):

float difference = sqrtf(powf(red1 - red2, 2) + powf(green1 - green2, 2) + powf(blue1 - blue2, 2));

If this difference is below 85, I accept the color as my target color. Since my algorithm does not need high accuracy, I am fine with this solution :)

UPDATE 2: for anyone researching this further, I found the following URL, which can be quite useful (to put it mildly) if you are looking for something like this.

http://www.sunsetlakesoftware.com/2010/10/22/gpu-accelerated-video-processing-mac-and-ios

Convert to the LAB color space and use deltaE.

LAB is designed to be perceptually uniform: equal distances in it correspond to roughly equal perceived color differences, which is exactly what a "looks like pink" test needs.

So, for each pixel, do the conversion + deltaE:

  • convert its 3 RGB components to LAB,

  • compute deltaE against the reference color,

  • ...


Source: https://habr.com/ru/post/1785629/

