How to recognize whitePixelValue when a person smiles?

Using CvRect, I can detect the coordinates of faces:

detectFace(IplImage * pImg, CvHaarClassifierCascade * pCascade, CvMemStorage * pStorage)

But my problem is: how do I know the whitePixelValue when a person smiles?

And what threshold is right? Is 150 accurate?

The smile Haar cascade doesn't work at all. I need some logic based only on white pixels.

Please, help!!!

Update: I think my bounty on this topic will expire unanswered. I was looking for an algorithm :(
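For reference, the white-pixel idea from the question can be sketched in pure Python (instead of the OpenCV C API). The 150 threshold and the 10% ratio below are illustrative guesses, not values from any library; a real implementation would also need the mouth region of the detected face, not the whole image:

```python
# Count pixels brighter than a threshold inside a mouth region of interest
# (ROI), then decide "smile" when the *fraction* of bright pixels is large
# enough. Using a ratio instead of an absolute count makes the decision
# independent of the ROI size.

def white_pixel_ratio(roi, threshold=150):
    """roi is a 2-D list of grayscale values (0-255)."""
    total = sum(len(row) for row in roi)
    white = sum(1 for row in roi for v in row if v > threshold)
    return white / total

def looks_like_smile(roi, threshold=150, min_ratio=0.10):
    return white_pixel_ratio(roi, threshold) >= min_ratio

# A toy 4x6 "mouth" patch: dark lip pixels around a bright band of teeth.
mouth = [
    [40, 45, 50, 48, 42, 41],
    [60, 200, 220, 215, 205, 58],
    [55, 210, 230, 225, 198, 52],
    [38, 44, 47, 49, 43, 40],
]

print(white_pixel_ratio(mouth))   # 8 of 24 pixels exceed 150
print(looks_like_smile(mouth))    # True
```

The weakness the answers below address is that a fixed threshold like 150 breaks under different lighting, which is why they suggest contrast-based and mean-relative criteria instead.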

+6
2 answers

As far as I understand, you want to detect a smile based on the intensity difference between teeth and skin, right? That makes sense, since pixel intensities on the lips differ strongly from those on the teeth. If so, I would suggest two approaches.

First, you could compute the Laplacian of the face region. The coordinates with the largest Laplacian values correspond to the largest intensity differences between neighbouring pixels, and a smile contains some of the most contrasting pixels in the face. You still have to pick a threshold, but in this case it depends much less on the lighting conditions.
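A minimal sketch of that idea, in pure Python rather than the OpenCV C API (where cvLaplace would do the convolution): apply the standard 4-neighbour Laplacian kernel and find the pixel with the strongest response. The toy "face" array below is made up for illustration.

```python
def laplacian(img):
    """4-neighbour Laplacian: 4*center - up - down - left - right.
    Border pixels are left at 0 for simplicity."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = (4 * img[y][x]
                         - img[y - 1][x] - img[y + 1][x]
                         - img[y][x - 1] - img[y][x + 1])
    return out

def strongest_edge(img):
    """Return the (row, col) of the largest absolute Laplacian response."""
    lap = laplacian(img)
    _, pos = max((abs(v), (y, x))
                 for y, row in enumerate(lap)
                 for x, v in enumerate(row))
    return pos

face = [
    [90, 90, 90, 90, 90],
    [90, 85, 88, 86, 90],
    [90, 80, 230, 82, 90],   # one bright "tooth" pixel among dark lips
    [90, 84, 87, 85, 90],
    [90, 90, 90, 90, 90],
]
print(strongest_edge(face))  # (2, 2): the tooth/lip boundary
```

Thresholding the absolute Laplacian, rather than raw intensity, is what makes the criterion more robust to overall brightness changes.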

The second idea is similar to the first, except that you compare each pixel against the mean of all pixels in the face region: pixels whose values are well above that mean are treated as teeth pixels. By the way, something like cvAdaptiveThreshold might help.
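The mean-relative version can be sketched like this (pure Python again; the margin of 40 grey levels is an arbitrary illustrative choice):

```python
def teeth_mask(face, margin=40):
    """Mark pixels well above the mean face intensity as candidate teeth.
    Returns a binary mask the same shape as the input."""
    flat = [v for row in face for v in row]
    mean = sum(flat) / len(flat)
    return [[1 if v > mean + margin else 0 for v in row] for row in face]

face = [
    [100, 100, 100],
    [100, 220, 210],   # two bright "teeth" pixels
    [100, 100, 100],
]
for row in teeth_mask(face):
    print(row)
```

Note the difference from cvAdaptiveThreshold: the sketch compares against the global mean of the whole face region, while adaptive thresholding compares each pixel against the mean of its local neighbourhood, which handles uneven lighting across the face better.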

+7

Hi, have you checked this project: https://github.com/beetlebugorg/PictureMe
It's a really great project; I think you will find what you need there.

0

Source: https://habr.com/ru/post/887363/
