Detecting whether an image is blurred with OpenCV's Laplacian on iOS

Thanks in advance for the help.

I have done a lot of R&D and searching, but I cannot find a working solution for detecting whether an image is blurry or not.

I used https://github.com/BloodAxe/OpenCV-Tutorial and the Laplacian formula to detect blur, but it does not detect image blur:

- (void)checkForBurryImage:(UIImage *)image {
    cv::Mat matImage = [image toMat];
    cv::Mat matImageGrey;
    cv::cvtColor(matImage, matImageGrey, CV_BGRA2GRAY);

    cv::Mat dst2 = [image toMat];
    cv::Mat laplacianImage;
    dst2.convertTo(laplacianImage, CV_8UC1);
    cv::Laplacian(matImageGrey, laplacianImage, CV_8U);

    cv::Mat laplacianImage8bit;
    laplacianImage.convertTo(laplacianImage8bit, CV_8UC1);

    unsigned char *pixels = laplacianImage8bit.data;
    int maxLap = -16777216;
    for (int i = 0; i < (laplacianImage8bit.elemSize() * laplacianImage8bit.total()); i++) {
        if (pixels[i] > maxLap)
            maxLap = pixels[i];
    }

    int soglia = -6118750;
    printf("\n maxLap : %i", maxLap);
    if (maxLap < soglia || maxLap == soglia) {
        printf("\n\n***** blur image *****");
    } else {
        printf("\nNOT a blur image");
    }
}

I used the same code on Android and it works great, but on iOS it always gives a positive result, so I think it isn't working.

So please give me an idea, a link, or any suggestion.

5 answers

Use this:

Laplacian(gray, laplacianImage, CV_64F);

Scalar mean, stddev;
// index 0: 1st channel, 1: 2nd channel, 2: 3rd channel
meanStdDev(laplacianImage, mean, stddev, Mat());
double variance = stddev.val[0] * stddev.val[0];

double threshold = 2900;
if (variance <= threshold) {
    // Blurry
} else {
    // Not blurry
}
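If you want that check wrapped into a complete method, here is a minimal sketch for an Objective-C++ (.mm) file that already imports opencv2 and UIKit. The -convertUIImageToCVMat: helper (one implementation is shown in a later answer) and the threshold parameter are assumptions to adapt to your project:

- (BOOL)isImageBlurryByLaplacianVariance:(UIImage *)image threshold:(double)threshold {
    cv::Mat matImage = [self convertUIImageToCVMat:image];   // assumed UIImage -> cv::Mat helper
    cv::Mat gray;
    cv::cvtColor(matImage, gray, CV_BGRA2GRAY);              // drop color/alpha before filtering

    cv::Mat laplacianImage;
    cv::Laplacian(gray, laplacianImage, CV_64F);             // signed 64-bit output keeps negative responses

    cv::Scalar mean, stddev;
    cv::meanStdDev(laplacianImage, mean, stddev, cv::Mat());
    double variance = stddev.val[0] * stddev.val[0];

    return variance <= threshold;                            // low variance => few strong edges => likely blurry
}

The right threshold depends heavily on resolution and content, so measure a few known-sharp and known-blurry photos before settling on a value.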

Use this:

- (BOOL)checkForBurryImage:(cv::Mat)matImage {
    cv::Mat finalImage;
    cv::Mat matImageGrey;
    cv::cvtColor(matImage, matImageGrey, CV_BGRA2GRAY);
    matImage.release();

    cv::Mat newEX;
    const int MEDIAN_BLUR_FILTER_SIZE = 15; // odd number
    cv::medianBlur(matImageGrey, newEX, MEDIAN_BLUR_FILTER_SIZE);
    matImageGrey.release();

    cv::Mat laplacianImage;
    cv::Laplacian(newEX, laplacianImage, CV_8U);
    newEX.release();

    cv::Mat laplacianImage8bit;
    laplacianImage.convertTo(laplacianImage8bit, CV_8UC1);
    laplacianImage.release();

    cv::cvtColor(laplacianImage8bit, finalImage, CV_GRAY2BGRA);
    laplacianImage8bit.release();

    int rows = finalImage.rows;
    int cols = finalImage.cols;
    char *pixels = reinterpret_cast<char *>(finalImage.data);
    int maxLap = -16777216;
    for (int i = 0; i < (rows * cols); i++) {
        if (pixels[i] > maxLap)
            maxLap = pixels[i];
    }

    // threshold below which the image is treated as blurry
    static const int kBlurThreshhold = -6118750;
    pixels = NULL;
    finalImage.release();

    BOOL isBlur = (maxLap < kBlurThreshhold) ? YES : NO;
    return isBlur;
}


The following method uses OpenCV:

- (BOOL)isImageBlurry:(UIImage *)image {
    // converting UIImage to OpenCV format - cv::Mat
    cv::Mat matImage = [self convertUIImageToCVMat:image];
    cv::Mat matImageGrey;
    // converting image color space (RGB) to grayscale
    cv::cvtColor(matImage, matImageGrey, CV_BGR2GRAY);

    cv::Mat dst2 = [self convertUIImageToCVMat:image];
    cv::Mat laplacianImage;
    dst2.convertTo(laplacianImage, CV_8UC1);

    // applying the Laplacian operator to the image
    cv::Laplacian(matImageGrey, laplacianImage, CV_8U);
    cv::Mat laplacianImage8bit;
    laplacianImage.convertTo(laplacianImage8bit, CV_8UC1);

    unsigned char *pixels = laplacianImage8bit.data;
    // 16777216 = 256*256*256
    int maxLap = -16777216;
    for (int i = 0; i < (laplacianImage8bit.elemSize() * laplacianImage8bit.total()); i++) {
        if (pixels[i] > maxLap) {
            maxLap = pixels[i];
        }
    }

    // one of the main parameters here: the threshold sets the sensitivity of the blur check
    // smaller number = less sensitive; default = 180
    int threshold = 100;

    return (maxLap <= threshold);
}

Converting a UIImage to an OpenCV cv::Mat:

- (cv::Mat)convertUIImageToCVMat:(UIImage *)image {
    CGColorSpaceRef colorSpace = CGImageGetColorSpace(image.CGImage);
    CGFloat cols = image.size.width;
    CGFloat rows = image.size.height;

    // 8 bits per component, 4 channels (color channels + alpha)
    cv::Mat cvMat(rows, cols, CV_8UC4);

    CGContextRef contextRef = CGBitmapContextCreate(cvMat.data,     // Pointer to data
                                                    cols,           // Width of bitmap
                                                    rows,           // Height of bitmap
                                                    8,              // Bits per component
                                                    cvMat.step[0],  // Bytes per row
                                                    colorSpace,     // Colorspace
                                                    kCGImageAlphaNoneSkipLast | kCGBitmapByteOrderDefault); // Bitmap info flags

    CGContextDrawImage(contextRef, CGRectMake(0, 0, cols, rows), image.CGImage);
    CGContextRelease(contextRef);

    return cvMat;
}
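For completeness, a minimal usage sketch of the two methods above; the asset name and the logging are just placeholders:

UIImage *photo = [UIImage imageNamed:@"sample_photo"];   // hypothetical asset name
if ([self isImageBlurry:photo]) {
    NSLog(@"Image looks blurry - ask the user to retake it.");
} else {
    NSLog(@"Image looks sharp enough.");
}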

Have a look at the question "Is there a way to detect that the image is blurry?"

and read the following paper: http://www.cs.cmu.edu/~htong/pdf/ICME04_tong.pdf

In principle, if there are not many high-frequency components in the image, it is blurry.
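As a rough OpenCV illustration of that idea (not the wavelet-based method from the paper), you can measure how much of the DFT spectrum's energy lies outside the low-frequency corners; the 25% corner size and the 5% energy threshold below are arbitrary assumptions:

#include <opencv2/opencv.hpp>

// Sketch only: estimate the share of spectral energy in the high frequencies
// of a single-channel grayscale image and call the image blurry if it is low.
static bool looksBlurryByFrequency(const cv::Mat &gray)
{
    cv::Mat floatImage;
    gray.convertTo(floatImage, CV_32F);

    // Forward DFT and magnitude spectrum.
    cv::Mat planes[] = { floatImage, cv::Mat::zeros(gray.size(), CV_32F) };
    cv::Mat complexImage;
    cv::merge(planes, 2, complexImage);
    cv::dft(complexImage, complexImage);
    cv::split(complexImage, planes);
    cv::Mat magnitude;
    cv::magnitude(planes[0], planes[1], magnitude);

    // In the default DFT layout the low frequencies sit in the four corners,
    // so zero out a block in each corner and treat the rest as high frequency.
    int bw = magnitude.cols / 4;
    int bh = magnitude.rows / 4;
    cv::Mat highPassMask = cv::Mat::ones(magnitude.size(), CV_32F);
    highPassMask(cv::Rect(0, 0, bw, bh)).setTo(0);
    highPassMask(cv::Rect(magnitude.cols - bw, 0, bw, bh)).setTo(0);
    highPassMask(cv::Rect(0, magnitude.rows - bh, bw, bh)).setTo(0);
    highPassMask(cv::Rect(magnitude.cols - bw, magnitude.rows - bh, bw, bh)).setTo(0);

    double totalEnergy = cv::sum(magnitude)[0];
    double highEnergy = cv::sum(magnitude.mul(highPassMask))[0];

    // Blurry images concentrate almost all of their energy near DC, so the
    // high-frequency share drops towards zero.
    return (highEnergy / totalEnergy) < 0.05;
}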


iOS now supports Metal Performance Shaders, which can do this for you. There is a shader literally called MPSImageLaplacian, and another called MPSImageStatisticsMeanAndVariance. Apply one after the other and you get your "variance of the Laplacian" :). For more information on how to do this, see https://medium.com/@salqadri/blur-detection-via-metal-on-ios-16dd02cb1558
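A rough Objective-C sketch of that pipeline, under several assumptions of my own rather than code from the linked post (a single-channel grayscale MTLTexture as input, no error handling, and an arbitrary threshold), might look like this:

#import <Metal/Metal.h>
#import <MetalPerformanceShaders/MetalPerformanceShaders.h>

// Assumes `source` is a single-channel (r8Unorm) grayscale texture of the
// photo, e.g. loaded/converted with MTKTextureLoader.
static BOOL IsTextureBlurry(id<MTLDevice> device, id<MTLTexture> source)
{
    id<MTLCommandQueue> queue = [device newCommandQueue];
    id<MTLCommandBuffer> commandBuffer = [queue commandBuffer];

    // Destination for the Laplacian filter, same size and format as the source.
    MTLTextureDescriptor *lapDesc =
        [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:source.pixelFormat
                                                            width:source.width
                                                           height:source.height
                                                        mipmapped:NO];
    lapDesc.usage = MTLTextureUsageShaderRead | MTLTextureUsageShaderWrite;
    id<MTLTexture> laplacianTexture = [device newTextureWithDescriptor:lapDesc];

    // 2x1 float texture: mean is written at (0, 0), variance at (1, 0).
    MTLTextureDescriptor *statsDesc =
        [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:MTLPixelFormatR32Float
                                                            width:2
                                                           height:1
                                                        mipmapped:NO];
    statsDesc.usage = MTLTextureUsageShaderRead | MTLTextureUsageShaderWrite;
    id<MTLTexture> statsTexture = [device newTextureWithDescriptor:statsDesc];

    // 1) Laplacian of the image, 2) mean and variance of that Laplacian.
    MPSImageLaplacian *laplacian = [[MPSImageLaplacian alloc] initWithDevice:device];
    [laplacian encodeToCommandBuffer:commandBuffer
                       sourceTexture:source
                  destinationTexture:laplacianTexture];

    MPSImageStatisticsMeanAndVariance *meanAndVariance =
        [[MPSImageStatisticsMeanAndVariance alloc] initWithDevice:device];
    [meanAndVariance encodeToCommandBuffer:commandBuffer
                             sourceTexture:laplacianTexture
                        destinationTexture:statsTexture];

    [commandBuffer commit];
    [commandBuffer waitUntilCompleted];

    // Read mean and variance back; pixel values are normalized, so even a
    // sharp image yields a small variance.
    float stats[2] = {0.0f, 0.0f};
    [statsTexture getBytes:stats
               bytesPerRow:2 * sizeof(float)
                fromRegion:MTLRegionMake2D(0, 0, 2, 1)
               mipmapLevel:0];
    float variance = stats[1];
    return variance < 0.005f;  // example threshold, tune for your images
}

Note that MPSImageStatisticsMeanAndVariance requires iOS 11 or later.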

