Erasing an image in a UIImageView by touch using Core Graphics

My question is the same as the one mentioned here. I also use two images in my application, and all I need to do is erase the top image by touch and then, if necessary, bring the erased part back by touching it again. I use the following code to erase the top image. There is a problem with this approach: the images are large, so I use the Aspect Fit content mode to display them correctly, and when I touch the screen an untouched spot near the corner gets erased instead. I think some correction is needed when calculating the touch point. Any help would be appreciated.

The second problem is: how do I bring back (un-erase) the erased part by touch?

UIGraphicsBeginImageContext(self.imgTop.image.size);
[self.imgTop.image drawInRect:CGRectMake(0, 0, self.imgTop.image.size.width, self.imgTop.image.size.height)];

CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), pinSize);
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0, 0, 0, 1.0);
CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeCopy);

CGContextBeginPath(UIGraphicsGetCurrentContext());
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
CGContextStrokePath(UIGraphicsGetCurrentContext());

self.imgTop.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
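One likely cause of the corner offset: the touch location is taken in the image view's coordinate space, while the drawing context uses the image's own coordinate space, and with Aspect Fit the two differ by a scale factor and a centering offset. A minimal sketch of such a conversion, assuming the image view is self.imgTop with UIViewContentModeScaleAspectFit, might look like this:

- (CGPoint)imagePointFromViewPoint:(CGPoint)viewPoint
{
    CGSize imageSize = self.imgTop.image.size;
    CGSize viewSize  = self.imgTop.bounds.size;

    // Aspect Fit scales by the smaller of the two ratios and centers the result.
    CGFloat scale = MIN(viewSize.width / imageSize.width,
                        viewSize.height / imageSize.height);
    CGFloat offsetX = (viewSize.width  - imageSize.width  * scale) / 2.0;
    CGFloat offsetY = (viewSize.height - imageSize.height * scale) / 2.0;

    // Remove the centering offset and undo the scaling to get image coordinates.
    return CGPointMake((viewPoint.x - offsetX) / scale,
                       (viewPoint.y - offsetY) / scale);
}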
1 answer

Your code is rather ambiguous: you create a context with imgTop drawn into it and then stroke black with kCGBlendModeCopy? That will simply copy the black color onto imgTop. I assume you then want to set the result back on the image view?
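As an aside, if the goal were to keep that single-UIImageView approach rather than use the class below, erasing to transparency typically needs an alpha-capable context and kCGBlendModeClear. A hedged sketch, reusing the question's pinSize, lastPoint and currentPoint variables:

UIGraphicsBeginImageContextWithOptions(self.imgTop.image.size, NO, self.imgTop.image.scale);
[self.imgTop.image drawInRect:CGRectMake(0, 0, self.imgTop.image.size.width, self.imgTop.image.size.height)];

CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextSetLineCap(ctx, kCGLineCapRound);
CGContextSetLineWidth(ctx, pinSize);
CGContextSetBlendMode(ctx, kCGBlendModeClear);   // clears to transparent instead of painting black

// lastPoint and currentPoint must already be in image coordinates here
CGContextMoveToPoint(ctx, lastPoint.x, lastPoint.y);
CGContextAddLineToPoint(ctx, currentPoint.x, currentPoint.y);
CGContextStrokePath(ctx);

self.imgTop.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();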

In any case, the class below does what you need. There are only a few interesting methods (they are at the top); the rest are just init/dealloc, property accessors, and helper routines.

@interface EraseImageView : UIView {
    CGContextRef context;
    CGRect contextBounds;
}

@property (nonatomic, retain) UIImage *backgroundImage;
@property (nonatomic, retain) UIImage *foregroundImage;
@property (nonatomic, assign) CGFloat touchWidth;
@property (nonatomic, assign) BOOL touchRevealsImage;

- (void)resetDrawing;

@end


@interface EraseImageView ()

- (void)createBitmapContext;
- (void)drawImageScaled:(UIImage *)image;

@end


@implementation EraseImageView

@synthesize touchRevealsImage=_touchRevealsImage, backgroundImage=_backgroundImage,
            foregroundImage=_foregroundImage, touchWidth=_touchWidth;

#pragma mark - Main methods -

- (void)createBitmapContext
{
    // create a grayscale colorspace
    CGColorSpaceRef grayscale=CGColorSpaceCreateDeviceGray();

    /* TO DO: instead of saving the bounds at the moment of creation, override setFrame:,
       create a new context with the right size, draw the previous one on the new one,
       and replace the old one with the new one. */
    contextBounds=self.bounds;

    // create a new 8 bit grayscale bitmap with no alpha (the mask)
    context=CGBitmapContextCreate(NULL,
                                  (size_t)contextBounds.size.width,
                                  (size_t)contextBounds.size.height,
                                  8,
                                  (size_t)contextBounds.size.width,
                                  grayscale,
                                  kCGImageAlphaNone);

    // make it white (touchRevealsImage==NO)
    CGFloat white[]={1., 1.};
    CGContextSetFillColor(context, white);
    CGContextFillRect(context, contextBounds);

    // setup drawing for that context
    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetLineJoin(context, kCGLineJoinRound);

    CGColorSpaceRelease(grayscale);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch=(UITouch *)[touches anyObject];

    // the new line that will be drawn
    CGPoint points[]={
        [touch previousLocationInView:self],
        [touch locationInView:self]
    };

    // setup width and color
    CGContextSetLineWidth(context, self.touchWidth);
    CGFloat color[]={(self.touchRevealsImage ? 1. : 0.), 1.};
    CGContextSetStrokeColor(context, color);

    // stroke
    CGContextStrokeLineSegments(context, points, 2);

    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect
{
    if (self.foregroundImage==nil || self.backgroundImage==nil) return;

    // draw background image
    [self drawImageScaled:self.backgroundImage];

    // create an image mask from the context
    CGImageRef mask=CGBitmapContextCreateImage(context);

    // set the current clipping mask to the image
    CGContextRef ctx=UIGraphicsGetCurrentContext();
    CGContextSaveGState(ctx);
    CGContextClipToMask(ctx, contextBounds, mask);

    // now draw the image (with mask)
    [self drawImageScaled:self.foregroundImage];

    CGContextRestoreGState(ctx);
    CGImageRelease(mask);
}

- (void)resetDrawing
{
    // fill with black or white
    CGFloat color[]={(self.touchRevealsImage ? 0. : 1.), 1.};
    CGContextSetFillColor(context, color);
    CGContextFillRect(context, contextBounds);

    [self setNeedsDisplay];
}

#pragma mark - Helper methods -

- (void)drawImageScaled:(UIImage *)image
{
    // just draws the image scaled down and centered
    CGFloat selfRatio=self.frame.size.width/self.frame.size.height;
    CGFloat imgRatio=image.size.width/image.size.height;
    CGRect rect={0.,0.,0.,0.};

    if (selfRatio>imgRatio) {
        // view is wider than img
        rect.size.height=self.frame.size.height;
        rect.size.width=imgRatio*rect.size.height;
    } else {
        // img is wider than view
        rect.size.width=self.frame.size.width;
        rect.size.height=rect.size.width/imgRatio;
    }

    rect.origin.x=.5*(self.frame.size.width-rect.size.width);
    rect.origin.y=.5*(self.frame.size.height-rect.size.height);

    [image drawInRect:rect];
}

#pragma mark - Initialization and properties -

- (id)initWithCoder:(NSCoder *)aDecoder
{
    if ((self=[super initWithCoder:aDecoder])) {
        [self createBitmapContext];
        _touchWidth=10.;
    }
    return self;
}

- (id)initWithFrame:(CGRect)frame
{
    if ((self=[super initWithFrame:frame])) {
        [self createBitmapContext];
        _touchWidth=10.;
    }
    return self;
}

- (void)dealloc
{
    CGContextRelease(context);
    [super dealloc];
}

- (void)setBackgroundImage:(UIImage *)value
{
    if (value!=_backgroundImage) {
        [_backgroundImage release];
        _backgroundImage=[value retain];
        [self setNeedsDisplay];
    }
}

- (void)setForegroundImage:(UIImage *)value
{
    if (value!=_foregroundImage) {
        [_foregroundImage release];
        _foregroundImage=[value retain];
        [self setNeedsDisplay];
    }
}

- (void)setTouchRevealsImage:(BOOL)value
{
    if (value!=_touchRevealsImage) {
        _touchRevealsImage=value;
        [self setNeedsDisplay];
    }
}

@end
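For completeness, a minimal usage sketch; the image names and the hosting view controller are placeholders, not part of the answer:

EraseImageView *eraseView = [[EraseImageView alloc] initWithFrame:self.view.bounds];
eraseView.backgroundImage = [UIImage imageNamed:@"bottom.png"];   // image revealed by erasing
eraseView.foregroundImage = [UIImage imageNamed:@"top.png"];      // image erased by touch
eraseView.touchWidth = 20.;                                       // brush width in points
eraseView.touchRevealsImage = NO;                                 // NO = erase, YES = restore
[self.view addSubview:eraseView];
[eraseView release];   // the class uses manual reference counting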

Some notes:

  • This class stores the two images you need. The touchRevealsImage property switches between revealing and erasing, and touchWidth sets the line width.

  • When initialized, it creates a CGBitmapContextRef: grayscale, 8 bpp, no alpha, the same size as the view. This context stores the mask that will be applied to the foreground image.

  • Each time you move your finger across the screen, a line is drawn into the CGBitmapContextRef using Core Graphics: white to reveal the image, black to hide it. The context therefore holds a black-and-white mask.

  • The drawRect: method just draws the background, then creates a CGImageRef from the CGBitmapContextRef and applies it to the current context as a clipping mask, and finally draws the foreground image. Both images are drawn with - (void)drawImageScaled:(UIImage *)image, which simply draws an image scaled down and centered.

  • If you plan to resize the view, you must recreate or copy the mask at the new size by overriding - (void)setFrame:(CGRect)frame (a sketch of one possible implementation follows these notes).

  • The - (void)resetDrawing method just clears the mask.

  • Even though the bitmap context has no alpha channel, the grayscale color space expects two components (gray plus alpha) whenever a color is set, which is why two values are specified each time.
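The setFrame: override mentioned above is not part of the answer's class; one possible sketch, reusing the existing createBitmapContext helper, could look like this:

- (void)setFrame:(CGRect)frame
{
    [super setFrame:frame];

    // nothing to do if the mask does not exist yet or the size is unchanged
    if (context==NULL || CGSizeEqualToSize(self.bounds.size, contextBounds.size)) return;

    // keep the old mask so existing strokes can be carried over
    CGImageRef oldMask=CGBitmapContextCreateImage(context);
    CGContextRelease(context);

    // rebuild the grayscale context at the new size (also resets contextBounds)
    [self createBitmapContext];

    // scale the previous mask into the new context
    CGContextDrawImage(context, contextBounds, oldMask);
    CGImageRelease(oldMask);

    [self setNeedsDisplay];
}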

Sample application with the EraseImageView class


Source: https://habr.com/ru/post/912520/

