CGContextStrokePath does not work when scaling and drawing images

I draw lines in the touchesMoved: method, and it usually works fine. But when I zoom the image and then draw, the previously drawn lines shift and become blurrier, eventually disappearing. I tried using UIPinchGestureRecognizer, and also simply changing the frame of myImageView (only for multi-touch events), but the problem occurs either way. Here's the drawing code:

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
        NSArray *allTouches = [touches allObjects];
        int count = [allTouches count];
        if (count == 1) { // single-touch case: draw a line segment
            UITouch *touch = [touches anyObject];
            CGPoint currentPoint = [touch locationInView:myImageView];

            UIGraphicsBeginImageContext(myImageView.frame.size);
            [drawImage.image drawInRect:CGRectMake(0, 0, myImageView.frame.size.width, myImageView.frame.size.height)];
            CGContextRef context = UIGraphicsGetCurrentContext();
            CGContextSetLineCap(context, kCGLineCapRound);
            CGContextSetLineWidth(context, 2.0);
            CGContextBeginPath(context);
            CGContextMoveToPoint(context, lastPoint.x, lastPoint.y);
            CGContextAddLineToPoint(context, currentPoint.x, currentPoint.y);
            CGContextStrokePath(context);
            drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();

            lastPoint = currentPoint;
        } else { // multi-touch case
            // handle pinch/zoom
        }
    }

Here is an image drawn without scaling:

[image: line drawn without scaling]

And here is an image showing the problem after scaling; the red arrow points at the segment that was already drawn before scaling (shown in the previous image). It is now blurry and offset:

[image: same line after scaling, blurry and offset]

Notice also that the part of the line drawn last is unaffected; only the lines drawn earlier degrade. I suspect the image's size attributes are lost when I zoom in/out, which probably causes the blurring and shifting, but I'm not sure!

EDIT: I uploaded a short video to show what is happening. It's kind of entertaining...

EDIT 2: Here's a single-view sample application that reproduces the problem.

+6
3 answers

I downloaded your project and found that the problem is autoresizing. The following steps will fix it:

Step 1. Comment out line 70:

 drawImage.frame = CGRectMake(0, 0, labOrderImgView.frame.size.width, labOrderImgView.frame.size.height); 

in your touchesMoved method.

Step 2. Add one line of code after drawImage is allocated (line 90) in the viewDidLoad method:

 drawImage.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight; 

That fixes the bug.

+4

I implemented the behavior as follows:

  • Remember all the coordinates of your path (the model).
  • Draw the path into a temporary subview of the image view.
  • When the user starts to pinch/zoom the image, do nothing; iOS will scale the path view for you.
  • The moment the user finishes scaling, redraw your path correctly from your model.

If you store the path only as an image, scaling will look poor. Also, don't draw the path directly into the image: draw it on a transparent overlay and flatten the two together only when editing is finished.
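A minimal sketch of this model-based approach, assuming a transparent overlay view; the class name PathOverlayView and its properties are illustrative, not from the answer's project:

    // Transparent overlay that keeps the path as points (the model)
    // and re-renders it at any zoom factor, so it never blurs.
    @interface PathOverlayView : UIView
    @property (nonatomic, strong) NSMutableArray *points; // boxed CGPoints in unscaled coordinates
    @property (nonatomic, assign) CGFloat zoomScale;      // current zoom factor
    @end

    @implementation PathOverlayView

    - (void)addPoint:(CGPoint)p {
        if (!self.points) self.points = [NSMutableArray array];
        [self.points addObject:[NSValue valueWithCGPoint:p]];
        [self setNeedsDisplay];
    }

    - (void)drawRect:(CGRect)rect {
        if ([self.points count] < 2) return;
        CGContextRef ctx = UIGraphicsGetCurrentContext();
        CGContextSetLineCap(ctx, kCGLineCapRound);
        CGContextSetLineWidth(ctx, 2.0);
        CGContextBeginPath(ctx);
        CGPoint first = [[self.points objectAtIndex:0] CGPointValue];
        CGContextMoveToPoint(ctx, first.x * self.zoomScale, first.y * self.zoomScale);
        for (NSUInteger i = 1; i < [self.points count]; i++) {
            CGPoint p = [[self.points objectAtIndex:i] CGPointValue];
            CGContextAddLineToPoint(ctx, p.x * self.zoomScale, p.y * self.zoomScale);
        }
        CGContextStrokePath(ctx);
    }

    @end

After a pinch gesture ends, set zoomScale to the new zoom factor and the overlay redraws the whole path sharply from the stored points.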

+1

You are always drawing into an image context at the image's size, so your drawing is of course blurred: you never adapt to the higher resolution when zoomed in. It would be wiser to create a UIBezierPath once and just append to it (using addLineToPoint: in the touchesMoved: method), then stroke it in a custom drawRect: via [bezierPath stroke]. Alternatively, add a CAShapeLayer as a sublayer of the image view and set its path property to the CGPath of the bezierPath.

See "Draw Bezier curves with your finger in iOS?" for an example.
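A minimal sketch of the CAShapeLayer variant; the ivar names shapeLayer and bezierPath are illustrative, not from the question's code:

    // Build the path once, mirror it into a shape layer on the image view.
    - (void)viewDidLoad {
        [super viewDidLoad];
        bezierPath = [UIBezierPath bezierPath];
        shapeLayer = [CAShapeLayer layer];
        shapeLayer.strokeColor = [UIColor blackColor].CGColor;
        shapeLayer.fillColor = nil;
        shapeLayer.lineWidth = 2.0;
        shapeLayer.lineCap = kCALineCapRound;
        [myImageView.layer addSublayer:shapeLayer];
    }

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        CGPoint p = [[touches anyObject] locationInView:myImageView];
        [bezierPath moveToPoint:p];
    }

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
        CGPoint p = [[touches anyObject] locationInView:myImageView];
        [bezierPath addLineToPoint:p];
        shapeLayer.path = bezierPath.CGPath; // vector path: stays sharp when the layer scales
    }

Because the layer holds a vector path rather than a rasterized image, zooming the image view rescales the stroke without blurring.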

+1

Source: https://habr.com/ru/post/908453/
