iPhone UIImageView Pinch Zooming

I am trying to implement pinch zooming for PhotoView (a UIImageView subclass) using CGAffineTransformScale. (I plan to add rotation later, so I can't rely on resizing the frame, and I'll be adding subviews, so UIScrollView would get more complicated, I think.) In any case, the concept was simple enough to understand, and the code came together very quickly... Since then I have tried to solve the same two (related!) problems using three different approaches, and I can't:

1. My code somehow loses track of the touches in the middle of a pinch - [touches count] drops from 2 to 1 and back - on the iPhone, but not in the simulator.

2. Touch points one and two keep jumping back and forth by a few pixels on each event, causing the image to shrink and grow rapidly in alternation, although the overall effect is still zooming in or out as the user intends (both iPhone and simulator).

Here is the code:

    #import "PhotoView.h"

    @implementation PhotoView

    @synthesize originalCenter, distance, zooming;

    - (id)initWithFrame:(CGRect)frame {
        if (self = [super initWithFrame:frame]) {
            // Initialization code
            self.userInteractionEnabled = YES;
            self.multipleTouchEnabled = YES;
            zooming = NO;
        }
        return self;
    }

    float distanceBetweenTwoPoints(CGPoint point1, CGPoint point2) {
        NSLog(@"point1 x: %5.2f point 2 x: %5.2f ---- point 1 y: %5.2f point 2 y: %5.2f",
              point1.x, point2.x, point1.y, point2.y);
        return sqrt(pow(point1.x - point2.x, 2) + pow(point1.y - point2.y, 2));
    }

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        if ([touches count] > 1) {
            NSLog(@"^^^^^^^^^^^^^^^Tocuhes began with double touch!");
            distance = distanceBetweenTwoPoints([[[touches allObjects] objectAtIndex:0] locationInView:self],
                                                [[[touches allObjects] objectAtIndex:1] locationInView:self]);
            zooming = YES;
        } else {
            zooming = NO;
            originalCenter = [[[touches allObjects] objectAtIndex:0] locationInView:self];
            NSLog(@">>>>>>>>>>>>Touches began with single touch");
        }
    }

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
        if (zooming)
            NSLog(@"!!!!!!!!!end zoom!!!!!!!");
        zooming = NO;
        if ([[touches anyObject] tapCount] == 2) {
            // double-tap stub - thisPoint is not used yet
            UITouch *thisTouch = [touches anyObject];
            CGPoint thisPoint = [thisTouch locationInView:self];
        }
    }

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
        if ([touches count] > 1 && zooming) { // ignore if user added a second finger touch
            float distanceNew = distanceBetweenTwoPoints([[[touches allObjects] objectAtIndex:0] locationInView:self],
                                                         [[[touches allObjects] objectAtIndex:1] locationInView:self]);
            if (distance <= 0.f) { // should never be true - but it is sometimes!!!
                distance = distanceNew;
            }
            float delta = 1.0f + ((distanceNew - distance) / distance);
            self.transform = CGAffineTransformScale(self.transform, delta, delta);
            distance = distanceNew;
        } else {
            if (zooming) {
                NSLog(@"*************shouldn't be here********* %d", [touches count]);
                return;
            }
            CGPoint thisPoint = [[[touches allObjects] objectAtIndex:0] locationInView:self];
            self.transform = CGAffineTransformTranslate(self.transform,
                                                        thisPoint.x - originalCenter.x,
                                                        thisPoint.y - originalCenter.y);
        }
    }

    - (void)dealloc {
        [super dealloc];
    }

    @end
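The rapid shrink/grow in problem 2 is consistent with feeding every few-pixel wobble of the reported touch points straight into CGAffineTransformScale. One common workaround is a dead zone: ignore distance changes below a threshold. A minimal sketch of the idea in plain C (the function name and the threshold value are my own, not from UIKit; tune the threshold on a device):

```c
#include <math.h>

/* One pinch step, accumulated the way the code above does it:
 * delta = 1 + (dNew - dOld) / dOld.  A small dead zone (threshold,
 * in pixels) swallows per-event jitter so the scale stops
 * oscillating on wobbly input. */
float apply_pinch(float scale, float *lastDistance, float newDistance,
                  float threshold) {
    if (fabsf(newDistance - *lastDistance) < threshold)
        return scale;                /* jitter-sized change: ignore it */
    float delta = 1.0f + (newDistance - *lastDistance) / *lastDistance;
    *lastDistance = newDistance;
    return scale * delta;
}
```

With a threshold of 0 this reproduces the original behavior, so samples that wobble by a few pixels pulse the scale on every event; a threshold of roughly 6-8 px leaves the scale untouched until the fingers really move.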

Log Example:

    ^^^^^^^^^^^^^^^Tocuhes began with double touch!
    point1 x: 87.33 point 2 x: 235.63 ---- point 1 y: 322.30 point 2 y: 117.09
    point1 x: 90.76 point 2 x: 232.02 ---- point 1 y: 318.29 point 2 y: 123.51
    point1 x: 86.22 point 2 x: 236.71 ---- point 1 y: 323.30 point 2 y: 117.42
    point1 x: 89.51 point 2 x: 232.38 ---- point 1 y: 319.47 point 2 y: 123.47
    point1 x: 84.97 point 2 x: 237.02 ---- point 1 y: 324.48 point 2 y: 116.56
    *************shouldn't be here********* 1
    point1 x: 88.49 point 2 x: 232.52 ---- point 1 y: 321.27 point 2 y: 122.91
    *************shouldn't be here********* 1
    point1 x: 83.95 point 2 x: 237.11 ---- point 1 y: 327.21 point 2 y: 116.96
    !!!!!!!!!end zoom!!!!!!!

I am beginning to suspect that I am losing track of the touch points because of CGAffineTransformScale; however, I haven't found anything online suggesting this is a known problem. Any tips (including "read the xyz documentation") would be appreciated!

Thanks in advance.

2 answers

Generally speaking, whenever you implement a continuous user-interface gesture, you need to measure it against something that isn't changing.

So if your touches modify the view's transform, you should measure the touches against something that doesn't change - for example, the parent view. So instead of calling:

 [touch locationInView:self] 

you should use

 [touch locationInView:[self superview]] 

I'm not sure whether this will fix your problem, but it does remove one possible cause.
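To see why measuring in the zooming view's own coordinates is unstable, here is a toy numeric model in plain C (not UIKit code), assuming a pure scale transform about the view's center; the function names are mine:

```c
/* Toy model: with a pure scale transform about the view's center, a
 * distance measured in the view's own coordinates (as with
 * [touch locationInView:self]) is the superview-space distance divided
 * by the current scale.  So a finger distance that is constant in the
 * superview appears to change in self's coordinates as soon as you
 * change the transform - and that phantom change feeds back into the
 * next delta. */
float distance_in_self(float superviewDistance, float scale) {
    return superviewDistance / scale;
}

/* Measuring against the untransformed superview is stable by definition. */
float distance_in_superview(float superviewDistance, float scale) {
    (void)scale;           /* the superview ignores self's transform */
    return superviewDistance;
}
```

For example, if the fingers sit 200 pt apart and you scale the view by 1.5, the next event's self-space distance reads 200 / 1.5, about 133.3, even though nothing moved; the superview-space distance stays 200.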


Perhaps this answer is out of left field, but an alternative would be to put the UIImageView inside a UIScrollView, implement the viewForZoomingInScrollView: method in your scroll view's delegate, and set maximumZoomScale / minimumZoomScale. Then you get pinch zooming without having to do the transform math yourself. I just did this in a recent project and it worked well.
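For reference, the scroll view approach looks roughly like this. This is a minimal sketch only: the class and property names are placeholders (not from the question), the zoom limits are guesses, and it uses pre-ARC conventions to match the question's code.

```objc
// Sketch only: PhotoViewController, scrollView and photoView are
// placeholder names; scrollView and photoView are assumed to be
// created elsewhere (e.g. in loadView or a nib).
@interface PhotoViewController : UIViewController <UIScrollViewDelegate> {
    UIScrollView *scrollView;
    UIImageView *photoView;
}
@end

@implementation PhotoViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    scrollView.delegate = self;
    scrollView.minimumZoomScale = 1.0f;   // guesses - tune for your image
    scrollView.maximumZoomScale = 4.0f;
    scrollView.contentSize = photoView.frame.size;
    [scrollView addSubview:photoView];
}

// The scroll view asks its delegate which subview to scale during pinches.
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)sv {
    return photoView;
}

@end
```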


Source: https://habr.com/ru/post/1299367/

