2012-08-07
3

So I'm using iOS 4.2 to add zooming and panning to my app. I've implemented one instance each of UIPinchGestureRecognizer and UIPanGestureRecognizer. It appears that only one of them recognizes a gesture at a time. In particular, the latter only works while a single finger is down, and the former only works once a second finger is present. That's okay, but it has some side effects that I think make for a poor user experience. Is there a gesture recognizer that handles pinching and panning at the same time?
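(For reference, the setup described above presumably amounts to something like the following sketch; the view and handler names are my assumptions, not code from the question.)

// Assumed setup: one pinch and one pan recognizer attached to the same view (MRC-era code).
UIPinchGestureRecognizer *pinchGesture = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(handlePinch:)];
UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
[imageView addGestureRecognizer:pinchGesture];
[imageView addGestureRecognizer:panGesture];
[pinchGesture release];
[panGesture release];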

When you put two fingers down and then move one of them, the image scales (zooms) as it should, but the pixels that were under your fingers are no longer under your fingers. The image scales around the center of the image rather than around the midpoint between the two fingers. And that midpoint is itself moving. I want the movement of that midpoint to drive the panning of the overall image.

Do nearly all iOS apps have this same behavior, where the image zooms in and out around the image center, rather than the fingers tracking the pixels beneath them?

It seems to me that creating a custom gesture recognizer is the right design approach to this problem, but it also seems likely that someone has already created such a recognizer and made it freely available for download and commercial use. Is there such a UIGestureRecognizer?

+0

I've noticed that Safari on the iPad keeps the pixels tracking under the fingers, which makes me wonder whether this is proprietary Apple behavior that I shouldn't imitate? – user574771 2012-08-08 06:58:37

+0

I've noticed that Google Maps has the same behavior, so I doubt this behavior is patented... – user574771 2012-08-20 21:20:15

Answers

2

So, in the absence of anyone offering a better solution, I created a custom gesture recognizer that achieves the desired effect. Below is the key code snippet. It lets the custom recognizer report where the view should be repositioned and what its new scale should be, using the touch centroid as the center of the pan and zoom so that the pixels under the fingers stay under the fingers the whole time, unless the fingers rotate, which is not supported and which I cannot prevent the user from doing. This gesture recognizer pans and zooms simultaneously with two fingers. I still need to add support for one-finger panning later, so the gesture can continue even after one of the two fingers is lifted.

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event 
{ 
    // We can only process if we have two fingers down... 
    if (FirstFinger == nil || SecondFinger == nil) 
     return; 

    // We do not attempt to determine if the first finger, second finger, or 
    // both fingers are the reason for this method call. For this reason, we 
    // do not know if either is stale or updated, and thus we cannot rely 
    // upon the UITouch's previousLocationInView method. Therefore, we need to 
    // cache the latest UITouch's locationInView information each pass. 

    // Break down the previous finger coordinates... 
    float A0x = PreviousFirstFinger.x; 
    float A0y = PreviousFirstFinger.y; 
    float A1x = PreviousSecondFinger.x; 
    float A1y = PreviousSecondFinger.y; 
    // Update our cache with the current fingers for next pass through here... 
    PreviousFirstFinger = [FirstFinger locationInView:nil]; 
    PreviousSecondFinger = [SecondFinger locationInView:nil]; 
    // Break down the current finger coordinates... 
    float B0x = PreviousFirstFinger.x; 
    float B0y = PreviousFirstFinger.y; 
    float B1x = PreviousSecondFinger.x; 
    float B1y = PreviousSecondFinger.y; 


    // Calculate the zoom resulting from the two fingers moving toward or away from each other... 
    float OldScale = Scale; 
    Scale *= sqrt((B0x-B1x)*(B0x-B1x) + (B0y-B1y)*(B0y-B1y))/sqrt((A0x-A1x)*(A0x-A1x) + (A0y-A1y)*(A0y-A1y)); 

    // Calculate the old and new centroids so that we can compare the centroid's movement... 
    CGPoint OldCentroid = { (A0x + A1x)/2, (A0y + A1y)/2 }; 
    CGPoint NewCentroid = { (B0x + B1x)/2, (B0y + B1y)/2 };  

    // Calculate the pan values to apply to the view so that the combination of zoom and pan 
    // appear to apply to the centroid rather than the center of the view... 
    Center.x = NewCentroid.x + (Scale/OldScale)*(self.view.center.x - OldCentroid.x); 
    Center.y = NewCentroid.y + (Scale/OldScale)*(self.view.center.y - OldCentroid.y); 
} 
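For context, the snippet above assumes a recognizer subclass with the ivars and properties it references. The following is my reconstruction of that scaffolding (the interface, the touch bookkeeping, and the state transitions are assumptions, not the original code). Note that Scale would need to start at 1.0, and touchesMoved: would need to set the state to UIGestureRecognizerStateChanged so that the handlePixelTrack: action below fires on each update.

// PixelTrackGestureRecognizer.h (reconstructed sketch)
#import <UIKit/UIKit.h>

@interface PixelTrackGestureRecognizer : UIGestureRecognizer
{
    UITouch *FirstFinger;
    UITouch *SecondFinger;
    CGPoint PreviousFirstFinger;
    CGPoint PreviousSecondFinger;
}
@property (nonatomic, assign) float Scale;
@property (nonatomic, assign) CGPoint Center;
@end

// PixelTrackGestureRecognizer.m (reconstructed sketch; touchesMoved: is shown above)
#import "PixelTrackGestureRecognizer.h"
#import <UIKit/UIGestureRecognizerSubclass.h>

@implementation PixelTrackGestureRecognizer

@synthesize Scale, Center;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches)
    {
        if (FirstFinger == nil)
        {
            // Track the first finger and seed its location cache.
            FirstFinger = touch;
            PreviousFirstFinger = [FirstFinger locationInView:nil];
        }
        else if (SecondFinger == nil)
        {
            // Second finger down: the two-finger gesture can begin.
            SecondFinger = touch;
            PreviousSecondFinger = [SecondFinger locationInView:nil];
            self.state = UIGestureRecognizerStateBegan;
        }
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Lifting either tracked finger ends the gesture (one-finger panning
    // is not supported yet, as noted above).
    if ((FirstFinger != nil && [touches containsObject:FirstFinger]) ||
        (SecondFinger != nil && [touches containsObject:SecondFinger]))
    {
        FirstFinger = nil;
        SecondFinger = nil;
        self.state = UIGestureRecognizerStateEnded;
    }
}

@end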

The view controller handles the action by assigning the new scale and center to the view in question. I've noticed that other gesture recognizers tend to leave some of the math to the controller, but I tried to do all of the math inside the recognizer.

-(void)handlePixelTrack:(PixelTrackGestureRecognizer*)sender 
{ 
    sender.view.center = sender.Center; 
    sender.view.transform = CGAffineTransformMakeScale(sender.Scale, sender.Scale); 
} 
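For completeness, the wiring in the view controller would presumably be the usual pattern (the view name here is an assumption, not shown in the original):

// Attach the custom recognizer to the view being zoomed and panned.
PixelTrackGestureRecognizer *pixelTrack = [[PixelTrackGestureRecognizer alloc]
    initWithTarget:self action:@selector(handlePixelTrack:)];
[zoomableView addGestureRecognizer:pixelTrack];
[pixelTrack release];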
1

A simpler solution is to put your view inside a scroll view; then you get pinch and pan for free. Otherwise, you can set the pan and pinch gesture delegates to self and return YES from shouldRecognizeSimultaneouslyWithGestureRecognizer: so both run at the same time. As for zooming centered on the user's fingers, I never solved that properly, but it involves manipulating the view's layer anchorPoint before transforming the view (I think).
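A minimal sketch of the scroll-view approach described above (the view names are assumptions; the controller would also adopt UIScrollViewDelegate):

// In viewDidLoad (sketch):
scrollView.minimumZoomScale = 1.0;
scrollView.maximumZoomScale = 4.0;
scrollView.delegate = self;
[scrollView addSubview:photoView];

// UIScrollViewDelegate: tells the scroll view which subview to zoom.
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView
{
    return photoView;
}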

+0

Scroll views add their own problems, like the bouncing and the bars. And I don't think combining the recognizers solves the problem of tracking the point between the fingers... – user574771 2012-08-08 01:34:38

+0

You can easily disable the bouncing and the bars... they're just properties. And when you call 'locationInView' on the pinch gesture, it returns the midpoint. – borrrden 2012-08-08 01:36:15

+0

I tried disabling the bouncing, but it didn't seem to fully disable the stretching past the scroll view's bounds. I suppose I could try it again. But even though the midpoint is computed for us, do the pixels actually track the fingers? I don't think they do. – user574771 2012-08-08 02:49:53
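For reference, the properties and the midpoint mentioned in the comment exchange above would look roughly like this (a sketch, not code from the original discussion):

// Disable the rubber-band stretch and the scroll indicators ("bars").
scrollView.bounces = NO;
scrollView.bouncesZoom = NO;
scrollView.showsHorizontalScrollIndicator = NO;
scrollView.showsVerticalScrollIndicator = NO;

// locationInView: on a pinch recognizer reports the midpoint of its touches.
CGPoint midpoint = [pinchGesture locationInView:scrollView];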

7

Sorry, I'm in a rush, but here is the code I used for one of my demo apps. It pinch-zooms and pans at the same time without using a scroll view.

Don't forget to conform to the UIGestureRecognizerDelegate protocol.
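That is, the controller's interface (presumably declared in ViewController.h, which isn't shown below) adopts the protocol:

@interface ViewController : UIViewController <UIGestureRecognizerDelegate>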

If you aren't able to pinch and pan at the same time, it's probably because you are missing this method:

-(BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer 
{ 
    return YES; 
} 

Here is the full source code:

#import "ViewController.h" 
#import <QuartzCore/QuartzCore.h> 

@interface ViewController() 

@end 

@implementation ViewController 

- (void)viewDidLoad 
{ 
    [super viewDidLoad]; 
    // Do any additional setup after loading the view, typically from a nib. 

    isEditing = false; 

    photoView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 460)]; 
    [photoView setImage:[UIImage imageNamed:@"photo.png"]]; 
    photoView.hidden = YES; 

    maskView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 460)]; 
    [maskView setImage:[UIImage imageNamed:@"maskguide.png"]]; 
    maskView.hidden = YES; 

    displayImage = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 460)]; 

    UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)]; 
    UIPinchGestureRecognizer *pinchGesture = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(handlePinch:)]; 

    [panGesture setDelegate:self]; 
    [pinchGesture setDelegate:self]; 

    [photoView addGestureRecognizer:panGesture]; 
    [photoView addGestureRecognizer:pinchGesture]; 
    [photoView setUserInteractionEnabled:YES]; 

    [panGesture release]; 
    [pinchGesture release]; 

    btnEdit = [[UIButton alloc] initWithFrame:CGRectMake(60, 400, 200, 50)]; 
    [btnEdit setBackgroundColor:[UIColor blackColor]]; 
    [btnEdit setTitle:@"Start Editing" forState:UIControlStateNormal]; 
    [btnEdit addTarget:self action:@selector(toggleEditing) forControlEvents:UIControlEventTouchUpInside]; 

    [[self view] addSubview:displayImage]; 
    [[self view] addSubview:photoView]; 
    [[self view] addSubview:maskView]; 
    [[self view] addSubview:btnEdit]; 

    [self updateMaskedImage]; 
} 

- (void)viewDidUnload 
{ 
    [super viewDidUnload]; 
    // Release any retained subviews of the main view. 
} 

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation 
{ 
    return (interfaceOrientation != UIInterfaceOrientationPortraitUpsideDown); 
} 

-(void)dealloc 
{ 
    // Release the retained subviews created in viewDidLoad (this project uses MRC). 
    [photoView release]; 
    [maskView release]; 
    [displayImage release]; 
    [btnEdit release]; 

    [super dealloc]; 
} 

#pragma mark - 
#pragma mark Update Masked Image Method 
#pragma mark - 

-(void)updateMaskedImage 
{ 
    maskView.hidden = YES; 

    UIImage *finalImage = 
    [self maskImage:[self captureView:self.view] 
      withMask:[UIImage imageNamed:@"mask.png"]]; 


    maskView.hidden = NO; 

    //UIImage *finalImage = [self maskImage:photoView.image withMask:[UIImage imageNamed:@"mask.png"]]; 

    [displayImage setImage:finalImage]; 
} 

- (UIImage*) maskImage:(UIImage *)image withMask:(UIImage *)maskImage { 

    CGImageRef maskRef = maskImage.CGImage; 

    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef), 
             CGImageGetHeight(maskRef), 
             CGImageGetBitsPerComponent(maskRef), 
             CGImageGetBitsPerPixel(maskRef), 
             CGImageGetBytesPerRow(maskRef), 
             CGImageGetDataProvider(maskRef), NULL, false); 

    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask); 
    UIImage *result = [UIImage imageWithCGImage:masked]; 

    // CGImageMaskCreate and CGImageCreateWithMask return owned references 
    // that must be released explicitly. 
    CGImageRelease(mask); 
    CGImageRelease(masked); 

    return result; 

} 

#pragma mark - 
#pragma mark Touches Began 
#pragma mark - 

// adjusts the editing flag to make dragging and drop work 
-(void)toggleEditing 
{ 
    if(!isEditing) 
    { 
     isEditing = true; 

     NSLog(@"editing..."); 

     [btnEdit setTitle:@"Stop Editing" forState:UIControlStateNormal]; 

     displayImage.hidden = YES; 
     photoView.hidden = NO; 
     maskView.hidden = NO; 
    } 
    else 
    { 
     isEditing = false; 

     [self updateMaskedImage]; 

     NSLog(@"stopped editting"); 

     [btnEdit setTitle:@"Start Editing" forState:UIControlStateNormal]; 

     displayImage.hidden = NO; 
     photoView.hidden = YES; 
     maskView.hidden = YES; 
    } 
} 

/* 
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event 
{ 
    if(isEditing) 
    { 
     UITouch *finger = [touches anyObject]; 
     CGPoint currentPosition = [finger locationInView:self.view]; 

     //[maskView setCenter:currentPosition]; 
     //[photoView setCenter:currentPosition]; 
     if([touches count] == 1) 
     { 
      [photoView setCenter:currentPosition]; 
     } 
     else if([touches count] == 2) 
     { 

     } 
    } 
} 
*/ 

-(void)handlePan:(UIPanGestureRecognizer *)recognizer 
{  
    CGPoint translation = [recognizer translationInView:self.view]; 
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x, 
             recognizer.view.center.y + translation.y); 
    [recognizer setTranslation:CGPointMake(0, 0) inView:self.view]; 
} 

-(void)handlePinch:(UIPinchGestureRecognizer *)recognizer 
{  
    recognizer.view.transform = CGAffineTransformScale(recognizer.view.transform, recognizer.scale, recognizer.scale); 
    recognizer.scale = 1; 
} 

-(BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer 
{ 
    return YES; 
} 

#pragma mark - 
#pragma mark Capture Screen Function 
#pragma mark - 

- (UIImage*)captureView:(UIView *)yourView 
{ 
    UIGraphicsBeginImageContextWithOptions(yourView.bounds.size, yourView.opaque, 0.0); 
    CGContextRef context = UIGraphicsGetCurrentContext(); 
    [yourView.layer renderInContext:context]; 
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext(); 
    UIGraphicsEndImageContext(); 
    return image; 
} 

#pragma mark - 

@end 
+0

I took a few elements from your code, and they work as you said they would. But the behavior isn't what I want. I want the pixels to track the fingers the way iPad Safari does. However, if someone shows me that there is a legal problem with imitating Safari's behavior, then I will mark this as the best answer so far. – user574771 2012-08-08 19:51:23