Getting individual pixels from touch points

Is it possible to detect every pixel being touched? More specifically, when the user touches the screen, is it possible to track all the x-y coordinates of the cluster of points being touched? How can I tell the difference between when the user draws with a thumb and when they draw with the tip of a finger? I would like to vary the brush according to how the user touches the screen, and I would also like to track all of the pixels being touched.

I am currently using the following code from the GLPaint sample on the Apple developer site:

http://developer.apple.com/library/ios/#samplecode/GLPaint/Introduction/Intro.html

The sample code draws with a predefined brush size and tracks the x-y coordinates along the way. How can I change the brush according to how the user touches the screen, and track all of the pixels being touched over time?

// Draws a line onscreen based on where the user touches
- (void)renderLineFromPoint:(CGPoint)start toPoint:(CGPoint)end
{
    NSLog(@"x:%f y:%f", start.x, start.y);

    static GLfloat*   vertexBuffer = NULL;
    static NSUInteger vertexMax = 64;
    NSUInteger        vertexCount = 0, count, i;

    [EAGLContext setCurrentContext:context];
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);

    // Convert locations from points to pixels
    CGFloat scale = self.contentScaleFactor;
    start.x *= scale;
    start.y *= scale;
    end.x *= scale;
    end.y *= scale;

    // Allocate the vertex array buffer
    if (vertexBuffer == NULL)
        vertexBuffer = malloc(vertexMax * 2 * sizeof(GLfloat));

    // Add points to the buffer so there are drawing points every X pixels
    count = MAX(ceilf(sqrtf((end.x - start.x) * (end.x - start.x) +
                            (end.y - start.y) * (end.y - start.y)) / kBrushPixelStep), 1);
    for (i = 0; i < count; ++i) {
        if (vertexCount == vertexMax) {
            vertexMax = 2 * vertexMax;
            vertexBuffer = realloc(vertexBuffer, vertexMax * 2 * sizeof(GLfloat));
        }
        vertexBuffer[2 * vertexCount + 0] = start.x + (end.x - start.x) * ((GLfloat)i / (GLfloat)count);
        vertexBuffer[2 * vertexCount + 1] = start.y + (end.y - start.y) * ((GLfloat)i / (GLfloat)count);
        vertexCount += 1;
    }

    // Render the vertex array
    glVertexPointer(2, GL_FLOAT, 0, vertexBuffer);
    glDrawArrays(GL_POINTS, 0, vertexCount);

    // Display the buffer
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER_OES];
}

// Handles the start of a touch
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect   bounds = [self bounds];
    UITouch* touch = [[event touchesForView:self] anyObject];
    firstTouch = YES;
    // Convert touch point from UIView referential to OpenGL one (upside-down flip)
    location = [touch locationInView:self];
    location.y = bounds.size.height - location.y;
}

// Handles the continuation of a touch
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect   bounds = [self bounds];
    UITouch* touch = [[event touchesForView:self] anyObject];

    // Convert touch point from UIView referential to OpenGL one (upside-down flip)
    if (firstTouch) {
        firstTouch = NO;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
    } else {
        location = [touch locationInView:self];
        location.y = bounds.size.height - location.y;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
    }

    // Render the stroke
    [self renderLineFromPoint:previousLocation toPoint:location];
}

// Handles the end of a touch event when the touch is a tap
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect   bounds = [self bounds];
    UITouch* touch = [[event touchesForView:self] anyObject];
    if (firstTouch) {
        firstTouch = NO;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
        [self renderLineFromPoint:previousLocation toPoint:location];
    }
}

// Handles the end of a touch event
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // If appropriate, add code necessary to save the state of the application.
    // This application is not saving state.
}
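
For the "track all of the pixels being touched over time" part, one simple option is to append every interpolated point to an array as the loop in renderLineFromPoint:toPoint: generates it. A minimal sketch, assuming a hypothetical NSMutableArray property named touchedPoints added to this view:

// Hypothetical helper: records every interpolated stroke point.
// Assumes the view declares: @property (nonatomic, strong) NSMutableArray *touchedPoints;
- (void)recordTouchedPointX:(GLfloat)x y:(GLfloat)y
{
    if (self.touchedPoints == nil) {
        self.touchedPoints = [NSMutableArray array];
    }
    [self.touchedPoints addObject:[NSValue valueWithCGPoint:CGPointMake(x, y)]];
}

Calling this inside the for loop, right after the two vertexBuffer assignments, captures every point the brush is stamped at. Note that these are the interpolated stroke points (in pixels, after the contentScaleFactor conversion), not every physical pixel covered by the fingertip.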

Answers


AFAIK there is no API for accessing the area of a touch. Given the limitations of capacitive touchscreens, I'm not even sure that what you want is physically possible. I remember a presentation at CocoaHeads recently showing that some of this information is available on OS X (via private APIs) for trackpads, but not on iOS.

I believe this is one reason graphics tablets come with a dedicated stylus that has its own sensor technology built in.

A partial workaround for a drawing application might be to simulate "inking" the way some desktop applications do: if the user's touch lingers at a particular spot, draw as though ink were flowing out of the "pen" and gradually spreading through the "paper".
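
A rough sketch of that dwell-based "inking" idea, reusing the location and previousLocation ivars from the GLPaint code above and a hypothetical brushScale value that the rendering code would consult (this fragment would go inside touchesMoved:):

// Hypothetical dwell-based "ink spread": if the finger has barely moved since the
// last touchesMoved: callback, grow the brush a little, up to some maximum.
// Assumes the view declares a CGFloat ivar named brushScale used when rendering points.
static const CGFloat kDwellDistance = 2.0;   // points; below this counts as "lingering"
static const CGFloat kMaxBrushScale = 3.0;

CGFloat dx = location.x - previousLocation.x;
CGFloat dy = location.y - previousLocation.y;
if (sqrtf(dx * dx + dy * dy) < kDwellDistance) {
    brushScale = MIN(brushScale + 0.05, kMaxBrushScale);   // ink keeps spreading
} else {
    brushScale = 1.0;                                       // moving again, reset
}

One caveat: a touch that is completely stationary produces no touchesMoved: callbacks at all, so a real implementation would also need a timer (for example an NSTimer or CADisplayLink started in touchesBegan:) to keep spreading the ink while the finger is down but not moving.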


Thank you so much for your reply. Do you know whether this is possible on Android? – HappyAppDeveloper 2012-04-04 17:06:41


I don't know, sorry; I don't develop for Android. I'm sure somebody else will have an answer, though. If you can't find anything by searching, I suggest posting a new question tagged Android. – 2012-04-04 17:10:20


The Broadcom hardware in the iPad scans the screen at 64 Hz. It does this by placing a 400 µs signal in succession on each of the 39 transparent conductive layers that make up the touchscreen electrodes. If your finger moves more than one pixel within 0.015625 seconds, which is quite likely, the hardware cannot detect it, because it is busy measuring other parts of the screen for more touch events.
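
As a rough check that those figures hang together: 39 layers × 400 µs ≈ 15.6 ms for one complete scan, and 1 / 0.0156 s ≈ 64 scans per second, which is where the 0.015625 s (1/64 s) interval between touch samples comes from.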

This is the same whether it is iOS or Android. Cheap Android tablets and large screens have reduced scan rates, so their touch events are spaced further apart in time.

Wacom tablets run their digitizers at around 100 Hz, so the sequence of points will be finer, but they will still miss pixels that the stylus touched between two measurements.
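
To put those sampling rates in perspective, here is a small illustrative calculation; the drawing speed of 600 points per second is an assumption chosen for the example, not a measured figure:

// Illustrative only: how far a finger or stylus travels between consecutive touch
// samples at an assumed drawing speed, for a 64 Hz and a 100 Hz digitizer.
#import <Foundation/Foundation.h>

int main(void)
{
    @autoreleasepool {
        double speedPointsPerSec = 600.0;        // assumed drawing speed, points per second
        double sampleRates[] = { 64.0, 100.0 };  // iPad digitizer vs. a 100 Hz tablet

        for (int i = 0; i < 2; i++) {
            double gap = speedPointsPerSec / sampleRates[i];
            NSLog(@"%.0f Hz sampling: about %.1f points between samples", sampleRates[i], gap);
        }
    }
    return 0;
}

Any pixels that fall inside those gaps never appear in the touch stream, which is why interpolating along the segment between samples (as the kBrushPixelStep loop in the GLPaint code does) is the usual way to fill in the stroke.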