
I am trying to create textures from a CVPixelBuffer that I get from a video stream, converting from 420YpCbCr8 to RGB:

NSDictionary* videoOutputOptions = @{ (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] }; 
self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:videoOutputOptions];
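For context, this output has to be attached to the player item with [playerItem addOutput:self.videoOutput], and buffers are then pulled once per frame, typically from a CADisplayLink callback. A minimal sketch, assuming that display-link wiring and the displayPixelBuffer:frameSize: method shown in the edit below:

- (void)displayLinkCallback:(CADisplayLink *)sender
{
    // Ask the output whether a new frame is ready for the current host time.
    CMTime outputItemTime = [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([self.videoOutput hasNewPixelBufferForItemTime:outputItemTime]) {
        CVPixelBufferRef pixelBuffer = [self.videoOutput copyPixelBufferForItemTime:outputItemTime
                                                                 itemTimeForDisplay:NULL];
        CGSize frameSize = CGSizeMake(CVPixelBufferGetWidth(pixelBuffer),
                                      CVPixelBufferGetHeight(pixelBuffer));
        // displayPixelBuffer:frameSize: (shown below) releases the buffer via CFRelease.
        [self displayPixelBuffer:pixelBuffer frameSize:frameSize];
    }
}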

From the resulting pixel buffer I get the textures I want to use. For the conversion I use this fragment shader:

varying lowp vec2 v_texCoord; 
precision mediump float; 

uniform sampler2D SamplerUV; 
uniform sampler2D SamplerY; 
uniform mat3 colorConversionMatrix; 

void main() 
{ 
    mediump vec3 yuv; 
    lowp vec3 rgb; 

    // Subtract constants to map the video range start at 0 
    yuv.x = (texture2D(SamplerY, v_texCoord).r - (16.0/255.0)); 
    yuv.yz = (texture2D(SamplerUV, v_texCoord).rg - vec2(0.5, 0.5)); 

    rgb = colorConversionMatrix * yuv; 

    gl_FragColor = vec4(rgb,1); 
} 

and the conversion matrix is

// BT.709, which is the standard for HDTV. 
static const GLfloat kColorConversion709[] = { 
    1.164, 1.164, 1.164, 
    0.0, -0.213, 2.112, 
    1.793, -0.533, 0.0, 
}; 

But as a result I get a greenish texture, which I assume means I am using an incorrect conversion. My result:

(image: the greenish output frame)

So I also tried switching the conversion matrix to another one - here. I also tried the variant from this resource. But it looks like the problem may not only be in the conversion, but in the fragment shader as well?
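For reference, the BT.601 video-range matrix that usually accompanies the BT.709 one above (the same column-major layout, as used in Apple's sample code) is:

// BT.601, which is the standard for SDTV.
static const GLfloat kColorConversion601[] = {
    1.164,  1.164, 1.164,
    0.0,   -0.392, 2.017,
    1.596, -0.813, 0.0,
};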

Any suggestions as to why I am getting a green image?

EDIT

Here is the method I use to get the textures (based on Apple's AVBasicVideoOutput sample, this):

- (void)displayPixelBuffer:(CVPixelBufferRef)pixelBuffer frameSize:(CGSize)presentationSize
{
    CVReturn err;
    if (pixelBuffer != NULL) {
        int frameWidth = (int)CVPixelBufferGetWidth(pixelBuffer);
        int frameHeight = (int)CVPixelBufferGetHeight(pixelBuffer);

        if (!_videoTextureCache) {
            NSLog(@"No video texture cache");
            return;
        }
        [self cleanUpTextures];

        // Use the color attachment of the pixel buffer to determine the appropriate color conversion matrix.
        CFTypeRef colorAttachments = CVBufferGetAttachment(pixelBuffer, kCVImageBufferYCbCrMatrixKey, NULL);

        if (colorAttachments == kCVImageBufferYCbCrMatrix_ITU_R_601_4) {
            _preferredConversion = kColorConversion601;
        }
        else {
            _preferredConversion = kColorConversion709;
        }

        // CVOpenGLESTextureCacheCreateTextureFromImage will create GLES texture optimally from CVPixelBufferRef.
        // Create Y and UV textures from the pixel buffer. These textures will be drawn on the frame buffer Y-plane.
        glActiveTexture(GL_TEXTURE0);
        err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _videoTextureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_LUMINANCE, frameWidth, frameHeight, GL_LUMINANCE, GL_UNSIGNED_BYTE, 0, &_lumaTexture);
        if (err) {
            NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
        }

        glBindTexture(CVOpenGLESTextureGetTarget(_lumaTexture), CVOpenGLESTextureGetName(_lumaTexture));
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

        // UV-plane.
        glActiveTexture(GL_TEXTURE1);
        err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _videoTextureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_LUMINANCE_ALPHA, frameWidth/2, frameHeight/2, GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, 1, &_chromaTexture);
        if (err) {
            NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
        }

        glBindTexture(CVOpenGLESTextureGetTarget(_chromaTexture), CVOpenGLESTextureGetName(_chromaTexture));
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

        glBindFramebuffer(GL_FRAMEBUFFER, _vertexBufferID);

        CFRelease(pixelBuffer);
    }
}

EDIT

If someone downvotes (-1), please add a comment explaining why and what is wrong. Maybe the problem is obvious and simple for you, but for others it is not.


Note that in the code above there is a problem with the comparison of colorAttachments against the constant string kCVImageBufferYCbCrMatrix_ITU_R_601_4. The correct approach is to use the test (CFStringCompare(colorAttachments, kCVImageBufferYCbCrMatrix_ITU_R_601_4, 0) == kCFCompareEqualTo). – MoDJ
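Applied to the code in the edit above, that suggestion would look roughly like this (a sketch of the corrected comparison, not the original code):

CFTypeRef colorAttachments = CVBufferGetAttachment(pixelBuffer, kCVImageBufferYCbCrMatrixKey, NULL);

// Compare the attachment by string value rather than by pointer identity.
if (colorAttachments != NULL &&
    CFStringCompare((CFStringRef)colorAttachments, kCVImageBufferYCbCrMatrix_ITU_R_601_4, 0) == kCFCompareEqualTo) {
    _preferredConversion = kColorConversion601;
}
else {
    _preferredConversion = kColorConversion709;
}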

Answer


Finally I found my mistake, a stupid typo:

Instead of SamplerUV I had written SamplerUY in the code when assigning the uniform. After changing it, everything works perfectly!
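For anyone hitting the same thing: the names passed from Objective-C have to match the sampler names in the fragment shader exactly. A minimal sketch of that uniform setup, assuming the shader program handle is called _program:

// The strings here must match the uniform names in the fragment shader.
GLint samplerYLoc  = glGetUniformLocation(_program, "SamplerY");
GLint samplerUVLoc = glGetUniformLocation(_program, "SamplerUV"); // was mistyped as "SamplerUY"
GLint matrixLoc    = glGetUniformLocation(_program, "colorConversionMatrix");

glUseProgram(_program);
glUniform1i(samplerYLoc, 0);   // luma texture is bound to GL_TEXTURE0
glUniform1i(samplerUVLoc, 1);  // chroma texture is bound to GL_TEXTURE1
glUniformMatrix3fv(matrixLoc, 1, GL_FALSE, _preferredConversion);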

You can also use this conversion matrix:

1.1643,  0.0000,  1.2802,
1.1643, -0.2148, -0.3806,
1.1643,  2.1280,  0.0000

So if you get an incorrect image, check each of the planes separately, chroma and luma, because one of them is wrong.
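One quick way to do that check is to temporarily output each plane on its own in the fragment shader, for example by replacing the last line of the shader above:

// Debug: show only the luma plane (should look like a sharp grayscale image)...
gl_FragColor = vec4(vec3(texture2D(SamplerY, v_texCoord).r), 1.0);

// ...or only the chroma plane (should look like a blurry false-colour image).
// gl_FragColor = vec4(texture2D(SamplerUV, v_texCoord).rg, 0.5, 1.0);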

Maybe this information will be useful to someone.