
I'm rendering YUV frames decoded with ffmpeg in OpenGL, using the iOS 5.0 method CVOpenGLESTextureCacheCreateTextureFromImage. The rendering uses a CVPixelBufferRef and shaders.

I'm following an approach like Apple's GLCameraRipple example.

The result I get on the iPhone screen looks like this: iPhone Screen

I need to know what I'm doing wrong.

I'm posting the parts of my code where I think the error is.

The ffmpeg frame setup:

ctx->p_sws_ctx = sws_getContext(ctx->p_video_ctx->width,
                                ctx->p_video_ctx->height,
                                ctx->p_video_ctx->pix_fmt,
                                ctx->p_video_ctx->width,
                                ctx->p_video_ctx->height,
                                PIX_FMT_YUV420P, SWS_FAST_BILINEAR, NULL, NULL, NULL);

// Buffer for the converted YUV420P frame
// (note: despite its name, p_picture_rgb holds YUV420P data here)
ctx->p_frame_buffer = malloc(avpicture_get_size(PIX_FMT_YUV420P,
                                                ctx->p_video_ctx->width,
                                                ctx->p_video_ctx->height));

avpicture_fill((AVPicture *)ctx->p_picture_rgb, ctx->p_frame_buffer, PIX_FMT_YUV420P,
               ctx->p_video_ctx->width,
               ctx->p_video_ctx->height);
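
Since the pixel buffer below is created as kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange (NV12: a Y plane followed by one interleaved CbCr plane), one option is to have swscale emit NV12 directly instead of tri-planar YUV420P. A minimal sketch under that assumption, reusing the fields above (PIX_FMT_NV12 is swscale's name for this layout; ctx->p_frame stands in for the decoded AVFrame):

// Sketch: convert decoded frames straight to NV12 so the memory layout
// matches the biplanar CoreVideo pixel format used for the texture cache.
ctx->p_sws_ctx = sws_getContext(ctx->p_video_ctx->width,
                                ctx->p_video_ctx->height,
                                ctx->p_video_ctx->pix_fmt,
                                ctx->p_video_ctx->width,
                                ctx->p_video_ctx->height,
                                PIX_FMT_NV12, SWS_FAST_BILINEAR, NULL, NULL, NULL);

ctx->p_frame_buffer = malloc(avpicture_get_size(PIX_FMT_NV12,
                                                ctx->p_video_ctx->width,
                                                ctx->p_video_ctx->height));

avpicture_fill((AVPicture *)ctx->p_picture_rgb, ctx->p_frame_buffer, PIX_FMT_NV12,
               ctx->p_video_ctx->width,
               ctx->p_video_ctx->height);

// Per decoded frame (ctx->p_frame is an assumed name for the source AVFrame):
sws_scale(ctx->p_sws_ctx,
          (const uint8_t * const *)ctx->p_frame->data, ctx->p_frame->linesize,
          0, ctx->p_video_ctx->height,
          ctx->p_picture_rgb->data, ctx->p_picture_rgb->linesize);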

My render method:

if (NULL == videoTextureCache) { 
    NSLog(@"displayPixelBuffer error"); 
    return; 
}  


CVPixelBufferRef pixelBuffer;
CVPixelBufferCreateWithBytes(kCFAllocatorDefault, mTexW, mTexH,
                             kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                             buffer, mFrameW * 3, NULL, 0, NULL, &pixelBuffer);
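// Caution: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange is NV12 (a Y
// plane plus one interleaved CbCr plane), while the swscale output above
// is tri-planar YUV420P, and a bytes-per-row of mFrameW * 3 matches
// neither layout (plane 0 of NV12 is mFrameW bytes per row). See the
// planar-bytes sketch after this method.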



CVReturn err;  
// Y-plane 
glActiveTexture(GL_TEXTURE0); 
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, 
                videoTextureCache, 
                pixelBuffer, 
                NULL, 
                GL_TEXTURE_2D, 
                GL_RED_EXT, 
                mTexW, 
                mTexH, 
                GL_RED_EXT, 
                GL_UNSIGNED_BYTE, 
                0, 
                &_lumaTexture); 
if (err) 
{ 
    NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err); 
} 

glBindTexture(CVOpenGLESTextureGetTarget(_lumaTexture), CVOpenGLESTextureGetName(_lumaTexture)); 
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); 
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);  

// UV-plane 
glActiveTexture(GL_TEXTURE1); 
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, 
                videoTextureCache, 
                pixelBuffer, 
                NULL, 
                GL_TEXTURE_2D, 
                GL_RG_EXT, 
                mTexW/2, 
                mTexH/2, 
                GL_RG_EXT, 
                GL_UNSIGNED_BYTE, 
                1, 
                &_chromaTexture); 
if (err) 
{ 
    NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err); 
} 

glBindTexture(CVOpenGLESTextureGetTarget(_chromaTexture), CVOpenGLESTextureGetName(_chromaTexture)); 
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); 
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);  
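
// For biplanar buffers, the plane index argument passed above selects the
// plane: 0 is the Y plane, 1 is the interleaved CbCr plane.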

glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer); 

// Set the viewport to the entire view
glViewport(0, 0, backingWidth, backingHeight); 

static const GLfloat squareVertices[] = { 
    1.0f, 1.0f, 
    -1.0f, 1.0f, 
    1.0f, -1.0f, 
    -1.0f, -1.0f, 
}; 

GLfloat textureVertices[] = { 
    1, 1, 
    1, 0, 
    0, 1, 
    0, 0, 
}; 

// Draw the texture on the screen with OpenGL ES 2 
[self renderWithSquareVertices:squareVertices textureVertices:textureVertices]; 


// Flush the CVOpenGLESTexture cache and release the texture 
CVOpenGLESTextureCacheFlush(videoTextureCache, 0);  
CVPixelBufferRelease(pixelBuffer);  

[moviePlayerDelegate bufferDone]; 
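
As a side note, CoreVideo has a planar-aware creation call, CVPixelBufferCreateWithPlanarBytes, that takes one base address per plane. A minimal sketch of wrapping an NV12 frame with it, assuming hypothetical yPlane/uvPlane pointers (uvPlane = yPlane + mTexW * mTexH for a tightly packed frame):

CVPixelBufferRef pixelBuffer = NULL;
void  *planeAddresses[2]   = { yPlane, uvPlane };
size_t planeWidths[2]      = { mTexW, mTexW / 2 };
size_t planeHeights[2]     = { mTexH, mTexH / 2 };
size_t planeBytesPerRow[2] = { mTexW, mTexW }; // plane 1: mTexW/2 CbCr pairs per row

CVReturn err = CVPixelBufferCreateWithPlanarBytes(kCFAllocatorDefault,
                              mTexW, mTexH,
                              kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                              NULL, 0,        // no single contiguous block
                              2,              // number of planes
                              planeAddresses, planeWidths, planeHeights,
                              planeBytesPerRow,
                              NULL, NULL,     // release callback + refCon
                              NULL,           // pixel buffer attributes
                              &pixelBuffer);

Note that on iOS the texture cache generally wants IOSurface-backed pixel buffers, so buffers wrapped around client memory like this (or with CVPixelBufferCreateWithBytes) may be rejected; allocating with CVPixelBufferCreate plus kCVPixelBufferIOSurfacePropertiesKey and copying the planes in is the safer route.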

The renderWithSquareVertices method:

- (void)renderWithSquareVertices:(const GLfloat *)squareVertices textureVertices:(const GLfloat *)textureVertices
{
    // Use shader program.
    glUseProgram(shader.program);

    // Update attribute values.
    glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, 0, 0, squareVertices);
    glEnableVertexAttribArray(ATTRIB_VERTEX);
    glVertexAttribPointer(ATTRIB_TEXTUREPOSITON, 2, GL_FLOAT, 0, 0, textureVertices);
    glEnableVertexAttribArray(ATTRIB_TEXTUREPOSITON);

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    // Present
    glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER];
}

My fragment shader:

uniform sampler2D SamplerY;
uniform sampler2D SamplerUV;

varying highp vec2 _texcoord;

void main()
{
    mediump vec3 yuv;
    lowp vec3 rgb;

    yuv.x = texture2D(SamplerY, _texcoord).r;
    yuv.yz = texture2D(SamplerUV, _texcoord).rg - vec2(0.5, 0.5);

    // BT.601, the standard for SDTV, is provided as a reference:
    /* rgb = mat3(     1,        1,     1,
                       0,  -.34413, 1.772,
                   1.402,  -.71414,     0) * yuv; */

    // Using BT.709, the standard for HDTV:
    rgb = mat3(      1,        1,      1,
                     0,  -.18732, 1.8556,
               1.57481,  -.46813,      0) * yuv;

    gl_FragColor = vec4(rgb, 1);
}
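
Incidentally, solid green output is exactly what this matrix produces when both samplers read zero (yuv = (0, -0.5, -0.5) maps to roughly RGB (0, 0.33, 0) after clamping), which points at the textures not receiving valid data rather than at the matrix itself. A quick CPU-side sanity check of the same BT.709 coefficients, as a minimal standalone sketch:

#include <stdio.h>

// Same BT.709 matrix as the shader (no video-range scaling; chroma is
// already centered on zero before the multiply).
static void yuv709_to_rgb(float y, float u, float v, float rgb[3])
{
    rgb[0] = y                + 1.57481f * v;  // R
    rgb[1] = y - 0.18732f * u - 0.46813f * v;  // G
    rgb[2] = y + 1.8556f  * u;                 // B
}

int main(void)
{
    float rgb[3];
    yuv709_to_rgb(0.5f, 0.0f, 0.0f, rgb);   // mid-gray in: expect 0.5 0.5 0.5
    printf("gray: %.3f %.3f %.3f\n", rgb[0], rgb[1], rgb[2]);
    yuv709_to_rgb(0.0f, -0.5f, -0.5f, rgb); // zeroed textures: the "green screen"
    printf("zero: %.3f %.3f %.3f\n", rgb[0], rgb[1], rgb[2]);
    return 0;
}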

Thanks very much,

What kind of video are you decoding? Are you doing the video decoding with FFmpeg's libavcodec, or with iOS's decoding facilities? – 2012-03-06 19:26:10

So what exactly is wrong with this app? – karlphillip 2012-03-14 18:50:13

Hello, I'm trying to do the same thing and I also get a green screen. Did you find a solution to the problem? Thanks! – cpprulez 2012-08-01 11:55:02

Answer

I imagine the problem is that YUV420 (or I420) is a tri-planar image format: an 8-bit Y plane followed by 8-bit, 2x2-subsampled U and V planes. The code from GLCameraRipple expects NV12: an 8-bit Y plane followed by an interleaved U/V plane with 2x2 subsampling. Given that, I'd expect you to need three textures: luma_tex, u_chroma_tex, v_chroma_tex.
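
To make the layout difference concrete, here is a minimal sketch of repacking I420's separate U and V planes into the single interleaved CbCr plane that NV12 (and the biplanar CoreVideo formats) expect; all names are illustrative and the planes are assumed unpadded:

#include <stddef.h>
#include <stdint.h>

// Repack tri-planar I420 chroma into NV12's interleaved UV plane.
// width/height are the luma dimensions; chroma planes are 2x2 subsampled.
static void i420_uv_to_nv12_uv(const uint8_t *u_plane, const uint8_t *v_plane,
                               uint8_t *uv_interleaved,
                               size_t width, size_t height)
{
    size_t chroma_samples = (width / 2) * (height / 2);
    for (size_t i = 0; i < chroma_samples; i++) {
        uv_interleaved[2 * i]     = u_plane[i]; // Cb
        uv_interleaved[2 * i + 1] = v_plane[i]; // Cr
    }
}

In practice it is usually simpler to ask swscale for NV12 output directly than to repack by hand.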

Also note that GLCameraRipple expects "video range". In other words, for the planar format the values are luma = [16, 235] and chroma = [16, 240].
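
For reference, expanding video-range ("MPEG range") samples to full range before the matrix multiply looks like this; a minimal sketch (the same scaling can equally be folded into the shader constants):

// Video range to full range, per 8-bit sample.
static float expand_luma(uint8_t y)   { return (y - 16)  / 219.0f; }
static float expand_chroma(uint8_t c) { return (c - 128) / 224.0f; } // centered on 0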

Do you mean kCVPixelFormatType_420YpCbCr8BiPlanarFullRange is NV12? – onmyway133 2013-06-17 09:19:46