
Applying a CIFilter to an OpenGL render-to-texture

I want to apply a Core Image filter to my full-screen rendering output, but it looks like I am missing something, because all I get is black output.

First I draw the whole scene to a texture. Then I create a CIImage from that texture, and finally I draw and present that image. But all I get is a black screen. For drawing to a texture and for integrating Core Image with OpenGL ES, I am following Apple's guidelines: WWDC 2012 session 511 and https://developer.apple.com/library/ios/documentation/3ddrawing/conceptual/opengles_programmingguide/WorkingwithEAGLContexts/WorkingwithEAGLContexts.html

Here is the relevant code:

The renderer:

@interface Renderer() { 
    EAGLContext* _context; 
    GLuint _defaultFramebuffer, _drawFramebuffer, _depthRenderbuffer, _colorRenderbuffer, _drawTexture; 
    GLint _backingWidth, _backingHeight; 
    CIImage *_coreImage; 
    CIFilter *_coreFilter; 
    CIContext *_coreContext; 
} 

Initialization method:

- (BOOL)initOpenGL 
{ 
    _context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2]; 
    if (!_context) return NO; 

    [EAGLContext setCurrentContext:_context]; 

    // On-screen framebuffer, with the CAEAGLLayer-backed renderbuffer as its color attachment 
    glGenFramebuffers(1, &_defaultFramebuffer); 
    glBindFramebuffer(GL_FRAMEBUFFER, _defaultFramebuffer); 

    glGenRenderbuffers(1, &_colorRenderbuffer); 
    glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderbuffer); 
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, _colorRenderbuffer); 

    // Offscreen framebuffer; the scene is rendered into _drawTexture attached below 
    glGenFramebuffers(1, &_drawFramebuffer); 
    glBindFramebuffer(GL_FRAMEBUFFER, _drawFramebuffer); 

    glGenTextures(1, &_drawTexture); 
    glBindTexture(GL_TEXTURE_2D, _drawTexture); 
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, _drawTexture, 0); 

    glGenRenderbuffers(1, &_depthRenderbuffer); 
    glBindRenderbuffer(GL_RENDERBUFFER, _depthRenderbuffer); 
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, _depthRenderbuffer); 

    _coreFilter = [CIFilter filterWithName:@"CIColorInvert"]; 
    [_coreFilter setDefaults]; 

    NSDictionary *opts = @{ kCIContextWorkingColorSpace : [NSNull null] }; 
    _coreContext = [CIContext contextWithEAGLContext:_context options:opts]; 

    return YES; 
} 

Allocating storage whenever the layer size changes (on init and on orientation change):

- (void)resizeFromLayer:(CAEAGLLayer *)layer 
{ 
    layer.contentsScale = 1; 

    glBindFramebuffer(GL_FRAMEBUFFER, _defaultFramebuffer); 

    glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderbuffer); 
    [_context renderbufferStorage:GL_RENDERBUFFER fromDrawable:layer]; 

    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &_backingWidth); 
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &_backingHeight); 

    // glCheckFramebufferStatus ... SUCCESS 

    glBindFramebuffer(GL_FRAMEBUFFER, _drawFramebuffer); 

    glBindTexture(GL_TEXTURE_2D, _drawTexture); 
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, _backingWidth, _backingHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL); 

    glBindRenderbuffer(GL_RENDERBUFFER, _depthRenderbuffer); 
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, _backingWidth, _backingHeight); 

    // glCheckFramebufferStatus ... SUCCESS 
} 
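
For context, a typical call site for this method (not shown in the original post; the view class and the _renderer ivar here are assumptions) would be the hosting view's layoutSubviews, which runs both at initial layout and on orientation changes:

- (void)layoutSubviews 
{ 
    [super layoutSubviews]; 
    // Hypothetical hosting UIView whose layerClass is CAEAGLLayer: reallocate 
    // the drawable storage whenever the layer's size changes. 
    [_renderer resizeFromLayer:(CAEAGLLayer *)self.layer]; 
} 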

Draw method:

- (void)render:(Scene *)scene 
{ 
    [EAGLContext setCurrentContext:_context]; 

    glBindFramebuffer(GL_FRAMEBUFFER, _drawFramebuffer); 

    // Draw using GLKit, custom shaders, drawArrays, drawElements 
    // Now rendered scene is in _drawTexture 

    glBindFramebuffer(GL_FRAMEBUFFER, _defaultFramebuffer); 
    glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderbuffer); 

    // Create CIImage with our render-to-texture texture 
    _coreImage = [CIImage imageWithTexture:_drawTexture size:CGSizeMake(_backingWidth, _backingHeight) flipped:NO colorSpace:nil]; 

    // Ignore filtering for now; Draw CIImage to current render buffer 
    [_coreContext drawImage:_coreImage inRect:CGRectMake(0, 0, _backingWidth, _backingHeight) fromRect:CGRectMake(0, 0, _backingWidth, _backingHeight)]; 

    // Present 
    [_context presentRenderbuffer:GL_RENDERBUFFER]; 
} 

Note that after drawing the scene, _drawTexture contains the rendered scene. I verified this with the Xcode debugging tools (Capture OpenGL ES Frame).
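
Besides the frame capture, a quick programmatic check is also possible; the snippet below is only an illustrative sketch (not part of the original code) that reads one pixel back from the draw framebuffer right after the scene pass:

// Hypothetical sanity check: read the center pixel back from _drawFramebuffer 
// right after drawing the scene, to confirm the texture actually received it. 
glBindFramebuffer(GL_FRAMEBUFFER, _drawFramebuffer); 
GLubyte pixel[4]; 
glReadPixels(_backingWidth / 2, _backingHeight / 2, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, pixel); 
NSLog(@"center pixel: %d %d %d %d", pixel[0], pixel[1], pixel[2], pixel[3]); 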

EDIT: If I create the CIImage from some other texture than _drawTexture, it is displayed correctly. My suspicion is that _drawTexture may not be ready, or is somehow locked, when the CIContext tries to render it through the CIImage.
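
As an illustration of such a test (the actual replacement texture is not shown in the original post, so this one is hypothetical), a small texture uploaded from CPU memory can be fed through the same CIImage/CIContext path; per the edit above, textures like this one display correctly:

// Hypothetical test texture: a 2x2 solid-red texture uploaded from memory. 
GLuint testTexture; 
glGenTextures(1, &testTexture); 
glBindTexture(GL_TEXTURE_2D, testTexture); 
GLubyte red[16] = { 255,0,0,255,  255,0,0,255,  255,0,0,255,  255,0,0,255 }; 
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 2, 2, 0, GL_RGBA, GL_UNSIGNED_BYTE, red); 
// Without mipmaps, the default GL_NEAREST_MIPMAP_LINEAR min filter would make 
// even this texture incomplete, so use a non-mipmapped filter. 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR); 

// Draw it through the same Core Image path as _drawTexture. 
CIImage *testImage = [CIImage imageWithTexture:testTexture 
                                          size:CGSizeMake(2, 2) 
                                       flipped:NO 
                                    colorSpace:nil]; 
[_coreContext drawImage:testImage 
                 inRect:CGRectMake(0, 0, _backingWidth, _backingHeight) 
               fromRect:CGRectMake(0, 0, 2, 2)]; 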

EDIT 2: I also tried replacing all of the drawing code with just a viewport and a clear:

glViewport(0, 0, _backingWidth, _backingHeight); 
glClearColor(0, 0.8, 0, 1); 
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); 

The result is still black. This suggests that the problem is related to the draw texture or the draw framebuffer.


Why on earth would this get downvoted? –


This is interesting. Normally, I would say it's because the Core Image context wasn't created with the same OpenGL ES context used for rendering, but that looks to be set up correctly here. Can you verify that the scene renders correctly into the texture by drawing a quad to the screen with a passthrough shader and the rendered texture? Finally, if you're not married to Core Image, I have a little project here: https://github.com/BradLarson/GPUImage that can also do this kind of GPU-side post-processing. See the CubeExample sample application there, which does what you want. –


OK, I rendered a quad using _drawTexture and it is black. So it looks like something is wrong with that texture or with how it is rendered to. Maybe I am missing something in the render-to-texture setup. The only difference is that I attach a texture as GL_COLOR_ATTACHMENT0 instead of a renderbuffer. –

Answer


I finally found out what was wrong. Non-power-of-two textures on iOS must have linear filtering and clamp-to-edge wrapping:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); 

My texture is the same size as the screen (so it is not a power of two), but I had not set these four parameters.
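
Concretely, the texture setup in initOpenGL needs to look something like this (a sketch that merges the original setup with the four missing parameters):

glGenTextures(1, &_drawTexture); 
glBindTexture(GL_TEXTURE_2D, _drawTexture); 
// Required for non-power-of-two textures on OpenGL ES 2.0: non-mipmapped 
// filtering and clamp-to-edge wrapping, otherwise the texture is incomplete 
// and sampling it returns black. 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); 
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, _drawTexture, 0); 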

For posterity: the code above is a complete, working example of interoperating OpenGL ES and Core Image. Just make sure you initialize your texture correctly!