
Multiple render targets not saving data

I'm using SlimDX, targeting DirectX 11 with shader model 4. I have a pixel shader, "preProc", which processes my vertices and saves three textures of data: one for per-pixel normals, one for per-pixel position data, and one for colour and depth (the colour takes up the rgb channels and the depth sits in the alpha channel).

I then use these textures later in a post-processing shader to implement screen-space ambient occlusion, but it appears that no data is being saved by the first shader.

My "preProc" pixel shader outputs the following struct:

struct PS_OUT 
{ 
    float4 col : SV_TARGET0; 
    float4 norm : SV_TARGET1; 
    float4 pos : SV_TARGET2; 
}; 

and takes the following struct as input:

struct PS_IN 
{ 
    float4 pos : SV_POSITION; 
    float2 tex : TEXCOORD0; 
    float3 norm : TEXCOORD1; 
}; 

But in my post-processing shader:

Texture2D renderTex : register(t1); 
Texture2D normalTex : register(t2); 
Texture2D positionTex : register(t3); 
Texture2D randomTex : register(t4); 
SamplerState samLinear : register(s0); 

float4 PS(PS_IN input) : SV_Target 
{ 
    return float4(getCol(input.tex)); 
} 

it just outputs a light blue screen (the colour I clear the render targets to at the start of each frame). getCol has been tested and works when rendering with only a single render target, returning the colour from the renderTex texture. If I change the pixel shader to sample the randomTex texture instead (which my code loads from a file earlier and which is not a render target), everything renders fine, so I'm confident the problem is not in my post-processing shader.

In case the failure is in my SlimDX code, here is what I'm doing:

Creating my textures, ShaderResourceViews and RenderTargetViews:

Texture2DDescription textureDescription = new Texture2DDescription()
{
    Width = texWidth,
    Height = texHeight,
    MipLevels = 1,
    ArraySize = 3,
    Format = SlimDX.DXGI.Format.R32G32B32A32_Float,
    SampleDescription = new SlimDX.DXGI.SampleDescription(1, 0),
    BindFlags = BindFlags.RenderTarget | BindFlags.ShaderResource,
    CpuAccessFlags = CpuAccessFlags.None,
    OptionFlags = ResourceOptionFlags.None,
    Usage = ResourceUsage.Default,
};
texture = new Texture2D(device, textureDescription);

renderTargetView = new RenderTargetView[3];
shaderResourceView = new ShaderResourceView[3];

for (int i = 0; i < 3; i++)
{
    RenderTargetViewDescription renderTargetViewDescription = new RenderTargetViewDescription()
    {
        Format = textureDescription.Format,
        Dimension = RenderTargetViewDimension.Texture2D,
        MipSlice = 0,
    };

    renderTargetView[i] = new RenderTargetView(device, texture, renderTargetViewDescription);

    ShaderResourceViewDescription shaderResourceViewDescription = new ShaderResourceViewDescription()
    {
        Format = textureDescription.Format,
        Dimension = ShaderResourceViewDimension.Texture2D,
        MostDetailedMip = 0,
        MipLevels = 1
    };

    shaderResourceView[i] = new ShaderResourceView(device, texture, shaderResourceViewDescription);
}
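(Aside: because ArraySize = 3 the resource above is a single Texture2DArray, and neither view description says which array slice its view should cover. If the array resource were kept, the descriptions inside the loop would normally select a slice, roughly like the sketch below; this is only an illustration, it assumes SlimDX exposes the D3D11 FirstArraySlice / ArraySize fields on these description structs, and i is the loop index from the code above.)

RenderTargetViewDescription sliceRtvDescription = new RenderTargetViewDescription()
{
    Format = textureDescription.Format,
    Dimension = RenderTargetViewDimension.Texture2DArray, // view into an array resource
    MipSlice = 0,
    FirstArraySlice = i,   // which slice this view renders to
    ArraySize = 1,         // expose exactly one slice through this view
};

ShaderResourceViewDescription sliceSrvDescription = new ShaderResourceViewDescription()
{
    Format = textureDescription.Format,
    Dimension = ShaderResourceViewDimension.Texture2DArray,
    MostDetailedMip = 0,
    MipLevels = 1,
    FirstArraySlice = i,
    ArraySize = 1,
};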

Rendering to my multiple render targets:

private void renderToTexture(Shader shader)
{
    //set the vertex and pixel shaders
    context.VertexShader.Set(shader.VertexShader);
    context.PixelShader.Set(shader.PixelShader);

    //send texture data and a linear sampler to the shader
    context.PixelShader.SetShaderResource(texture, 0);
    context.PixelShader.SetSampler(samplerState, 0);

    //set the input assembler
    SetInputAssembler(shader);

    //reset the camera's constant buffer
    camera.ResetConstantBuffer();

    //set the render targets to the textures we will render to
    context.OutputMerger.SetTargets(depthStencilView, renderTargetViews);
    //clear the render targets and depth stencil
    foreach (RenderTargetView view in renderTargetViews)
    {
        context.ClearRenderTargetView(view, color);
    }
    context.ClearDepthStencilView(depthStencilView, DepthStencilClearFlags.Depth, 1.0f, 0);

    //draw the scene
    DrawScene();
}

And then the function where I render the post-processing shader to the screen:

private void renderTexture(Shader shader)
{
    //get a single quad to be the screen we render
    Mesh mesh = CreateScreenFace();
    //set vertex and pixel shaders
    context.VertexShader.Set(shader.VertexShader);
    context.PixelShader.Set(shader.PixelShader);
    //set the input assembler
    SetInputAssembler(shader);
    //point the render target to the screen
    context.OutputMerger.SetTargets(depthStencil, renderTarget);
    //send the rendered textures and a linear sampler to the shader
    context.PixelShader.SetShaderResource(renderTargetViews[0], 1);
    context.PixelShader.SetShaderResource(renderTargetViews[1], 2);
    context.PixelShader.SetShaderResource(renderTargetViews[2], 3);
    context.PixelShader.SetShaderResource(random, 4);
    context.PixelShader.SetSampler(samplerState, 0);
    //clear the render targets and depth stencils
    context.ClearRenderTargetView(renderTarget, new Color4(0.52734375f, 0.8046875f, 0.9765625f));
    context.ClearDepthStencilView(depthStencil, DepthStencilClearFlags.Depth, 1, 0);
    //set the vertex and index buffers from the quad
    context.InputAssembler.SetVertexBuffers(0, new VertexBufferBinding(mesh.VertexBuffer, Marshal.SizeOf(typeof(Vertex)), 0));
    context.InputAssembler.SetIndexBuffer(mesh.IndexBuffer, Format.R16_UInt, 0);
    //draw the quad
    context.DrawIndexed(mesh.indices, 0, 0);
    //dispose of the buffers
    mesh.VertexBuffer.Dispose();
    mesh.IndexBuffer.Dispose();
}
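For context, this is the per-frame call order implied by the two functions and the PIX capture below, written as a sketch; the clear and Present calls and the shader variable names (preProcShader, postProcShader, swapChain) are reconstructions, not code from the question:

// assumed frame loop, reconstructed from the PIX capture further down
context.ClearRenderTargetView(renderTarget, color);                                  // clear the backbuffer
context.ClearDepthStencilView(depthStencil, DepthStencilClearFlags.Depth, 1.0f, 0);
renderToTexture(preProcShader);   // fill the three MRT textures (normals / positions / colour+depth)
renderTexture(postProcShader);    // draw the full-screen quad that samples them
swapChain.Present(0, PresentFlags.None);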

EDIT: I've added the PIX function-call output for a single frame of the current run:

Frame 40 
//setup 
<0x06BDA1D8> ID3D11DeviceContext::ClearRenderTargetView(0x06B66190, 0x0028F068) 
<0x06BDA1D8> ID3D11DeviceContext::ClearDepthStencilView(0x06B66138, 1, 1.000f, 0) 
<0x0059FF78> ID3D11Device::CreateBuffer(0x0028F010, 0x0028EFF8, 0x0028F00C --> 0x06BF8EE0) 
CreateObject(D3D11 Buffer, 0x06BF8EE0) 
<0x06BDA1D8> ID3D11DeviceContext::PSSetConstantBuffers(0, 1, 0x0028F084 --> { 0x06BF8EE0 }) 
<0x0059FF78> ID3D11Device::CreateBuffer(0x0F8DEB58, 0x0F8DEB40, 0x0F8DEB54 --> 0x06BF8F68) 
CreateObject(D3D11 Buffer, 0x06BF8F68) 
<0x0059FF78> ID3D11Device::CreateBuffer(0x0F70EAD8, 0x0F70EAC0, 0x0F70EAD4 --> 0x06BF8FF0) 
CreateObject(D3D11 Buffer, 0x06BF8FF0) 
<0x0059FF78> ID3D11Device::CreateBuffer(0x0FAAE9A8, 0x0FAAE990, 0x0FAAE9A4 --> 0x06BF9078) 
CreateObject(D3D11 Buffer, 0x06BF9078) 
<0x0059FF78> ID3D11Device::GetImmediateContext(0x06BDA1D8 --> 0x5BA8A8D8) 
<0x0059FF78> ID3D11Device::CreateBuffer(0x0F8DEB58, 0x0F8DEB40, 0x0F8DEB54 --> 0x06BF9100) 
CreateObject(D3D11 Buffer, 0x06BF9100) 
<0x0059FF78> ID3D11Device::CreateBuffer(0x0F70EAD8, 0x0F70EAC0, 0x0F70EAD4 --> 0x06BF9188) 
CreateObject(D3D11 Buffer, 0x06BF9188) 
<0x06BDA1D8> ID3D11DeviceContext::Release() 
<0x06BDA1D8> ID3D11DeviceContext::UpdateSubresource(0x06B59270, 0, NULL, 0x06287FA0, 0, 0) 
<0x0059FF78> ID3D11Device::CreateBuffer(0x0FAAE9A8, 0x0FAAE990, 0x0FAAE9A4 --> 0x06BF9210) 
CreateObject(D3D11 Buffer, 0x06BF9210) 
<0x06BDA1D8> ID3D11DeviceContext::VSSetShader(0x06B66298, NULL, 0) 
<0x0059FF78> ID3D11Device::CreateBuffer(0x0FC0E978, 0x0FC0E960, 0x0FC0E974 --> 0x06BF9298) 
CreateObject(D3D11 Buffer, 0x06BF9298) 
<0x0059FF78> ID3D11Device::CreateBuffer(0x0FE8EDE8, 0x0FE8EDD0, 0x0FE8EDE4 --> 0x06BF9320) 
CreateObject(D3D11 Buffer, 0x06BF9320) 
<0x06BDA1D8> ID3D11DeviceContext::PSSetShader(0x06B666F8, NULL, 0) 
<0x0059FF78> ID3D11Device::CreateBuffer(0x0FC0E978, 0x0FC0E960, 0x0FC0E974 --> 0x06BF93A8) 
CreateObject(D3D11 Buffer, 0x06BF93A8) 
<0x0059FF78> ID3D11Device::CreateBuffer(0x0FE8EDE8, 0x0FE8EDD0, 0x0FE8EDE4 --> 0x06BF9430) 
CreateObject(D3D11 Buffer, 0x06BF9430) 
<0x0059FF78> ID3D11Device::CreateInputLayout(0x0028EBE0, 3, 0x06286CB8, 152, 0x0028EBD8 --> 0x06BF9D68) 
CreateObject(D3D11 Input Layout, 0x06BF9D68) 
<0x06BDA1D8> ID3D11DeviceContext::IASetInputLayout(0x06BF9D68) 
<0x06BDA1D8> ID3D11DeviceContext::IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST) 
<0x0059FF78> ID3D11Device::GetImmediateContext(0x06BDA1D8 --> 0x5BA8A8D8) 
<0x06BDA1D8> ID3D11DeviceContext::Release() 
<0x06BDA1D8> ID3D11DeviceContext::VSSetConstantBuffers(0, 1, 0x0028F024 --> { 0x06B59270 }) 
<0x06BDA1D8> ID3D11DeviceContext::OMSetRenderTargets(3, 0x0028F004 --> { 0x06B65708, 0x06B657B8, 0x06B582E0 }, 0x06B66138) 
<0x06BDA1D8> ID3D11DeviceContext::ClearRenderTargetView(0x06B65708, 0x0028EFEC) 
<0x06BDA1D8> ID3D11DeviceContext::ClearRenderTargetView(0x06B657B8, 0x0028EFEC) 
<0x06BDA1D8> ID3D11DeviceContext::ClearRenderTargetView(0x06B582E0, 0x0028EFEC) 
<0x06BDA1D8> ID3D11DeviceContext::ClearDepthStencilView(0x06B66138, 1, 1.000f, 0) 
//draw scene for preproc shader (this should output the three render targets) 
//DRAW CALLS HIDDEN 
<0x0059FF78> ID3D11Device::CreateBuffer(0x0028EE04, 0x0028EDEC, 0x0028EE00 --> 0x06BF94B8) 
CreateObject(D3D11 Buffer, 0x06BF94B8) 
<0x0059FF78> ID3D11Device::CreateBuffer(0x0028EE04, 0x0028EDEC, 0x0028EE00 --> 0x06BF9540) 
CreateObject(D3D11 Buffer, 0x06BF9540) 
<0x06BDA1D8> ID3D11DeviceContext::VSSetShader(0x06B66BB8, NULL, 0) 
<0x06BDA1D8> ID3D11DeviceContext::PSSetShader(0x06B66E50, NULL, 0) 
<0x0059FF78> ID3D11Device::CreateInputLayout(0x0028EB64, 3, 0x05E988E0, 120, 0x0028EB5C --> 0x06BF9E28) 
CreateObject(D3D11 Input Layout, 0x06BF9E28) 
<0x06BDA1D8> ID3D11DeviceContext::IASetInputLayout(0x06BF9E28) 
<0x06BDA1D8> ID3D11DeviceContext::IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST) 
<0x06BDA1D8> ID3D11DeviceContext::OMSetRenderTargets(1, 0x0028EFC0 --> { 0x06B66190 }, 0x06B66138) 
<0x06BDA1D8> ID3D11DeviceContext::PSSetShaderResources(1, 3, 0x0028EF3C --> { 0x06B65760, 0x06B58288, 0x06B58338 }) 
<0x06BDA1D8> ID3D11DeviceContext::PSSetShaderResources(4, 1, 0x0028EFC0 --> { 0x06B66FA0 }) 
<0x06BDA1D8> ID3D11DeviceContext::ClearRenderTargetView(0x06B66190, 0x0028EFA4) 
<0x06BDA1D8> ID3D11DeviceContext::ClearDepthStencilView(0x06B66138, 1, 1.000f, 0) 
<0x06BDA1D8> ID3D11DeviceContext::IASetVertexBuffers(0, 1, 0x0028EFAC --> { 0x06BF94B8 }, 0x0028EFB0, 0x0028EFB4) 
<0x06BDA1D8> ID3D11DeviceContext::IASetIndexBuffer(0x06BF9540, DXGI_FORMAT_R16_UINT, 0) 
//draw quad for post proc shader. This shader takes the three textures in, as well as a random texture, which is added in the second PSSetShaderResources call. The random texture outputs fine. 
<0x06BDA1D8> ID3D11DeviceContext::DrawIndexed(6, 0, 0) 
<0x06BF94B8> ID3D11Buffer::Release() 
<0x06BF9540> ID3D11Buffer::Release() 
<0x06B65B00> IDXGISwapChain::Present(0, 0) 

EDIT2: I've been doing some reading, and perhaps I need to unbind the textures as render targets after the preProc pass, before I pass them to my post-process shader as ShaderResourceViews. I assumed that calling context.OutputMerger.SetTargets() would unbind all currently bound render targets and then bind only the render targets specified in the call's parameters. If that isn't the case (and I can't yet tell whether it is), how would I go about unbinding the render targets in SlimDX?

EDIT3: Well, according to this MSDN page, calling OutputMerger.SetRenderTargets() "overwrites all bound render targets and the depth stencil target regardless of the number of render targets in ppRenderTargetViews." So all of the render targets are unbound automatically as soon as I tell the OutputMerger to render to the screen, which puts me back to square one.
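For reference, if the textures ever did need to be unbound explicitly in SlimDX (for example to keep the debug layer from warning about read/write hazards when they are bound as render targets again next frame), clearing the pixel-shader resource slots would look roughly like the sketch below. This assumes SlimDX accepts null for a shader resource binding, which I have not verified against this exact version:

// a sketch only: clear the pixel-shader resource slots (1-3) that the post-process
// shader used, before the textures are bound as render targets again
for (int slot = 1; slot <= 3; slot++)
{
    context.PixelShader.SetShaderResource((ShaderResourceView)null, slot);
}
// the render target bindings themselves are simply replaced by the next
// OutputMerger.SetTargets() call, per the MSDN note quoted above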

Answer


Fixed it by discovering how stupid I was being.

When I created my render target texture, I was creating a single Texture2DArray but then treating it as three separate Texture2D objects rather than as one array object. I've since changed my code to use an array of separate Texture2D objects, and it works fine.
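For reference, a minimal sketch of what that change might look like: three independent Texture2D resources, each with its own render target view and shader resource view. The description fields mirror the question's setup; the textures array name and the use of the parameterless view constructors are my own choices, not the original code.

Texture2DDescription textureDescription = new Texture2DDescription()
{
    Width = texWidth,
    Height = texHeight,
    MipLevels = 1,
    ArraySize = 1,    // one slice per texture now, instead of a 3-slice array
    Format = SlimDX.DXGI.Format.R32G32B32A32_Float,
    SampleDescription = new SlimDX.DXGI.SampleDescription(1, 0),
    BindFlags = BindFlags.RenderTarget | BindFlags.ShaderResource,
    CpuAccessFlags = CpuAccessFlags.None,
    OptionFlags = ResourceOptionFlags.None,
    Usage = ResourceUsage.Default,
};

textures = new Texture2D[3];
renderTargetView = new RenderTargetView[3];
shaderResourceView = new ShaderResourceView[3];

for (int i = 0; i < 3; i++)
{
    // each target is a separate resource, so the default (whole-resource) views are enough
    textures[i] = new Texture2D(device, textureDescription);
    renderTargetView[i] = new RenderTargetView(device, textures[i]);
    shaderResourceView[i] = new ShaderResourceView(device, textures[i]);
}

The alternative would have been to keep the single Texture2DArray and give every view a Texture2DArray dimension with its own FirstArraySlice, as in the per-slice sketch earlier; either way, each render target view needs to refer to its own slice of memory rather than all three views being created against the same resource with no slice specified.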