
SDL, OpenGL: segmentation fault only with the NVIDIA driver on Debian

When I try to generate an OpenGL texture, I get a segmentation fault in this method:

void RendererGL::create_gl(SDL_Surface * surf, GLuint * tex) { 
    GLenum format; 
    GLint colors_amount = surf->format->BytesPerPixel; 

    if (colors_amount == 4) { 
      if (surf->format->Rmask == 0x000000ff) 
        format = GL_RGBA; 
      else 
        format = GL_BGRA; 
    } 
    else if (colors_amount == 3) { 
      if (surf->format->Rmask == 0x000000ff) 
        format = GL_RGB; 
      else 
        format = GL_BGR; 
    } 
    else { 
     gCritical("Image is not truecolor"); 
    } 
    glGenTextures(1, tex); 

    glBindTexture(GL_TEXTURE_2D, *tex); 

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); 
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR); 

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); 
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); 

    glTexImage2D(GL_TEXTURE_2D, 0, colors_amount, surf->w, surf->h, 0, format, 
       GL_UNSIGNED_BYTE, surf->pixels); 
}

I have this problem only with the NVIDIA driver; with the open-source driver the application runs fine. The Valgrind test reports:

[ Info ] RendererGL::Init GL 
==13033== Thread 3: 
==13033== Invalid read of size 8 
==13033== at 0x51293C9: ??? (in /usr/lib/x86_64-linux-gnu/nvidia/current/libGL.so.304.64) 
==13033== by 0x419B0F: RendererGL::initGL() (RendererGL.cpp:12) 
==13033== by 0x41036C: Renderer::Renderer() (Renderer.cpp:27) 
==13033== by 0x4058EA: Renderer::getInstance() (Renderer.hpp:17) 
==13033== by 0x40B648: StandardReferences::StandardReferences() (StandardReferences.hpp:13) 
==13033== by 0x414C26: Map::Map(short**, unsigned short, unsigned int, unsigned int) (Map.cpp:8) 
==13033== by 0x4136E4: MapManager::loadMapFromFile(std::string, short) (MapManager.cpp:82) 
==13033== by 0x4138A0: MapManager::load() (MapManager.cpp:97) 
==13033== by 0x4084BA: Resource::load() (Resource.cpp:36) 
==13033== by 0x4196AF: Splash::initThread(void*) (Splash.cpp:99) 
==13033== by 0x53D0405: ??? (in /usr/lib/x86_64-linux-gnu/libSDL-1.2.so.0.11.4) 
==13033== by 0x5413898: ??? (in /usr/lib/x86_64-linux-gnu/libSDL-1.2.so.0.11.4) 
==13033== Address 0x658 is not stack'd, malloc'd or (recently) free'd 
==13033== 
==13033== 
==13033== HEAP SUMMARY: 
==13033==  in use at exit: 22,104,451 bytes in 31,729 blocks 
==13033== total heap usage: 47,499 allocs, 15,770 frees, 54,869,707 bytes allocated 
==13033== 
==13033== LEAK SUMMARY: 
==13033== definitely lost: 833 bytes in 8 blocks 
==13033== indirectly lost: 1,728 bytes in 38 blocks 
==13033==  possibly lost: 269,732 bytes in 99 blocks 
==13033== still reachable: 21,832,158 bytes in 31,584 blocks 
==13033==   suppressed: 0 bytes in 0 blocks 
==13033== Rerun with --leak-check=full to see details of leaked memory 
==13033== 
==13033== For counts of detected and suppressed errors, rerun with: -v 
==13033== Use --track-origins=yes to see where uninitialised values come from 
==13033== ERROR SUMMARY: 12 errors from 2 contexts (suppressed: 6 from 6) 

The third argument of glTexImage2D, which you supply through the variable 'colors_amount', does not accept arbitrary values. According to the specification it must be one of the format tokens listed in the glTexImage2D reference; however, an invalid token should not cause a segfault. – datenwolf
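For illustration, a minimal sketch of what this comment suggests, reusing the surf and format variables from create_gl above; the internal_format variable is introduced here only for the example:

    // Pass a format token (GL_RGB/GL_RGBA) as the internalFormat argument
    // instead of the raw byte count stored in colors_amount.
    GLint internal_format = (surf->format->BytesPerPixel == 4) ? GL_RGBA : GL_RGB; 

    glTexImage2D(GL_TEXTURE_2D, 0, internal_format, surf->w, surf->h, 0, format, 
       GL_UNSIGNED_BYTE, surf->pixels); 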


Try adding this at the start of your main: 'setenv("MESA_DEBUG", "", 0);'. It enables debug/warning messages from Mesa/OpenGL (they are written to stdout/stderr). I usually just wrap it in '#ifdef DEBUG' and include it in every GL program. – QuasarDonkey
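A small sketch of that setup, assuming an ordinary main and the POSIX setenv from <cstdlib>:

    #include <cstdlib>  // setenv 

    int main(int argc, char *argv[]) { 
    #ifdef DEBUG 
      // Ask Mesa to print its debug/warning messages to stdout/stderr. 
      // Note: this only has an effect with Mesa-based (open-source) drivers. 
      setenv("MESA_DEBUG", "", 0); 
    #endif 
      // ... SDL/OpenGL initialisation and the rest of the program ... 
      return 0; 
    } 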


The only thing I can think of is that your texture may have non-power-of-two dimensions and your NVIDIA card does not support that. I'm just guessing. – QuasarDonkey
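If you want to rule that out, here is a small helper one could call on the surface before uploading it in create_gl; the is_power_of_two and warn_if_npot names are made up for this sketch:

    #include <cstdio> 
    #include <SDL/SDL.h> 

    static bool is_power_of_two(int n) { 
      return n > 0 && (n & (n - 1)) == 0; 
    } 

    static void warn_if_npot(const SDL_Surface * surf) { 
      // Old GPUs/drivers without non-power-of-two texture support may reject or 
      // mishandle such textures, so make the case visible while debugging. 
      if (!is_power_of_two(surf->w) || !is_power_of_two(surf->h)) 
        fprintf(stderr, "Texture is not power-of-two: %dx%d\n", surf->w, surf->h); 
    } 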

Answers