2013-05-29

So, I'm writing a raytracer, and I have a basic class representing the viewport. It looks like the following. What is happening to my doubles?

camera.hpp:

#define real double 

class Camera 
{ 
    // self explanatory 
    Vector3 origin, destination, up; 

    // camera's coordinate system 
    // N = normal to projection plane ("z-axis") 
    // U, V = x- and y-axes of projection plane 
    Vector3 N, U, V; 

    // Directional increment vectors for screen-space X and Y 
    Vector3 xInc, yInc; 

    // vFOV derived from hFOV 
    real horizontalFOV, verticalFOV; 

    // width and height of projection buffer in pixels 
    real width, height; 

public: 
    // Constructs a camera 
    // pO = point of origin (i.e., where rays are emitted) 
    // pLA = point looked at, (i.e., pLA - pO = direction looked at) 
    // vUp = Up vector relative to camera's position 
    // hFOV = Horizontal field-of-view in degrees. Vertical is calculated based on 
    //  aspect ratio of viewport. 
    // vW = width of viewport in pixels 
    // vH = height of viewport in pixels 
    Camera(Vector3 pO, Vector3 pLA, Vector3 vUp, 
      real hFOV, real vW,  real vH); 

    // Constructs a ray from pO looking in the direction of the 
    // specified pixel on the viewport. Note that floating values 
    // can be given, in case we want to do supersampling. 
    Ray RayForPixel(const real x, const real y); 

    // Prints this camera's information to stdout. 
    void dump(); 
}; 

camera.cpp:

#include "raytracer.hpp" 

Camera::Camera(Vector3 pO, Vector3 pLA, Vector3 vUp, real hFOV, real vW, real vH) : 
    origin(pO), destination(pLA), up(vUp), horizontalFOV(hFOV), width(vW), height(vH) 
{ 
    // Non-square aspect ratios are not yet supported! 
    assert(width == height); 

    N = (pLA - pO); 
    normalize(N); 

    U = cross(N, vUp); 
    normalize(U); 

    V = cross(U, N); 
    normalize(V); 

    real aspectRatio = (width/height); 
    verticalFOV  = (horizontalFOV/aspectRatio); 

    // TODO: verify this. 
    real hFov2 = horizontalFOV/2; 
    real vFov2 = verticalFOV/2; 

    // TODO: implement non-square aspect ratios 
    xInc = -U * ((2.0 * tan(DEG2RAD(hFov2)))/width); 
    yInc = -V * ((2.0 * tan(DEG2RAD(vFov2)))/height); 
} 

Ray Camera::RayForPixel(const real x, const real y) 
{ 
    Vector3 direction = N + 
     (yInc * (0.5 * ((2.0 * y) - 1.0 - height))) + 
     (xInc * (0.5 * ((2.0 * x) - 1.0 - width))); 

    printf("<%f, %f, %f> ", xInc.x, xInc.y, xInc.z); 
    normalize(direction); 

    return Ray(origin, direction); 
} 

void Camera::dump() 
{ 
    Vector3 rfp00 = RayForPixel((real)0, (real)0).direction; 
    Vector3 rfp01 = RayForPixel((real)0, height - 1).direction; 
    Vector3 rfp10 = RayForPixel(width - 1, (real)0).direction; 
    Vector3 rfp11 = RayForPixel(width - 1, height - 1).direction; 

    Vector3 rfp50 = RayForPixel((width/2) - 1, (real)0).direction; 
    Vector3 rfp05 = RayForPixel((real)0, (height/2) - 1).direction; 
    Vector3 rfp55 = RayForPixel((width/2) - 1, (height/2) - 1).direction; 

    printf(
    "Camera at <%+7.5f, %+7.5f, %+7.5f>\n" 
     " hFOV = %+7.5f, vFOV = %+7.5f\n" 
     " xInc = <%+7.5f, %+7.5f, %+7.5f>\n" 
     " yInc = <%+7.5f, %+7.5f, %+7.5f>\n" 
     " Coordinate system:\n" 
     " N: <%+7.5f, %+7.5f, %+7.5f>\n" 
     " U: <%+7.5f, %+7.5f, %+7.5f>\n" 
     " V: <%+7.5f, %+7.5f, %+7.5f>\n" 
     " Rays for screen-space extents:\n" 
     " < 0, 0> -> <%+7.5f, %+7.5f, %+7.5f>\n" 
     " < 0, 1> -> <%+7.5f, %+7.5f, %+7.5f>\n" 
     " < 1, 0> -> <%+7.5f, %+7.5f, %+7.5f>\n" 
     " < 1, 1> -> <%+7.5f, %+7.5f, %+7.5f>\n" 
     " <.5, 0> -> <%+7.5f, %+7.5f, %+7.5f>\n" 
     " < 0, .5> -> <%+7.5f, %+7.5f, %+7.5f>\n" 
     " <.5, .5> -> <%+7.5f, %+7.5f, %+7.5f>\n", 
     origin.x, origin.y, origin.z, 
     horizontalFOV, verticalFOV, 
     xInc.x, xInc.y, xInc.z, 
     yInc.x, yInc.y, yInc.z, 
     N.x, N.y, N.z, 
     U.x, U.y, U.z, 
     V.x, V.y, V.z, 
     rfp00.x, rfp00.y, rfp00.z, 
     rfp01.x, rfp01.y, rfp01.z, 
     rfp10.x, rfp10.y, rfp10.z, 
     rfp11.x, rfp11.y, rfp11.z, 
     rfp50.x, rfp50.y, rfp50.z, 
     rfp05.x, rfp05.y, rfp05.z, 
     rfp55.x, rfp55.y, rfp55.z); 
} 

I create it like this, and call dump():

Camera camera = Camera(
     Vector3(0, 0, -5), // Look from <0, 0, -5> 
     Vector3(0, 0, 0), // Look at world-space origin 
     Vector3(0, 1, 0), // Camera is XZ axis-aligned. 
     70,    // 70 degree horizontal FoV 
     OUT_W, OUT_H); 

raytracer.SetCamera(&camera); 

// later, in raytracer::render: 
camera->dump(); 

I get this output:

Camera at <+0.00292, +0.99996, +0.00292> 
    hFOV = +0.02527, vFOV = -0.00891 
    xInc = <+0.01170, -0.00878, +0.00292> 
    yInc = <+0.99996, +0.00292, +0.99964> 
    Coordinate system: 
    N: <+0.99996, +0.00292, -0.00875> 
    U: <+0.00292, +0.99996, +0.00292> 
    V: <+0.99996, +0.00292, +0.99989> 
    Rays for screen-space extents: 
    < 0, 0> -> <-0.00292, +0.00292, +0.99999> 
    < 0, 1> -> <-0.00292, +0.00875, +0.99996> 
    < 1, 0> -> <-0.00875, +0.00292, +0.99996> 
    < 1, 1> -> <-0.00875, +0.00875, +0.99992> 
    <.5, 0> -> <-0.00875, +0.00292, +0.99996> 
    < 0, .5> -> <+0.99989, +0.01170, -0.00878> 
    <.5, .5> -> <+0.99964, +0.02527, -0.00891> 

I have absolutely no idea why this is happening, and it has been my main source of WTF for the past 3 days.

EDIT: I should have actually explained my problem, sorry D: (I was a bit stressed out over this and didn't stop to think that not everyone knows this code inside and out :P). In my code I set hFOV = vFOV, but the output doesn't correspond to that. The rest of my double arithmetic is going wrong as well.

Michael Dorr pointed out that the camera is probably a stack var going out of scope, and that was it!

+0

What does this code look like: raytracer.SetCamera(&camera);? Also, I'd like to see what happens to the camera between setting it and the dump. –

+3

So we need to figure out what it is you don't like about that output? –

+0

'void SetCamera(Camera * cam){ camera = cam; }' –

Answer

6

The camera is (probably) a stack var, and you're storing a pointer to it. Does it go out of scope? :)