
I want to create an augmented reality view on the iPhone. As a starting point, I looked at Apple's pARk demo project. There, however, the deviceMotion property is used to obtain the rotation matrix for the camera transform. Since deviceMotion relies on the gyroscope (available on the iPhone 4 and newer) and I also want to support the 3GS (in fact, the 3GS is my only development device), I can't use that approach. Instead, I want to build the rotation matrix myself from the data provided by the accelerometer and the compass. How can I create a CMRotationMatrix on a device without a gyroscope?
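
(For what it's worth, whether deviceMotion is available can be checked at runtime, so the manual accelerometer/compass path only needs to run on gyro-less hardware. A minimal sketch, assuming a motionManager ivar of type CMMotionManager and a locationManager of type CLLocationManager; the ivar names are illustrative, not taken from the pARk sample:)

// Use the fused attitude where a gyroscope is present, otherwise fall back
// to raw accelerometer samples plus compass headings (ivar names illustrative).
if (motionManager.isDeviceMotionAvailable) {
    // iPhone 4 and later: gyroscope-backed attitude, as in the pARk demo
    [motionManager startDeviceMotionUpdates];
} else {
    // iPhone 3GS: accelerometer plus CLLocationManager heading only
    [motionManager startAccelerometerUpdates];
    locationManager.headingFilter = kCLHeadingFilterNone;
    [locationManager startUpdatingHeading];
}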

Unfortunately, I lack the math skills to do this myself. Searching around, this seemed to me the most relevant hands-on guide for my problem, but after implementing it, it doesn't seem to fit my situation (the POI views only appear momentarily, and seemingly in response to device movement rather than its heading). I've posted my onDisplayLink method below (the only method with significant changes). I've tried to read up on the relevant math, but at this point I simply don't know how to find an approach, or the error in my code. Can anyone help?

Edit: I've realized that the sensor data is better stored in doubles than in ints, and I've added a bit of smoothing. Now I can see more clearly how POIs that should appear from the side as the device rotates instead come in from above. Maybe this helps to pin down what's wrong.

CMAccelerometerData* orientation = motionManager.accelerometerData; 
CMAcceleration acceleration = orientation.acceleration; 

vec4f_t normalizedAccelerometer; 
vec4f_t normalizedMagnetometer; 

// Low-pass filter the raw accelerometer samples to estimate gravity 
xG = (acceleration.x * kFilteringFactor) + (xG * (1.0 - kFilteringFactor)); 
yG = (acceleration.y * kFilteringFactor) + (yG * (1.0 - kFilteringFactor)); 
zG = (acceleration.z * kFilteringFactor) + (zG * (1.0 - kFilteringFactor)); 

// Low-pass filter the raw magnetometer values from the compass heading 
xB = (heading.x * kFilteringFactor) + (xB * (1.0 - kFilteringFactor)); 
yB = (heading.y * kFilteringFactor) + (yB * (1.0 - kFilteringFactor)); 
zB = (heading.z * kFilteringFactor) + (zB * (1.0 - kFilteringFactor)); 

double accelerometerMagnitude = sqrt(pow(xG, 2) + pow(yG, 2) + pow(zG, 2)); 
double magnetometerMagnitude = sqrt(pow(xB, 2) + pow(yB, 2) + pow(zB, 2)); 

normalizedAccelerometer[0] = xG/accelerometerMagnitude; 
normalizedAccelerometer[1] = yG/accelerometerMagnitude; 
normalizedAccelerometer[2] = zG/accelerometerMagnitude; 
normalizedAccelerometer[3] = 1.0f; 

normalizedMagnetometer[0] = xB/magnetometerMagnitude; 
normalizedMagnetometer[1] = yB/magnetometerMagnitude; 
normalizedMagnetometer[2] = zB/magnetometerMagnitude; 
normalizedMagnetometer[3] = 1.0f; 

// East direction, derived from the gravity and magnetic field vectors 
vec4f_t eastDirection; 

eastDirection[0] = normalizedAccelerometer[1] * normalizedMagnetometer[2] - normalizedAccelerometer[2] * normalizedMagnetometer[1]; 
eastDirection[1] = normalizedAccelerometer[0] * normalizedMagnetometer[2] - normalizedAccelerometer[2] * normalizedMagnetometer[0]; 
eastDirection[2] = normalizedAccelerometer[0] * normalizedMagnetometer[1] - normalizedAccelerometer[1] * normalizedMagnetometer[0]; 
eastDirection[3] = 1.0f; 

double eastDirectionMagnitude = sqrt(pow(eastDirection[0], 2) + pow(eastDirection[1], 2) + pow(eastDirection[2], 2)); 

vec4f_t normalizedEastDirection; 

normalizedEastDirection[0] = eastDirection[0]/eastDirectionMagnitude; 
normalizedEastDirection[1] = eastDirection[1]/eastDirectionMagnitude; 
normalizedEastDirection[2] = eastDirection[2]/eastDirectionMagnitude; 
normalizedEastDirection[3] = 1.0f; 

// North direction: the magnetic field with its component along gravity removed 
vec4f_t northDirection; 

northDirection[0] = (pow(normalizedAccelerometer[0], 2) + pow(normalizedAccelerometer[1],2) + pow(normalizedAccelerometer[2],2)) * xB - (normalizedAccelerometer[0] * xB + normalizedAccelerometer[1] * yB + normalizedAccelerometer[2] * zB)*normalizedAccelerometer[0]; 
northDirection[1] = (pow(normalizedAccelerometer[0], 2) + pow(normalizedAccelerometer[1],2) + pow(normalizedAccelerometer[2],2)) * yB - (normalizedAccelerometer[0] * xB + normalizedAccelerometer[1] * yB + normalizedAccelerometer[2] * zB)*normalizedAccelerometer[1]; 
northDirection[2] = (pow(normalizedAccelerometer[0], 2) + pow(normalizedAccelerometer[1],2) + pow(normalizedAccelerometer[2],2)) * zB - (normalizedAccelerometer[0] * xB + normalizedAccelerometer[1] * yB + normalizedAccelerometer[2] * zB)*normalizedAccelerometer[2]; 
northDirection[3] = 1.0f; 

double northDirectionMagnitude; 

northDirectionMagnitude = sqrt(pow(northDirection[0], 2) + pow(northDirection[1], 2) + pow(northDirection[2], 2)); 

vec4f_t normalizedNorthDirection; 

normalizedNorthDirection[0] = northDirection[0]/northDirectionMagnitude; 
normalizedNorthDirection[1] = northDirection[1]/northDirectionMagnitude; 
normalizedNorthDirection[2] = northDirection[2]/northDirectionMagnitude; 
normalizedNorthDirection[3] = 1.0f; 

// Build the rotation matrix from the east, north and gravity vectors (one per column) 
CMRotationMatrix r; 
r.m11 = normalizedEastDirection[0]; 
r.m21 = normalizedEastDirection[1]; 
r.m31 = normalizedEastDirection[2]; 
r.m12 = normalizedNorthDirection[0]; 
r.m22 = normalizedNorthDirection[1]; 
r.m32 = normalizedNorthDirection[2]; 
r.m13 = normalizedAccelerometer[0]; 
r.m23 = normalizedAccelerometer[1]; 
r.m33 = normalizedAccelerometer[2]; 

transformFromCMRotationMatrix(cameraTransform, &r); 

[self setNeedsDisplay]; 
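
(For reference, cameraTransform is filled by the transformFromCMRotationMatrix helper taken from the pARk sample; roughly, it just embeds the 3x3 rotation in a 4x4 column-major matrix, something like the sketch below, assuming typedef float mat4f_t[16]:)

// Sketch: copy the 3x3 CMRotationMatrix into a 4x4 column-major transform. 
// Assumes typedef float mat4f_t[16]; adjust if your matrix type differs. 
void transformFromCMRotationMatrix(mat4f_t mout, const CMRotationMatrix *m) 
{ 
    mout[0]  = (float)m->m11; mout[1]  = (float)m->m21; mout[2]  = (float)m->m31; mout[3]  = 0.0f; 
    mout[4]  = (float)m->m12; mout[5]  = (float)m->m22; mout[6]  = (float)m->m32; mout[7]  = 0.0f; 
    mout[8]  = (float)m->m13; mout[9]  = (float)m->m23; mout[10] = (float)m->m33; mout[11] = 0.0f; 
    mout[12] = 0.0f;          mout[13] = 0.0f;          mout[14] = 0.0f;          mout[15] = 1.0f; 
}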

With the device lying flat on the table and pointing roughly north (according to Compass.app), I logged this data:

Accelerometer: x: -0.016692, y: 0.060852, z: -0.998007 
Magnetometer: x: -0.016099, y: 0.256711, z: -0.966354 
North Direction x: 0.011472, y: 8.561041, z:0.521807 
Normalized North Direction x: 0.001338, y: 0.998147, z:0.060838 
East Direction x: 0.197395, y: 0.000063, z:-0.003305 
Normalized East Direction x: 0.999860, y: 0.000319, z:-0.016742 

Does that look sane?

Edit 2: I've updated the assignment of r, which apparently gets me halfway there: with the device held upright, I can now see the landmarks near the horizon; however, they are about 90 degrees clockwise from where I'd expect them. In addition, here is the output after the movement Beta suggested:

Accelerometer: x: 0.074289, y: -0.997192, z: -0.009475 
Magnetometer: x: 0.031341, y: -0.986382, z: -0.161458 
North Direction x: -1.428996, y: -0.057306, z:-5.172881 
Normalized North Direction x: -0.266259, y: -0.010678, z:-0.963842 
East Direction x: 0.151658, y: -0.011698, z:-0.042025 
Normalized East Direction x: 0.961034, y: -0.074126, z:-0.266305 

I have no way to test your code, but I see some opportunities for error. Can you verify that 'eastDirection' and 'northDirection' behave as you intend? – Beta 2012-04-17 23:52:05


I'm not sure, so I've added some logged data in the edit above. Does that help? – mss 2012-04-18 10:52:46


Looks good; north is +Y and east is +X. You should verify that when you lift the north edge, rotating the device 90 degrees about its south edge, east is still +X and north is -Z. Now, what do you want to do with the rotation matrix, and how will you know when it's working? – Beta 2012-04-18 17:00:09

Answer


After getting hold of an iPhone 4, I was able to compare the data generated by the code above with the attitude output from CoreMotion. With that, I found that I should assign the values to my rotation matrix as follows:

CMRotationMatrix r; 
r.m11 = normalizedNorthDirection[0]; 
r.m21 = normalizedNorthDirection[1]; 
r.m31 = normalizedNorthDirection[2]; 
r.m12 = 0 - normalizedEastDirection[0]; 
r.m22 = normalizedEastDirection[1]; 
r.m32 = 0 - normalizedEastDirection[2]; 
r.m13 = 0 - normalizedAccelerometer[0]; 
r.m23 = 0 - normalizedAccelerometer[1]; 
r.m33 = 0 - normalizedAccelerometer[2]; 

This gives roughly similar values; of course, the data produced by CoreMotion using the gyroscope is still much better. In any case, it's a starting point for supporting the 3GS reasonably well. Perhaps some additional quality could be gained with some kind of filtering, but I haven't yet decided whether that's worth the effort.
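
(For anyone who wants to redo the comparison on a gyro-equipped device, the reference attitude can be logged next to the hand-built matrix, roughly as in the sketch below; it needs iOS 5 for the magnetic-north reference frame, and the deviceMotion sample is read later, e.g. in the display-link callback:)

// Sketch: log CoreMotion's fused rotation matrix alongside the manual one. 
// Requires a gyroscope and iOS 5 for the magnetic-north reference frame. 
[motionManager startDeviceMotionUpdatesUsingReferenceFrame: 
    CMAttitudeReferenceFrameXMagneticNorthZVertical]; 

// Later, e.g. in onDisplayLink, once deviceMotion is delivering samples: 
CMRotationMatrix ref = motionManager.deviceMotion.attitude.rotationMatrix; 
NSLog(@"CoreMotion row 1: %f %f %f  manual row 1: %f %f %f", 
      ref.m11, ref.m12, ref.m13, r.m11, r.m12, r.m13);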