I am currently running AVDepthPhotoFilter, i.e. rendering depth data from the iPhone 7 Plus's stereo camera.
Depth data - getting per-pixel depth data (CVPixelBuffer data analysis)
So I want to access the per-pixel depth data, but I don't know how to do that. Please advise.
How to get DepthData and analyze the CVPixelBuffer data
You need to make sure your AVCapturePhotoSettings() has isDepthDataDeliveryEnabled = true.
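As a minimal sketch of that setting, assuming an AVCapturePhotoOutput called `photoOutput` that is already attached to a running AVCaptureSession (the variable name is illustrative, not from the original answer):

```swift
import AVFoundation

// Sketch: request depth delivery for a capture. `photoOutput` and the
// delegate (`self`) are assumed to exist in your capture setup.
let settings = AVCapturePhotoSettings()
if photoOutput.isDepthDataDeliverySupported {
    // Only valid when the output supports depth (e.g. the dual camera)
    settings.isDepthDataDeliveryEnabled = true
}
photoOutput.capturePhoto(with: settings, delegate: self)
```

Note that isDepthDataDeliverySupported is false unless the session is configured with a depth-capable device, so the flag must be set per capture after that configuration.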
You have to implement the delegate function func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?):
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    // ## Convert Disparity to Depth ##
    guard let depthData = photo.depthData?.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32) else { return }
    let depthDataMap = depthData.depthDataMap // AVDepthData -> CVPixelBuffer

    // ## Data Analysis ##
    // Useful data
    let width = CVPixelBufferGetWidth(depthDataMap)   // 768 on an iPhone 7+
    let height = CVPixelBufferGetHeight(depthDataMap) // 576 on an iPhone 7+

    CVPixelBufferLockBaseAddress(depthDataMap, CVPixelBufferLockFlags(rawValue: 0))

    // Convert the base address to a pointer of the appropriate type
    let floatBuffer = unsafeBitCast(CVPixelBufferGetBaseAddress(depthDataMap), to: UnsafeMutablePointer<Float32>.self)

    // Read the Float32 depth value at pixel (x, y); the map is laid out
    // row by row, so valid indices run from 0 to width * height - 1
    let distanceAtXYPoint = floatBuffer[y * width + x]

    CVPixelBufferUnlockBaseAddress(depthDataMap, CVPixelBufferLockFlags(rawValue: 0))
}
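The index arithmetic is the part that is easiest to get wrong: the pixel at (x, y) lives at offset y * width + x, not x * y. A minimal pure-Swift sketch (no AVFoundation, using a plain array to stand in for the depth map) makes the layout visible:

```swift
// Sketch: row-major indexing into a flat Float32 buffer, the same
// layout the depth map uses in memory. Sizes here are made up.
let width = 4, height = 3
var buffer = [Float32](repeating: 0, count: width * height)

// Fill each pixel with a distinct value so the index math is visible.
for y in 0..<height {
    for x in 0..<width {
        buffer[y * width + x] = Float32(y * 10 + x)
    }
}

// Read back pixel (x: 2, y: 1): row 1, column 2.
let x = 2, y = 1
let depth = buffer[y * width + x]
print(depth) // 12.0
```

One caveat for the real CVPixelBuffer: rows can be padded, so for robust code compute the row stride from CVPixelBufferGetBytesPerRow rather than assuming width * MemoryLayout<Float32>.size.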
If you want to learn more about CVPixelBuffer analysis, here is a useful post -> details
Helped me a lot. – Eyzuky