I am trying to crop the image in a CMSampleBufferRef to a specific size. I am taking five steps:

1. Get the PixelBuffer from the SampleBuffer
2. Convert the PixelBuffer to a CIImage
3. Crop the CIImage
4. Render the CIImage back into a new PixelBuffer
5. Attach the PixelBuffer to a new SampleBuffer

So far I have a problem with step 4 - rendering the image back into the PixelBuffer (I can't check anything beyond this point): nothing gets rendered into the buffer. I verified this by calling the same CIImage imageWithCVPixelBuffer: on the rendered buffer and getting NULL in return. Would appreciate any tips or help.

Crop CMSampleBufferRef
CGRect cropRect = CGRectMake(0, 0, 640, 480);
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:(CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer)]; //options: [NSDictionary dictionaryWithObjectsAndKeys:[NSNull null], kCIImageColorSpace, nil]];
ciImage = [ciImage imageByCroppingToRect:cropRect];
CVPixelBufferRef pixelBuffer;
CVPixelBufferCreate(kCFAllocatorSystemDefault, 640, 480, kCVPixelFormatType_32BGRA, NULL, &pixelBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
CIContext * ciContext = [CIContext contextWithOptions: nil];
[ciContext render:ciImage toCVPixelBuffer:pixelBuffer];
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
CMSampleTimingInfo sampleTime = {
.duration = CMSampleBufferGetDuration(sampleBuffer),
.presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer),
.decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer)
};
CMVideoFormatDescriptionRef videoInfo = NULL;
CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &videoInfo);
CMSampleBufferRef oBuf;
CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, true, NULL, NULL, videoInfo, &sampleTime, &oBuf);
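A frequent cause of this symptom (render:toCVPixelBuffer: appears to do nothing, and imageWithCVPixelBuffer: on the result returns nil) is creating the destination buffer with CVPixelBufferCreate and a NULL attributes dictionary. A minimal sketch of the usual fix, assuming a BGRA capture format - pass pixel buffer attributes, including IOSurface backing, and check the return code:

```objc
// Sketch: create the destination pixel buffer with an attributes
// dictionary so Core Video can back it with an IOSurface. Verify the
// format constant against your actual capture settings.
NSDictionary *attrs = @{
    (id)kCVPixelBufferIOSurfacePropertiesKey : @{},
    (id)kCVPixelBufferCGImageCompatibilityKey : @YES,
    (id)kCVPixelBufferCGBitmapContextCompatibilityKey : @YES
};
CVPixelBufferRef pixelBuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                      640, 480,
                                      kCVPixelFormatType_32BGRA,
                                      (__bridge CFDictionaryRef)attrs,
                                      &pixelBuffer);
if (status != kCVReturnSuccess || pixelBuffer == NULL) {
    // Handle the error instead of rendering into a NULL buffer.
    return;
}
```

It is also worth checking that cropRect lies inside the image extent: imageByCroppingToRect: preserves the cropped image's original origin, so if the crop origin is not (0, 0) you may need imageByApplyingTransform: to translate the result back to the origin before rendering.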
What is your question here? I don't see a problem. – Aurelius
ciContext doesn't render anything into the buffer. I checked it with the same CIImage imageWithCVPixelBuffer: and got NULL. – Laz
Hi. I'm also stuck at a similar point. I'm capturing video with AVFoundation, and I need to implement zoom in the AVCapture delegate didOutputSampleBuffer. Please tell me how to scale and crop a CMSampleBufferRef/CVImageBufferRef. –