Capturing video in iOS using MonoTouch

I have code that creates, configures, and starts a video capture session at runtime in Objective-C, and it works without any problems. I ported the sample to C# and MonoTouch 4.0.3 and ran into a few issues. Here is the code:
void Initialize ()
{
    // Create notifier delegate class
    captureVideoDelegate = new CaptureVideoDelegate (this);

    // Create capture session
    captureSession = new AVCaptureSession ();
    captureSession.SessionPreset = AVCaptureSession.Preset640x480;

    // Create capture device
    captureDevice = AVCaptureDevice.DefaultDeviceWithMediaType (AVMediaType.Video);

    // Create capture device input
    NSError error;
    captureDeviceInput = new AVCaptureDeviceInput (captureDevice, out error);
    captureSession.AddInput (captureDeviceInput);

    // Create capture device output
    captureVideoOutput = new AVCaptureVideoDataOutput ();
    captureSession.AddOutput (captureVideoOutput);
    captureVideoOutput.VideoSettings.PixelFormat = CVPixelFormatType.CV32BGRA;
    captureVideoOutput.MinFrameDuration = new CMTime (1, 30);

    //
    // ISSUE 1
    // In the original Objective-C code I was creating a dispatch_queue_t object, passing it to
    // the setSampleBufferDelegate:queue: message, and it worked. Here I could not find an
    // equivalent to the queue mechanism (also not sure if the delegate should be used like this).
    //
    captureVideoOutput.SetSampleBufferDelegatequeue (captureVideoDelegate, ???????);

    // Create preview layer
    previewLayer = AVCaptureVideoPreviewLayer.FromSession (captureSession);
    previewLayer.Orientation = AVCaptureVideoOrientation.LandscapeRight;

    //
    // ISSUE 2:
    // Didn't find any VideoGravity-related enumeration in MonoTouch (not sure if a string will work)
    //
    previewLayer.VideoGravity = "AVLayerVideoGravityResizeAspectFill";
    previewLayer.Frame = new RectangleF (0, 0, 1024, 768);
    this.View.Layer.AddSublayer (previewLayer);

    // Start capture session
    captureSession.StartRunning ();
}
#endregion
public class CaptureVideoDelegate : AVCaptureVideoDataOutputSampleBufferDelegate
{
    private VirtualDeckViewController mainViewController;

    public CaptureVideoDelegate (VirtualDeckViewController viewController)
    {
        mainViewController = viewController;
    }

    public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
    {
        // TODO: Implement - see: http://go-mono.com/docs/index.aspx?link=T%3aMonoTouch.Foundation.ModelAttribute
    }
}
Issue 1: I don't know how to use the delegate correctly with the SetSampleBufferDelegatequeue method, and I haven't found an equivalent of the dispatch_queue_t object, which works fine in Objective-C, to pass as the second parameter.
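For reference, MonoTouch wraps Grand Central Dispatch queues in `MonoTouch.CoreFoundation.DispatchQueue`, so the second argument can be supplied like the sketch below. The queue label `"sampleBufferQueue"` is my own choice, and depending on the MonoTouch version the method may be spelled `SetSampleBufferDelegateQueue` or `SetSampleBufferDelegateAndQueue`:

```csharp
using MonoTouch.CoreFoundation;

// Create a serial GCD queue; the label is arbitrary and only used for debugging.
var sampleBufferQueue = new DispatchQueue ("sampleBufferQueue");

// Equivalent of -setSampleBufferDelegate:queue: in Objective-C.
captureVideoOutput.SetSampleBufferDelegateQueue (captureVideoDelegate, sampleBufferQueue);
```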
Issue 2: I couldn't find any VideoGravity-related enumeration in the MonoTouch libraries, and I don't know whether passing a string with the constant's value will work.
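Since the MonoTouch binding exposes `VideoGravity` as a plain string property rather than an enumeration, passing the AVFoundation constant's literal value should work; a minimal sketch:

```csharp
// The property is a plain string in this MonoTouch binding, so the value of the
// native AVLayerVideoGravityResizeAspectFill constant can be passed literally.
previewLayer.VideoGravity = "AVLayerVideoGravityResizeAspectFill";
```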
I have searched for any clue to solve this but found no clear samples. Any example or information on how to do the same in MonoTouch would be greatly appreciated.

Thanks a lot.
The StartLiveDecoding function called at the end of Init doesn't do much work; it simply calls // start video capture captureSession.StartRunning(); – 2011-05-10 17:22:30
Thanks, that means MonoTouch supports a solution. The answer to issue #2 is in there, but I still don't know how your dispatchQueue is created. I guess avBufferDelegate is an instance of a class derived from the delegate class. The remaining question concerns the dispatchQueue. Thanks a lot Pavel, the buffer conversion is not a problem. – 2011-05-10 21:53:37
I managed to get the capture session working; one of the problems was that I was trying to use SetSampleBufferDelegatequeue instead of SetSampleBufferDelegateAndQueue (not sure what the difference is). But now I'm running into a problem where the image in the preview freezes after a few frames; yet if I put a breakpoint in DidOutputSampleBuffer, the image in the preview layer keeps displaying fine while execution is stopped at that breakpoint. I guess it must be the way I create the dispatch queue. Any clue on how to set up the dispatch queue correctly? Thanks for any help. – 2011-05-11 13:16:01
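Regarding the freeze after a few frames: AVFoundation recycles a small pool of CMSampleBuffers, and if the managed wrappers keep them alive until the garbage collector runs, the pool starves and the preview stalls. A common remedy (an assumption on my part, not confirmed in this thread) is to dispose the buffer explicitly as soon as the frame has been handled:

```csharp
public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
{
    try {
        // ... process the frame here ...
    } finally {
        // Return the buffer to AVFoundation's pool immediately; waiting for
        // the GC to release it exhausts the pool and freezes the preview.
        sampleBuffer.Dispose ();
    }
}
```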