I want to record video and grab frames at the same time with my code. Is it possible to use AVCaptureVideoDataOutput and AVCaptureMovieFileOutput at the same time?
I'm using AVCaptureVideoDataOutput
to grab frames and AVCaptureMovieFileOutput
to record video. Each works on its own, but when both are attached at the same time, recording fails with error code -12780.
I've searched for this problem but found no answer. Has anyone had the same experience, or can anyone explain it? It's been bothering me for a while.
Thanks.
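[Editor's note: for reference, a minimal sketch of the conflicting setup described above, written in the same Swift 3-era style as the answers below and assuming nothing beyond a fresh session. AVCaptureSession's canAddOutput(_:) makes the incompatibility visible up front; on iOS the second check is expected to fail, which matches the -12780 failure at record time:]
import AVFoundation

// hypothetical session setup reproducing the question's scenario
let session = AVCaptureSession()
if let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
   let input = try? AVCaptureDeviceInput(device: camera) {
    session.addInput(input)
}

let frameOutput = AVCaptureVideoDataOutput()   // for grabbing frames
let movieOutput = AVCaptureMovieFileOutput()   // for recording to a file

if session.canAddOutput(frameOutput) { session.addOutput(frameOutput) }

// once the data output is attached, the movie file output is rejected:
// the two outputs cannot coexist on one iOS capture session
if session.canAddOutput(movieOutput) {
    session.addOutput(movieOutput)
} else {
    print("AVCaptureMovieFileOutput cannot be added alongside AVCaptureVideoDataOutput")
}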
I can't answer the specific question asked, but I have successfully recorded video and grabbed frames at the same time, using:
AVCaptureSession
and AVCaptureVideoDataOutput
to route frames into my own code, and AVAssetWriter
, AVAssetWriterInput
and AVAssetWriterInputPixelBufferAdaptor
to write the frames out to an H.264-encoded movie file. I haven't yet investigated audio. I end up receiving CMSampleBuffers
from the capture session and then pushing them into the pixel buffer adaptor.
EDIT: so my code looks something like this, skimming over the bits you're having no problems with and ignoring issues of scope:
/* to ensure I'm given incoming CMSampleBuffers */
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
// set your preferred preset, etc. here

AVCaptureDevice *captureDevice =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *inputError = nil;
AVCaptureDeviceInput *deviceInput =
    [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&inputError];
[captureSession addInput:deviceInput];

// output 32BGRA pixel format, with me as the delegate on a suitable dispatch queue
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
output.videoSettings = [NSDictionary dictionaryWithObject:
        [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
    forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[output setSampleBufferDelegate:self
                          queue:dispatch_queue_create("frameQueue", NULL)];
[captureSession addOutput:output];
/* to prepare for output; I'll output 640x480 in H.264, via an asset writer */
NSDictionary *outputSettings =
[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:640], AVVideoWidthKey,
[NSNumber numberWithInt:480], AVVideoHeightKey,
AVVideoCodecH264, AVVideoCodecKey,
nil];
AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:outputSettings];
/* I'm going to push pixel buffers to it, so will need an
AVAssetWriterInputPixelBufferAdaptor, expecting the same 32BGRA input as I've
asked the AVCaptureVideoDataOutput to supply */
AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor =
[[AVAssetWriterInputPixelBufferAdaptor alloc]
initWithAssetWriterInput:assetWriterInput
sourcePixelBufferAttributes:
[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
kCVPixelBufferPixelFormatTypeKey,
nil]];
/* that's going to go somewhere, I imagine you've got the URL for that sorted,
so create a suitable asset writer; we'll put our H.264 within the normal
MPEG4 container */
NSError *writerError = nil;
AVAssetWriter *assetWriter = [[AVAssetWriter alloc]
    initWithURL:URLFromSomwhere  // placeholder for your output URL
       fileType:AVFileTypeMPEG4
          error:&writerError];
// you should check writerError here; this example is too lazy
[assetWriter addInput:assetWriterInput];
/* we need to warn the input to expect real time data incoming, so that it tries
to avoid being unavailable at inopportune moments */
assetWriterInput.expectsMediaDataInRealTime = YES;
... eventually ...
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
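/* a zero source time means the presentation times stamped on each
   pixel buffer below are taken as offsets from the start of the movie */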
[captureSession startRunning];
... elsewhere ...
- (void) captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection
{
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
// a very dense way to keep track of the time at which this frame
// occurs relative to the output stream, but it's just an example!
static int64_t frameNumber = 0;
if(assetWriterInput.readyForMoreMediaData)
{
    [pixelBufferAdaptor appendPixelBuffer:imageBuffer
                     withPresentationTime:CMTimeMake(frameNumber, 25)];
}
frameNumber++;
}
... and, to stop, ensuring the output file is finished properly ...
[captureSession stopRunning];
[assetWriter finishWriting];
Would you mind posting some sample code showing how to do this? Your real-life karma will increase tenfold! :D – SpaceDog 2011-02-13 17:58:24
Here is the Swift version of Tommy's answer.
// Set up the Capture Session
// Add the Inputs
// Add the Outputs

let outputSettings: [String : Any] = [
    AVVideoWidthKey  : Int(640),
    AVVideoHeightKey : Int(480),
    AVVideoCodecKey  : AVVideoCodecH264
]

let assetWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo,
                                          outputSettings: outputSettings)

let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: assetWriterInput,
    sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String : Int(kCVPixelFormatType_32BGRA)])

// AVAssetWriter's initialiser throws; handle the error properly in real code
let assetWriter = try! AVAssetWriter(outputURL: URLFromSomwhere, fileType: AVFileTypeMPEG4)
assetWriter.addInput(assetWriterInput)
assetWriterInput.expectsMediaDataInRealTime = true

assetWriter.startWriting()
assetWriter.startSession(atSourceTime: kCMTimeZero)
captureSession.startRunning()

// a very dense way to keep track of the time at which this frame
// occurs relative to the output stream, but it's just an example!
// (an instance property, so it survives between delegate callbacks)
var frameNumber: Int64 = 0

func captureOutput(_ captureOutput: AVCaptureOutput,
                   didOutputSampleBuffer sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    if assetWriterInput.isReadyForMoreMediaData {
        pixelBufferAdaptor.append(imageBuffer, withPresentationTime: CMTimeMake(frameNumber, 25))
    }
    frameNumber += 1
}

captureSession.stopRunning()
assetWriter.finishWriting {}
Though I don't guarantee 100% accuracy, as I'm new to Swift.
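[Editor's note: as the comments in both versions say, stamping frames with a hand-maintained counter at an assumed 25 fps is just for illustration. A sketch of a more robust variant, not from either answer, reusing the assetWriter/assetWriterInput/pixelBufferAdaptor above and assuming startWriting() has already been called: take the presentation timestamp the capture session attaches to each CMSampleBuffer, and start the writer's session at the first one.]
var sessionStarted = false

func captureOutput(_ captureOutput: AVCaptureOutput,
                   didOutputSampleBuffer sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    // each sample buffer carries the time at which its frame was captured
    let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

    // start the writer's timeline at the first frame actually received,
    // instead of assuming kCMTimeZero and a fixed frame rate
    if !sessionStarted {
        assetWriter.startSession(atSourceTime: timestamp)
        sessionStarted = true
    }

    if assetWriterInput.isReadyForMoreMediaData {
        pixelBufferAdaptor.append(imageBuffer, withPresentationTime: timestamp)
    }
}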
_"Video can be captured directly to a file with AVCaptureMovieFileOutput. However this class has no displayable data and **cannot** be used simultaneously with AVCaptureVideoDataOutput."_ Found here: [link](https://developer.xamarin.com/api/type/MonoTouch.AVFoundation.AVCaptureSession/) .. just to clarify the actual cause of the issue – Csharpest 2017-03-29 10:03:16