2014-04-16

I am recording video in my iOS app, and sometimes (quite unpredictably) it crashes during recording with EXC_BAD_ACCESS KERN_INVALID_ADDRESS. (Edit: the project uses ARC.)

Thread : Crashed: com.myapp.myapp 
0 libobjc.A.dylib    0x3b1cc622 objc_msgSend + 1 
1 com.myapp.myap     0x00156faf -[Encoder encodeFrame:isVideo:] (Encoder.m:129) 
2 com.myapp.myap     0x001342ab -[CameraController  captureOutput:didOutputSampleBuffer:fromConnection:] (CameraController.m:423) 
3 AVFoundation     0x2f918327 __74-[AVCaptureAudioDataOutput _AVCaptureAudioDataOutput_AudioDataBecameReady]_block_invoke + 282 
4 libdispatch.dylib    0x3b6abd53 _dispatch_call_block_and_release + 10 
5 libdispatch.dylib    0x3b6b0cbd _dispatch_queue_drain + 488 
6 libdispatch.dylib    0x3b6adc6f _dispatch_queue_invoke + 42 
7 libdispatch.dylib    0x3b6b15f1 _dispatch_root_queue_drain + 76 
8 libdispatch.dylib    0x3b6b18dd _dispatch_worker_thread2 + 56 
9 libsystem_pthread.dylib  0x3b7dcc17 _pthread_wqthread + 298 

Here is where I declare my variables:

@interface CameraController () <AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate> 
{ 
AVCaptureSession* _session; 
AVCaptureVideoPreviewLayer* _preview; 
dispatch_queue_t _captureQueue; 
AVCaptureConnection* _audioConnection; 
AVCaptureConnection* _videoConnection; 


Encoder* _encoder; 
BOOL _isRecording; 
BOOL _isPaused; 
BOOL _discont; 
int _currentFile; 
CMTime _timeOffset; 
CMTime _lastVideo; 
CMTime _lastAudio; 

int _cx; 
int _cy; 
int _channels; 
Float64 _samplerate; 
} 
@end 

And here is the context around the [Encoder encodeFrame:isVideo:] call (frame 1 in the trace):

- (void) captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection 
{ 
BOOL bVideo = YES; 

@synchronized(self) 
{ 
    if (!self.isCapturing || self.isPaused) 
    { 
     return; 
    } 
    if (connection != _videoConnection) 
    { 
     bVideo = NO; 
    } 
    if ((_encoder == nil) && !bVideo) 
    { 
     CMFormatDescriptionRef fmt = CMSampleBufferGetFormatDescription(sampleBuffer); 
     [self setAudioFormat:fmt]; 
     NSString* filename = [NSString stringWithFormat:@"capture%d.mp4", _currentFile]; 
     NSString* path = [NSTemporaryDirectory() stringByAppendingPathComponent:filename]; 
     _encoder = [VideoEncoder encoderForPath:path Height:_cy width:_cx channels:_channels samples:_samplerate]; 
    } 
    if (_discont) 
    { 
     if (bVideo) 
     { 
      return; 
     } 
     _discont = NO; 
     // calc adjustment 
     CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer); 
     CMTime last = bVideo ? _lastVideo : _lastAudio; 
     if (last.flags & kCMTimeFlags_Valid) 
     { 
      if (_timeOffset.flags & kCMTimeFlags_Valid) 
      { 
       pts = CMTimeSubtract(pts, _timeOffset); 
      } 
      CMTime offset = CMTimeSubtract(pts, last); 
      NSLog(@"Setting offset from %s", bVideo?"video": "audio"); 
      NSLog(@"Adding %f to %f (pts %f)", ((double)offset.value)/offset.timescale, ((double)_timeOffset.value)/_timeOffset.timescale, ((double)pts.value/pts.timescale)); 

      // this stops us having to set a scale for _timeOffset before we see the first video time 
      if (_timeOffset.value == 0) 
      { 
       _timeOffset = offset; 
      } 
      else 
      { 
       _timeOffset = CMTimeAdd(_timeOffset, offset); 
      } 
     } 
     _lastVideo.flags = 0; 
     _lastAudio.flags = 0; 
    } 

    // retain so that we can release either this or modified one 
    CFRetain(sampleBuffer); 

    if (_timeOffset.value > 0) 
    { 
     CFRelease(sampleBuffer); 
     sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset]; 
    } 

    // record most recent time so we know the length of the pause 
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer); 
    CMTime dur = CMSampleBufferGetDuration(sampleBuffer); 
    if (dur.value > 0) 
    { 
     pts = CMTimeAdd(pts, dur); 
    } 
    if (bVideo) 
    { 
     _lastVideo = pts; 
    } 
    else 
    { 
     _lastAudio = pts; 
    } 
} 

// pass frame to encoder 
[_encoder encodeFrame:sampleBuffer isVideo:bVideo]; //This is line 129 
CFRelease(sampleBuffer); 
} 

For the complete code in use, see: http://www.gdcl.co.uk/2013/02/20/iPhone-Pause.html - I am using this control for video recording. I know it is hard to help with this kind of problem, but where should I start debugging it? Thanks for your help.

Google "how to debug EXC_BAD_ACCESS" and you will find plenty of resources, for example http://www.raywenderlich.com/10209/my-app-crashed-now-what-part-1 –

Is that line 129 of Encoder.m? You may be messaging an object that has already been deallocated. (A common cause of objc_msgSend errors.) –

// pass frame to encoder [_encoder encodeFrame:sampleBuffer isVideo:bVideo]; //This is line 129 CFRelease(sampleBuffer); –

Answer

In your method you have the following...

CFRetain(sampleBuffer); 

if (_timeOffset.value > 0) 
{ 
    CFRelease(sampleBuffer); 
    sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset]; 
} 

Then, at the end, you have another

CFRelease(sampleBuffer); 

In the case where _timeOffset.value is greater than 0, aren't you releasing it twice? Or do you do a retain somewhere else? Should you retain it again inside the if block?

No, I don't retain it anywhere else - that may be what causes the problem. I have added CFRetain(sampleBuffer); at the end of the if block. Thanks for the suggestion –

@martin讓我們知道它是否確實修復了 – Flexicoder

Sure, I will make the change in my code, and it needs to be tested properly. I will let you know within a day. Thanks for your help and attention –
