I've boiled this problem down as far as I can and am hoping for some help: recording and playing back audio with AVAssetWriter.

Basically, the class has two methods: one that starts recording audio (-recordMode) and one that plays it back (-playMode). I currently have a project with a single view controller and two buttons that call the corresponding methods (rec, play). There are no other variables; the class is self-contained.

But it never plays anything back, and I don't understand why. When I try to play the file, I get a file size of 0 and an error, since you can't initialize an AVAudioPlayer with a nil reference. But I don't see why the file is empty or why self.outputPath is nil.

The .h file:

#import <AVFoundation/AVFoundation.h> 

@interface MicCommunicator : NSObject<AVCaptureAudioDataOutputSampleBufferDelegate> 

@property(nonatomic,retain) NSURL *outputPath; 
@property(nonatomic,retain) AVCaptureSession * captureSession; 
@property(nonatomic,retain) AVCaptureAudioDataOutput * output; 

-(void)beginStreaming; 
-(void)playMode; 
-(void)recordMode; 

@end 

The .m file:

@implementation MicCommunicator { 
    AVAssetWriter *assetWriter; 
    AVAssetWriterInput *assetWriterInput; 
} 

@synthesize captureSession = _captureSession; 
@synthesize output = _output; 
@synthesize outputPath = _outputPath; 

-(id)init { 
    if ((self = [super init])) { 
     NSArray *searchPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES); 
     self.outputPath = [NSURL fileURLWithPath:[[searchPaths objectAtIndex:0] stringByAppendingPathComponent:@"micOutput.output"]]; 

     AudioChannelLayout acl; 
     bzero(&acl, sizeof(acl)); 
     acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono; //kAudioChannelLayoutTag_Stereo; 
     NSDictionary *audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys: 
              [NSNumber numberWithInt: kAudioFormatULaw],AVFormatIDKey,   
              [NSNumber numberWithFloat:8000.0],AVSampleRateKey,//was 44100.0 
              [NSData dataWithBytes: &acl length: sizeof(AudioChannelLayout) ], AVChannelLayoutKey, 
              [NSNumber numberWithInt:1],AVNumberOfChannelsKey, 
              [NSNumber numberWithInt:8000],AVEncoderBitRateKey, 
              nil]; 

     assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioOutputSettings]; 
     [assetWriterInput setExpectsMediaDataInRealTime:YES]; 

     assetWriter = [[AVAssetWriter assetWriterWithURL:_outputPath fileType:AVFileTypeWAVE error:nil] retain]; 
     [assetWriter addInput:assetWriterInput]; 
    } 
    return self; 
} 

-(void)dealloc { 
    [assetWriter release]; 
    // release the retained properties as well (manual reference counting) 
    [_outputPath release]; 
    [_captureSession release]; 
    [_output release]; 
    [super dealloc]; 
} 

//convenience methods 

-(void)playMode 
{ 
    [self stopRecording]; 

    NSError *error; 
    AVAudioPlayer * audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:self.outputPath error:&error]; 
    audioPlayer.numberOfLoops = -1; 

    if (audioPlayer == nil){ 
     NSLog(@"error: %@",[error description]);   
    }else{ 
     NSLog(@"playing"); 
     [audioPlayer play]; 
    } 
} 

-(void)recordMode 
{ 
     [self beginStreaming];  
} 

-(void)stopRecording 
{ 
    [self.captureSession stopRunning]; 
    [assetWriterInput markAsFinished]; 
    [assetWriter finishWriting]; 

    NSDictionary *outputFileAttributes = [[NSFileManager defaultManager] attributesOfItemAtPath:[self.outputPath path] error:nil]; // use -path, not %@: formatting the NSURL yields a file:// string that NSFileManager cannot resolve 
    NSLog (@"done. file size is %llu", [outputFileAttributes fileSize]); 
} 

//starts audio recording 
-(void)beginStreaming { 
    self.captureSession = [[[AVCaptureSession alloc] init] autorelease]; // autorelease balances the retain property under manual reference counting 
    AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio]; 
    NSError *error = nil; 
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error]; 
    if (audioInput) 
     [self.captureSession addInput:audioInput]; 
    else { 
     NSLog(@"No audio input found."); 
     return; 
    } 

    AVCaptureAudioDataOutput *output = [[[AVCaptureAudioDataOutput alloc] init] autorelease]; // the session retains this when it is added below 

    dispatch_queue_t outputQueue = dispatch_queue_create("micOutputDispatchQueue", NULL); 
    [output setSampleBufferDelegate:self queue:outputQueue]; 
    dispatch_release(outputQueue); 

    [self.captureSession addOutput:output]; 
    [assetWriter startWriting]; 
    [self.captureSession startRunning]; 
} 

//callback 
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection { 
    AudioBufferList audioBufferList; 
    NSMutableData *data= [[NSMutableData alloc] init]; 
    CMBlockBufferRef blockBuffer; 
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, NULL, &audioBufferList, sizeof(audioBufferList), NULL, NULL, 0, &blockBuffer); 

    //for (int y = 0; y < audioBufferList.mNumberBuffers; y++) { 
    // AudioBuffer audioBuffer = audioBufferList.mBuffers[y]; 
    // Float32 *frame = (Float32*)audioBuffer.mData; 
    //   
    // [data appendBytes:frame length:audioBuffer.mDataByteSize]; 
    //} 

    // append [data bytes] to your NSOutputStream 


    // These two lines write to disk; you may not need this, it's just here as an example 
    // (strictly speaking, -startSessionAtSourceTime: is meant to be called once per writing session, not on every callback) 
    [assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)]; 
    [assetWriterInput appendSampleBuffer:sampleBuffer]; 

    CFRelease(blockBuffer); 
    blockBuffer=NULL; 
    [data release]; 
} 

@end 

Please go through the code once more and, whenever a call returns some error information, such as a 'BOOL' status or an 'NSError', check or log it. That will bring you closer to the root of the problem. Is your callback being called at all? – zoul
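
In this code, that would mean, for instance, logging the writer's error when finishing and checking the BOOL that -appendSampleBuffer: returns. An illustrative sketch, not code from the original post:

    // in -stopRecording: 
    [assetWriterInput markAsFinished]; 
    if (![assetWriter finishWriting]) { 
        // assetWriter.error explains why writing failed, e.g. an unsupported format 
        NSLog(@"finishWriting failed: %@", assetWriter.error); 
    } 

    // in the capture callback: 
    if (![assetWriterInput appendSampleBuffer:sampleBuffer]) { 
        NSLog(@"append failed, writer status %d, error %@", (int)assetWriter.status, assetWriter.error); 
    } 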


Yes, I know it is, because I use it to send the audio data to another class that streams it. It gets called the whole time. Like I said, this is a dumbed-down version, so I removed that part. Anyway, I'll go through it and add some checks like you said, thanks for the suggestion –


If I put NSLog(@"buffer count %d", y); inside the for loop in the callback, it always prints 0, meaning mNumberBuffers is 0 as well –
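
One thing worth checking there (an illustrative sketch; this status check is not in the original code) is the OSStatus that CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer returns, before trusting the buffer list at all:

    AudioBufferList audioBufferList; 
    CMBlockBufferRef blockBuffer = NULL; 
    OSStatus status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, NULL, &audioBufferList, sizeof(audioBufferList), NULL, NULL, 0, &blockBuffer); 
    if (status != noErr) { 
        // the buffer list was never filled in, which would explain mNumberBuffers showing up as 0 
        NSLog(@"CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer failed: %d", (int)status); 
    } else { 
        NSLog(@"buffer count %u", (unsigned)audioBufferList.mNumberBuffers); 
        CFRelease(blockBuffer); 
    } 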

Answer


Per Apple Technical Support:

So this is the bug: the file gets created, a number of samples are written successfully, and then appending starts to fail for an unknown reason.

It seems that AVAssetWriter fails only with these particular settings.

AudioQueue is what should be used for ULaw audio.
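
For what it's worth, if a full Audio Queue implementation is more than the project needs, AVAudioRecorder can also record ULaw directly. A minimal sketch, assuming an 8 kHz mono recording into a .caf container (the file name and settings here are illustrative, not from the original post):

    NSArray *searchPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES); 
    NSURL *url = [NSURL fileURLWithPath:[[searchPaths objectAtIndex:0] stringByAppendingPathComponent:@"micOutput.caf"]]; 

    // CAF is a safe container choice for ULaw; the settings mirror the ones in the question 
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys: 
             [NSNumber numberWithInt:kAudioFormatULaw], AVFormatIDKey, 
             [NSNumber numberWithFloat:8000.0], AVSampleRateKey, 
             [NSNumber numberWithInt:1], AVNumberOfChannelsKey, 
             nil]; 

    NSError *error = nil; 
    AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error]; 
    if (recorder == nil) { 
        NSLog(@"recorder init failed: %@", error); 
    } else { 
        [recorder prepareToRecord]; 
        [recorder record]; // remember to -release the recorder when done (this project uses manual reference counting) 
    } 

An audio session with a record-capable category must also be active for the microphone to deliver any data.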


Is this a bug that Apple has fixed? – TigerCoding


Really not sure; there's no way for me to check this long after the fact. If this stuff is really giving you trouble, a suggestion might be to use one of Apple's free developer support incidents –