
I want to do time-stretching of an audio file with Audio Units. How can I save the resulting audio to a file on iOS using an AudioUnit?

I am using this code: http://pastebin.com/DWMTw4n9

Here is the sample project I am working from: https://dl.dropbox.com/u/12216224/buglets/TimeSliderDemo-Buglet.zip

How do I save the audio file from AudioUnits on iOS?

What I have tried:

I tried to save the audio through the following render callback:

OSStatus MyAURenderCallback(void *inRefCon,
                            AudioUnitRenderActionFlags *actionFlags,
                            const AudioTimeStamp *inTimeStamp,
                            UInt32 inBusNumber,
                            UInt32 inNumberFrames,
                            AudioBufferList *ioData) {

    AudioUnit mixerUnit = (AudioUnit)inRefCon;

    AudioUnitRender(mixerUnit,
                    actionFlags,
                    inTimeStamp,
                    0,
                    inNumberFrames,
                    ioData);

    // Store the audio in a file
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *outputURL = paths[0];
    NSFileManager *manager = [NSFileManager defaultManager];
    [manager createDirectoryAtPath:outputURL withIntermediateDirectories:YES attributes:nil error:nil];
    outputURL = [outputURL stringByAppendingPathComponent:@"outputRamp.aif"];

    ExtAudioFileWriteAsync((__bridge ExtAudioFileRef)(outputURL),
                           inNumberFrames,
                           ioData);

    return noErr;
}


AURenderCallbackStruct callbackStruct = {0};
callbackStruct.inputProc = MyAURenderCallback;
// callbackStruct.inputProcRefCon = mixerUnit;
callbackStruct.inputProcRefCon = (__bridge void *)self;

AudioUnitSetProperty(self.effectUnit,
                     kAudioUnitProperty_SetRenderCallback,
                     kAudioUnitScope_Input,
                     0,
                     &callbackStruct,
                     sizeof(callbackStruct));

But the callback is never called. Is this the right way to save the Audio Unit output to a file?

Answer


You should set the input render callback on the node:

AURenderCallbackStruct inputCallbackStruct;
inputCallbackStruct.inputProc = &MyAURenderCallback;
inputCallbackStruct.inputProcRefCon = (__bridge void *)(self);

OSStatus result = noErr;

// Attach the render callback function to remoteIO's input on bus 0
result = AUGraphSetNodeInputCallback(self.auGraph,
                                     ioNode,
                                     0,
                                     &inputCallbackStruct);

CheckError(result, "AUGraphSetNodeInputCallback");

See the sample project Audio Mixer and the question "How to add a Render Callback to RemoteIO after a Mixer in iOS".
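For illustration, here is a rough sketch of what the callback body could look like with that wiring. The MyAudioController class name is made up for the sketch, and it is assumed to expose the effectUnit and recordingfileref used elsewhere in this answer; treat it as a starting point, not a drop-in implementation:

static OSStatus MyAURenderCallback(void *inRefCon,
                                   AudioUnitRenderActionFlags *actionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData) {
    // Hypothetical controller object passed in via inputProcRefCon.
    MyAudioController *controller = (__bridge MyAudioController *)inRefCon;

    // Pull the processed audio out of the effect/mixer unit into ioData.
    OSStatus status = AudioUnitRender(controller.effectUnit,
                                      actionFlags,
                                      inTimeStamp,
                                      0,
                                      inNumberFrames,
                                      ioData);
    if (status != noErr) return status;

    // Hand the rendered buffers to the capture file. ExtAudioFileWriteAsync is
    // designed to be called from the render thread once the file has been primed.
    if (controller.recordingfileref != NULL) {
        ExtAudioFileWriteAsync(controller.recordingfileref, inNumberFrames, ioData);
    }
    return noErr;
}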

Then you should create an ExtAudioFileRef recordingfileref:

AudioStreamBasicDescription dstFormat;
dstFormat.mSampleRate       = 44100.0;
dstFormat.mFormatID         = kAudioFormatLinearPCM;
dstFormat.mFormatFlags      = kAudioFormatFlagsNativeEndian | kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
dstFormat.mBytesPerPacket   = 4;
dstFormat.mBytesPerFrame    = 4;
dstFormat.mFramesPerPacket  = 1;
dstFormat.mChannelsPerFrame = 2;
dstFormat.mBitsPerChannel   = 16;
dstFormat.mReserved         = 0;

// create the capture file
OSStatus status = ExtAudioFileCreateWithURL((__bridge CFURLRef)(_outputURL),
                                            kAudioFileWAVEType,
                                            &dstFormat,
                                            NULL,
                                            kAudioFileFlags_EraseFile,
                                            &_recordingfileref);
CheckError(status, "couldn't create audio file");

// set the capture file's client format to match the data delivered to the render callback
status = ExtAudioFileSetProperty(self.recordingfileref,
                                 kExtAudioFileProperty_ClientDataFormat,
                                 sizeof(AudioStreamBasicDescription),
                                 &StreamFormat);
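The StreamFormat passed as the client data format should describe the audio the render callback actually delivers, and _outputURL must be a real NSURL, since ExtAudioFileCreateWithURL takes a CFURLRef rather than a path string. A rough sketch of both, assuming the effect/mixer unit from the question and an illustrative file name:

// Build the output URL as an NSURL so the __bridge cast to CFURLRef is valid.
NSString *docsDir = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0];
_outputURL = [NSURL fileURLWithPath:[docsDir stringByAppendingPathComponent:@"outputRamp.wav"]];

// Ask the unit feeding the callback for its output format and use that as the
// client data format, so ExtAudioFileWriteAsync receives data it understands.
AudioStreamBasicDescription StreamFormat;
UInt32 propSize = sizeof(StreamFormat);
CheckError(AudioUnitGetProperty(self.effectUnit,
                                kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Output,
                                0,
                                &StreamFormat,
                                &propSize),
           "couldn't get stream format");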

Then you can write the ioData to the file from your render callback:

ExtAudioFileWriteAsync(recordingfileref,
                       inNumberFrames,
                       ioData);

And when you are finished, dispose of the file:

OSStatus status = ExtAudioFileDispose(_recordingfileref);

printf("OSStatus(ExtAudioFileDispose): %d\n", (int)status);

@sheraza Thanks for the reply. I have followed the steps you provided, but when writing the audio file with ExtAudioFileWriteAsync I get an error response (-50). Any idea how to fix this?


Here is the code I tried: http://pastebin.com/k3wCthzV
