Core audio - remote IO confusion

I can't explain the behavior of the remoteIO audio unit callbacks in iOS. I'm setting up a remoteIO unit with two callbacks: an input callback and a "render" callback. The setup closely follows the remoteIO setup recommended in this Tasty Pixel tutorial. Here is the rather lengthy setup method:
- (void)setup {
    AudioUnit ioUnit;

    AudioComponentDescription audioCompDesc;
    audioCompDesc.componentType = kAudioUnitType_Output;
    audioCompDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    audioCompDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
    audioCompDesc.componentFlags = 0;
    audioCompDesc.componentFlagsMask = 0;

    AudioComponent rioComponent = AudioComponentFindNext(NULL, &audioCompDesc);
    CheckError(AudioComponentInstanceNew(rioComponent, &ioUnit),
               "Couldn't get RIO unit instance");

    // i/o
    UInt32 oneFlag = 1;
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioOutputUnitProperty_EnableIO,
                                    kAudioUnitScope_Output,
                                    kOutputBus,
                                    &oneFlag,
                                    sizeof(oneFlag)), "Couldn't enable RIO output");
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioOutputUnitProperty_EnableIO,
                                    kAudioUnitScope_Input,
                                    kInputBus,
                                    &oneFlag,
                                    sizeof(oneFlag)), "Couldn't enable RIO input");

    AudioStreamBasicDescription myASBD;
    memset(&myASBD, 0, sizeof(myASBD));
    myASBD.mSampleRate = 44100;
    myASBD.mFormatID = kAudioFormatLinearPCM;
    myASBD.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    myASBD.mFramesPerPacket = 1;
    myASBD.mChannelsPerFrame = 1;
    myASBD.mBitsPerChannel = 16;
    myASBD.mBytesPerPacket = 2 * myASBD.mChannelsPerFrame;
    myASBD.mBytesPerFrame = 2 * myASBD.mChannelsPerFrame;

    // set stream format for both busses
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Input,
                                    kOutputBus,
                                    &myASBD,
                                    sizeof(myASBD)), "Couldn't set ASBD for RIO on input scope/bus 0");
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Output,
                                    kInputBus,
                                    &myASBD,
                                    sizeof(myASBD)), "Couldn't set ASBD for RIO on output scope/bus 1");

    // set arbitrarily high for now
    UInt32 bufferSizeBytes = 10000 * sizeof(int);
    int offset = offsetof(AudioBufferList, mBuffers[0]);
    int bufferListSizeInBytes = offset + (sizeof(AudioBuffer) * myASBD.mChannelsPerFrame);

    // why need to cast to audioBufferList * ?
    self.inputBuffer = (AudioBufferList *)malloc(bufferListSizeInBytes);
    self.inputBuffer->mNumberBuffers = myASBD.mChannelsPerFrame;
    for (UInt32 i = 0; i < myASBD.mChannelsPerFrame; i++) {
        self.inputBuffer->mBuffers[i].mNumberChannels = 1;
        self.inputBuffer->mBuffers[i].mDataByteSize = bufferSizeBytes;
        self.inputBuffer->mBuffers[i].mData = malloc(bufferSizeBytes);
    }

    self.remoteIOUnit = ioUnit;

    /////////////////////////////////////////////// callback setup
    AURenderCallbackStruct callbackStruct;
    callbackStruct.inputProc = inputCallback;
    callbackStruct.inputProcRefCon = (__bridge void * _Nullable)self;
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioOutputUnitProperty_SetInputCallback,
                                    kAudioUnitScope_Global,
                                    kInputBus,
                                    &callbackStruct,
                                    sizeof(callbackStruct)), "Couldn't set input callback");

    AURenderCallbackStruct callbackStruct2;
    callbackStruct2.inputProc = playbackCallback;
    callbackStruct2.inputProcRefCon = (__bridge void * _Nullable)self;
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioUnitProperty_SetRenderCallback,
                                    kAudioUnitScope_Global,
                                    kOutputBus,
                                    &callbackStruct,
                                    sizeof(callbackStruct)), "Couldn't set input callback");

    CheckError(AudioUnitInitialize(ioUnit), "Couldn't initialize input unit");
    CheckError(AudioOutputUnitStart(ioUnit), "AudioOutputUnitStart failed");
}
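CheckError isn't defined in the snippet above; it's presumably the standard convenience wrapper from the Learning Core Audio book, which looks roughly like this (a sketch under that assumption):

#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>
#include <CoreFoundation/CoreFoundation.h>

// Sketch of the usual CheckError helper: on failure, print the OSStatus
// either as a four-character code or as a plain integer, then exit.
static void CheckError(OSStatus error, const char *operation)
{
    if (error == noErr) return;
    char errorString[20];
    // Is the status a printable four-character code?
    *(UInt32 *)(errorString + 1) = CFSwapInt32HostToBig(error);
    if (isprint(errorString[1]) && isprint(errorString[2]) &&
        isprint(errorString[3]) && isprint(errorString[4])) {
        errorString[0] = errorString[5] = '\'';
        errorString[6] = '\0';
    } else {
        // No: format it as an integer instead.
        sprintf(errorString, "%d", (int)error);
    }
    fprintf(stderr, "Error: %s (%s)\n", operation, errorString);
    exit(1);
}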
I'm getting strange behavior in the callbacks. First, the playbackCallback function never gets called at all, even though it is set up with the same property call as in the tutorial (which was written by the developer of the Loopy app).
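For reference, playbackCallback itself isn't shown here; a render callback for the output bus conventionally looks like the following minimal sketch (a hypothetical body that just outputs silence, since the real one isn't in the question):

#include <string.h>

// Hypothetical minimal render callback for bus 0 (kOutputBus).
// Unlike the input callback, ioData is non-NULL on the render side
// and must be filled with inNumberFrames frames before returning.
OSStatus playbackCallback(void *inRefCon,
                          AudioUnitRenderActionFlags *ioActionFlags,
                          const AudioTimeStamp *inTimeStamp,
                          UInt32 inBusNumber,
                          UInt32 inNumberFrames,
                          AudioBufferList *ioData)
{
    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++) {
        // Zero-fill each buffer, i.e. render silence.
        memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
    }
    return noErr;
}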
Second, the input callback's ioData (AudioBufferList) parameter should be NULL according to the documentation, but it flips between NULL and a non-zero value on every other callback. Does that make sense to anyone?
Additionally, calling AudioUnitRender inside the input callback (whose semantics I still don't understand in terms of API logic and lifecycle) produces a -50 error, which is the very generic "bad parameter". That most likely points to an invalid "topology" of the AudioBufferList, i.e. interleaved vs. deinterleaved, number of channels and so on, but I've tried the various topologies and none of them got rid of the errors. And it wouldn't explain the strange ioData behavior either. Here is the input callback in question:
OSStatus inputCallback(void *inRefCon,
                       AudioUnitRenderActionFlags *ioActionFlags,
                       const AudioTimeStamp *inTimeStamp,
                       UInt32 inBusNumber,
                       UInt32 inNumberFrames,
                       AudioBufferList *ioData)
{
    MicController *myRefCon = (__bridge MicController *)inRefCon;

    CheckError(AudioUnitRender(myRefCon.remoteIOUnit,
                               ioActionFlags,
                               inTimeStamp,
                               inBusNumber,
                               inNumberFrames,
                               myRefCon.inputBuffer), "audio unit render");
    return noErr;
}
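For comparison, RemoteIO input callbacks in the tutorial style referenced above usually reset each buffer's mDataByteSize to match the current inNumberFrames before calling AudioUnitRender, since the frame count varies from callback to callback. Whether a stale byte size is what triggers -50 here is only an assumption, but a sketch of inputCallback rewritten that way, using the 16-bit mono format from setup (mBytesPerFrame == 2), would look like this:

// Sketch variant of the inputCallback above: shrink each mDataByteSize
// to the exact number of bytes this callback will deliver before
// rendering. Assumes the question's format (2 bytes per frame) and that
// each mData allocation holds at least inNumberFrames * 2 bytes.
OSStatus inputCallback(void *inRefCon,
                       AudioUnitRenderActionFlags *ioActionFlags,
                       const AudioTimeStamp *inTimeStamp,
                       UInt32 inBusNumber,
                       UInt32 inNumberFrames,
                       AudioBufferList *ioData)
{
    MicController *myRefCon = (__bridge MicController *)inRefCon;
    AudioBufferList *abl = myRefCon.inputBuffer;

    for (UInt32 i = 0; i < abl->mNumberBuffers; i++) {
        abl->mBuffers[i].mDataByteSize = inNumberFrames * 2; // mBytesPerFrame
    }
    CheckError(AudioUnitRender(myRefCon.remoteIOUnit,
                               ioActionFlags,
                               inTimeStamp,
                               inBusNumber,
                               inNumberFrames,
                               abl), "audio unit render");
    return noErr;
}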
I suspect my problems come from a format error, using the wrong scope or bus, or some other trivial mistake that is easy to make in a Core Audio context. But since I have essentially no intuition for the semantics and the lifecycle flow (the scheme? I don't even know which word to use), I haven't been able to debug this adequately on my own. I'd greatly appreciate help from a more experienced Core Audio programmer who can shed some light on this situation.
What values are assigned to your kInputBus and kOutputBus? Did you set the kAudioUnitProperty_ShouldAllocateBuffer property to anything? Which AudioSession or AVAudioSession category do you set before starting the RemoteIO? And are you testing this on a newer iPhone 6s or 6s+, or on an older device? – hotpaw2
kOutputBus is 0 and the other is 1. I didn't set ...ShouldAllocateBuffer, and haven't seen it needed in any remoteIO code I've come across. I also didn't know a non-default session was required. I did grab the default AudioSession from AVFoundation and set the category to "PlayAndRecord", but that didn't help. I'm testing this on an iPhone 6+. –
My apps set ShouldAllocateBuffer to false when allocating their own AudioBufferList for RemoteIO. Did you activate your AVAudioSession? – hotpaw2
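For reference, the two suggestions in this comment thread would look roughly like the following sketch, assuming the ioUnit from the setup method above and kInputBus == 1 (neither is confirmed as the fix here):

// Sketch: tell the input bus not to allocate its own buffer, since
// setup passes a self-allocated AudioBufferList to AudioUnitRender.
UInt32 shouldAllocate = 0;
CheckError(AudioUnitSetProperty(ioUnit,
                                kAudioUnitProperty_ShouldAllocateBuffer,
                                kAudioUnitScope_Output,
                                kInputBus,
                                &shouldAllocate,
                                sizeof(shouldAllocate)), "Couldn't disable buffer allocation");

// Sketch: configure and activate the AVAudioSession before
// AudioOutputUnitStart (error handling elided for brevity).
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
[session setActive:YES error:nil];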