In RenderTone I fill the buffer with one frequency and another frequency / 3. How do I clear, or avoid setting, kLinearPCMFormatFlagIsNonInterleaved?

When I run the code, the output sounds as though the buffer were not interleaved, even though the flag is set in createToneUnit. Sound plays only in the left speaker. When both frequencies are written into the buffer, both tones play in the left speaker. When a frequency is not written into the buffer (e.g. leftON = 0), it does not play at all. So the buffer-writing code looks fine.

Since I suspect I should not have kLinearPCMFormatFlagIsNonInterleaved set in createToneUnit, I tried to "clear" that flag. I spent hours reading the documentation but never found a way to do it, and experimenting only made the app crash at launch.

How do I clear kLinearPCMFormatFlagIsNonInterleaved? Or how do I avoid setting kLinearPCMFormatFlagIsNonInterleaved in the first place? (Commenting out streamFormat.mFormatFlags also causes a crash.) Perhaps some other setting affects whether playback is interleaved.
OSStatus RenderTone(
    void *inRefCon,
    AudioUnitRenderActionFlags *ioActionFlags,
    const AudioTimeStamp *inTimeStamp,
    UInt32 inBusNumber,
    UInt32 inNumberFrames,
    AudioBufferList *ioData)
{
    // Cast inRefCon back to the view controller
    // (class name assumed; the original line was truncated).
    ToneGeneratorViewController *viewController =
        (ToneGeneratorViewController *)inRefCon;
    float sampleRate = viewController->sampleRate;
    float frequency = viewController->frequency;
    // etc.
    float theta_increment = 2.0 * M_PI * frequency / sampleRate;
    float wave;
    float theta2;
    float wave2;
    float theta_increment2 = 0.3 * theta_increment;

    // With an interleaved format, buffer 0 holds both channels'
    // samples alternating L, R, L, R, ...
    const int channel = 0;
    Float32 *buffer = (Float32 *)ioData->mBuffers[channel].mData;

    for (UInt32 frame = 0; frame < inNumberFrames;)
    {
        theta += theta_increment;
        wave = sin(theta) * playVolume;
        theta2 += theta_increment2;
        wave2 = sin(theta2) * playVolume;
        buffer[frame++] = wave * leftON;   // leftON = 1 or 0
        buffer[frame++] = wave2 * rightON; // rightON = 1 or 0
        if (theta > 2.0 * M_PI)
        {
            theta -= 2.0 * M_PI;
        }
    }
    // etc.
}
- (void)createToneUnit
{
    AudioComponentDescription defaultOutputDescription;
    defaultOutputDescription.componentType = kAudioUnitType_Output;
    defaultOutputDescription.componentSubType = kAudioUnitSubType_RemoteIO;
    defaultOutputDescription.componentManufacturer = kAudioUnitManufacturer_Apple;
    defaultOutputDescription.componentFlags = 0;
    defaultOutputDescription.componentFlagsMask = 0;
    // etc.
    // (head of this call was truncated in the original; restored here)
    err = AudioUnitSetProperty(toneUnit,
                               kAudioUnitProperty_SetRenderCallback,
                               kAudioUnitScope_Input,
                               0,
                               &input,
                               sizeof(input));

    const int four_bytes_per_float = 4;
    const int eight_bits_per_byte = 8;
    AudioStreamBasicDescription streamFormat;
    streamFormat.mSampleRate = sampleRate;
    streamFormat.mFormatID = kAudioFormatLinearPCM;
    streamFormat.mFormatFlags =
        kLinearPCMFormatFlagIsFloat | kLinearPCMFormatFlagIsNonInterleaved;
    streamFormat.mBytesPerPacket = four_bytes_per_float;
    streamFormat.mFramesPerPacket = 1;
    streamFormat.mBytesPerFrame = four_bytes_per_float;
    streamFormat.mChannelsPerFrame = 2; // 2 = stereo
    streamFormat.mBitsPerChannel = four_bytes_per_float * eight_bits_per_byte;
    err = AudioUnitSetProperty(toneUnit,
                               kAudioUnitProperty_StreamFormat,
                               kAudioUnitScope_Input,
                               0,
                               &streamFormat,
                               sizeof(AudioStreamBasicDescription));
}
Answers the question 100%. This is one of those times where I ask myself why I didn't think of such an elegant answer. – user1251228 2012-03-17 23:10:21