
I have been trying to play music in my SpriteKit game, using the AVAudioPlayerNode class to play the music through AVAudioPCMBuffers. Every time I exported my OS X project it crashed and gave me an error about the audio playback. After banging my head against the wall for the last 24 hours, I decided to re-watch WWDC session 501 (see 54:17). My solution to this problem was the approach the presenter used: splitting the buffer's frames into smaller pieces to break up the music file being read into an AVAudioPCMBuffer:

NSError *error = nil; 
NSURL *someFileURL = ... 
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading: someFileURL commonFormat: AVAudioPCMFormatFloat32 interleaved: NO error:&error]; 
const AVAudioFrameCount kBufferFrameCapacity = 128 * 1024L; 
AVAudioFramePosition fileLength = audioFile.length; 

AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat: audioFile.processingFormat frameCapacity: kBufferFrameCapacity]; 
while (audioFile.framePosition < fileLength) { 
    AVAudioFramePosition readPosition = audioFile.framePosition; 
    if (![audioFile readIntoBuffer: readBuffer error: &error]) 
     return NO; 
    if (readBuffer.frameLength == 0) //end of file reached 
     break; 
} 

My current problem is that the player only plays the last frame that was read into the buffer. The music I am playing is only two minutes long; apparently that is too long to read directly into a single buffer. Is the buffer being overwritten each time the readIntoBuffer: method is called inside the loop? I am a total noob at this stuff... how can I get the whole file to play?

If I can't get this working, what is a good way to play music (2 different files) across multiple SKScenes?

Answer


This is the solution I came up with. It still isn't perfect, but hopefully it will help someone who is in the same predicament I was in. I created a singleton class to handle the work. One improvement that could be made in the future is to load only the sound effects and music files needed for a particular SKScene, and only when they are needed. I had so many problems with this code that I don't want to mess with it now. At the moment I don't have many sounds, so it isn't using excessive memory.

Overview
My strategy was as follows:

  1. Store the names of the game's audio files in a plist (a sketch of the expected layout follows this list)
  2. Read the plist and build two dictionaries (one for music, one for short sound effects)
  3. The sound-effects dictionary holds an AVAudioPCMBuffer and an AVAudioPlayerNode for each sound
  4. The music dictionary holds, for each track, an array of AVAudioPCMBuffers, an array of timestamps for when each of those buffers should be played in the queue, an AVAudioPlayerNode, and the sample rate of the original audio file

    • The sample rate is needed to figure out the time at which each buffer should be played (you will see the calculation done in the code)
  5. Create the AVAudioEngine, get the main mixer from the engine, and connect all the AVAudioPlayerNodes to the mixer (as usual)

  6. Play the sound effects or music using their respective methods
    • Playing a sound effect or a music track is simple... call the method -(void) playSfxFile:(NSString*)file; and it plays the sound
    • For the music, I just could not find a good solution without help from the scene that is trying to play it. The scene calls -(void) playMusicFile:(NSString*)file;, which schedules the buffers to play in the order in which they were created. I could not find a good way to get the music to repeat once it finished within the AudioEngine class, so I decided to have the scene check in its update: method whether the music for a particular file is playing and, if not, play it again (not a very slick solution, but it works; see the example scene after the AudioEngine code below)
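
The code below expects AudioInfo.plist to be a dictionary with a "music" array and an "sfx" array of file names without extensions (the loaders append .aif for music and .mp3 for sound effects). As a rough sketch of that layout, written as the Objective-C dictionary that dictionaryWithContentsOfFile: would return (the two music names appear later in stopOtherMusicPlayersNotNamed:; the sfx names are placeholders):

NSDictionary *audioInfoData = @{ 
    @"music": @[@"menuscenemusic", @"levelscenemusic"], //loaded by loadMusicIntoBuffer: as .aif files 
    @"sfx" : @[@"jump", @"coin"]      //placeholder names, loaded by loadSoundIntoBuffer: as .mp3 files 
}; 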

AudioEngine.h

#import <Foundation/Foundation.h> 

@interface AudioEngine : NSObject 

+(instancetype)sharedData; 
-(void) playSfxFile:(NSString*)file; 
-(void) playMusicFile:(NSString*)file; 
-(void) pauseMusic:(NSString*)file; 
-(void) unpauseMusic:(NSString*)file; 
-(void) stopMusicFile:(NSString*)file; 
-(void) setVolumePercentages; 
-(bool) isPlayingMusic:(NSString*)file; 

@end 

AudioEngine.m

#import "AudioEngine.h" 
#import <AVFoundation/AVFoundation.h> 
#import "GameData.h" //this is a class that I use to store game data (in this case it is being used to get the user preference for volume amount) 

@interface AudioEngine() 

@property AVAudioEngine *engine; 
@property AVAudioMixerNode *mixer; 

@property NSMutableDictionary *musicDict; 
@property NSMutableDictionary *sfxDict; 

@property NSString *audioInfoPList; 

@property float musicVolumePercent; 
@property float sfxVolumePercent; 
@property float fadeVolume; 
@property float timerCount; 

@end 

@implementation AudioEngine 

int const FADE_ITERATIONS = 10; 
static NSString * const MUSIC_PLAYER = @"player"; 
static NSString * const MUSIC_BUFFERS = @"buffers"; 
static NSString * const MUSIC_FRAME_POSITIONS = @"framePositions"; 
static NSString * const MUSIC_SAMPLE_RATE = @"sampleRate"; 

static NSString * const SFX_BUFFER = @"buffer"; 
static NSString * const SFX_PLAYER = @"player"; 

+(instancetype) sharedData { 
    static AudioEngine *sharedInstance = nil; 

    static dispatch_once_t onceToken; 
    dispatch_once(&onceToken, ^{ 
     sharedInstance = [[self alloc] init]; 
     [sharedInstance startEngine]; 
    }); 

    return sharedInstance; 
} 

-(instancetype) init { 
    if (self = [super init]) { 
     _engine = [[AVAudioEngine alloc] init]; 
     _mixer = [_engine mainMixerNode]; 

     _audioInfoPList = [[NSBundle mainBundle] pathForResource:@"AudioInfo" ofType:@"plist"]; //open a plist called AudioInfo.plist 

     [self setVolumePercentages]; //this is created to set the user's preference in terms of how loud sound fx and music should be played 
     [self initMusic]; 
     [self initSfx]; 
    } 
    return self; 
} 

//opens all music files, creates multiple buffers depending on the length of the file and a player 
-(void) initMusic { 
    _musicDict = [NSMutableDictionary dictionary]; 

    _audioInfoPList = [[NSBundle mainBundle] pathForResource: @"AudioInfo" ofType: @"plist"]; 
    NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList]; 

    for (NSString *musicFileName in audioInfoData[@"music"]) { 
     [self loadMusicIntoBuffer:musicFileName]; 
     AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init]; 
     [_engine attachNode:player]; 

     AVAudioPCMBuffer *buffer = [[_musicDict[musicFileName] objectForKey:MUSIC_BUFFERS] objectAtIndex:0]; 
     [_engine connect:player to:_mixer format:buffer.format]; 
     [_musicDict[musicFileName] setObject:player forKey:MUSIC_PLAYER]; 
    } 
} 

//opens a music file and creates an array of buffers 
-(void) loadMusicIntoBuffer:(NSString *)filename 
{ 
    NSURL *audioFileURL = [[NSBundle mainBundle] URLForResource:filename withExtension:@"aif"]; 
    //NSURL *audioFileURL = [NSURL URLWithString:[[NSBundle mainBundle] pathForResource:filename ofType:@"aif"]]; 
    NSAssert(audioFileURL, @"Error creating URL to audio file"); 
    NSError *error = nil; 
    AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error]; 
    NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription); 

    AVAudioFramePosition fileLength = audioFile.length; //frame length of the audio file 
    float sampleRate = audioFile.fileFormat.sampleRate; //sample rate (in Hz) of the audio file 
    [_musicDict setObject:[NSMutableDictionary dictionary] forKey:filename]; 
    [_musicDict[filename] setObject:[NSNumber numberWithDouble:sampleRate] forKey:MUSIC_SAMPLE_RATE]; 

    NSMutableArray *buffers = [NSMutableArray array]; 
    NSMutableArray *framePositions = [NSMutableArray array]; 

    const AVAudioFrameCount kBufferFrameCapacity = 1024 * 1024L; //the size of my buffer...can be made bigger or smaller 512 * 1024L would be half the size 
    while (audioFile.framePosition < fileLength) { //each iteration reads in kBufferFrameCapacity frames of the audio file and stores it in a buffer 
     [framePositions addObject:[NSNumber numberWithLongLong:audioFile.framePosition]]; 
     AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:kBufferFrameCapacity]; 
     if (![audioFile readIntoBuffer:readBuffer error:&error]) { 
      NSLog(@"failed to read audio file: %@", error); 
      return; 
     } 
     if (readBuffer.frameLength == 0) { //if we've come to the end of the file, end the loop 
      break; 
     } 
     [buffers addObject:readBuffer]; 
    } 

    [_musicDict[filename] setObject:buffers forKey:MUSIC_BUFFERS]; 
    [_musicDict[filename] setObject:framePositions forKey:MUSIC_FRAME_POSITIONS]; 
} 

-(void) initSfx { 
    _sfxDict = [NSMutableDictionary dictionary]; 

    NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList]; 

    for (NSString *sfxFileName in audioInfoData[@"sfx"]) { 
     AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init]; 
     [_engine attachNode:player]; 

     [self loadSoundIntoBuffer:sfxFileName]; 
     AVAudioPCMBuffer *buffer = [_sfxDict[sfxFileName] objectForKey:SFX_BUFFER]; 
     [_engine connect:player to:_mixer format:buffer.format]; 
     [_sfxDict[sfxFileName] setObject:player forKey:SFX_PLAYER]; 
    } 
} 

//WARNING: make sure that the sound fx file is small (roughly under 30 sec) otherwise the archived version of the app will crash because the buffer ran out of space 
-(void) loadSoundIntoBuffer:(NSString *)filename 
{ 
    NSURL *audioFileURL = [NSURL URLWithString:[[NSBundle mainBundle] pathForResource:filename ofType:@"mp3"]]; 
    NSAssert(audioFileURL, @"Error creating URL to audio file"); 
    NSError *error = nil; 
    AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error]; 
    NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription); 

    AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:(AVAudioFrameCount)audioFile.length]; 
    [audioFile readIntoBuffer:readBuffer error:&error]; 

    [_sfxDict setObject:[NSMutableDictionary dictionary] forKey:filename]; 
    [_sfxDict[filename] setObject:readBuffer forKey:SFX_BUFFER]; 
} 

-(void)startEngine { 
    [_engine startAndReturnError:nil]; 
} 

-(void) playSfxFile:(NSString*)file { 
    AVAudioPlayerNode *player = [_sfxDict[file] objectForKey:SFX_PLAYER]; 
    AVAudioPCMBuffer *buffer = [_sfxDict[file] objectForKey:SFX_BUFFER]; 
    [player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:nil]; 
    [player setVolume:1.0]; 
    [player setVolume:_sfxVolumePercent]; 
    [player play]; 
} 

-(void) playMusicFile:(NSString*)file { 
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER]; 

    if ([player isPlaying] == NO) { 
     NSArray *buffers = [_musicDict[file] objectForKey:MUSIC_BUFFERS]; 

     double sampleRate = [[_musicDict[file] objectForKey:MUSIC_SAMPLE_RATE] doubleValue]; 


     for (int i = 0; i < [buffers count]; i++) { 
      long long framePosition = [[[_musicDict[file] objectForKey:MUSIC_FRAME_POSITIONS] objectAtIndex:i] longLongValue]; 
      AVAudioTime *time = [AVAudioTime timeWithSampleTime:framePosition atRate:sampleRate]; 

      AVAudioPCMBuffer *buffer = [buffers objectAtIndex:i]; 
      [player scheduleBuffer:buffer atTime:time options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{ 
       if (i == [buffers count] - 1) { 
        [player stop]; 
       } 
      }]; 
      [player setVolume:_musicVolumePercent]; 
      [player play]; 
     } 
    } 
} 

-(void) stopOtherMusicPlayersNotNamed:(NSString*)file { 
    if ([file isEqualToString:@"menuscenemusic"]) { 
     AVAudioPlayerNode *player = [_musicDict[@"levelscenemusic"] objectForKey:MUSIC_PLAYER]; 
     [player stop]; 
    } 
    else { 
     AVAudioPlayerNode *player = [_musicDict[@"menuscenemusic"] objectForKey:MUSIC_PLAYER]; 
     [player stop]; 
    } 
} 

//stops the player for a particular sound 
-(void) stopMusicFile:(NSString*)file { 
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER]; 

    if ([player isPlaying]) { 
     _timerCount = FADE_ITERATIONS; 
     _fadeVolume = _musicVolumePercent; 
     [self fadeOutMusicForPlayer:player]; //fade out the music 
    } 
} 

//helper method for stopMusicFile: 
-(void) fadeOutMusicForPlayer:(AVAudioPlayerNode*)player { 
    [NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(handleTimer:) userInfo:player repeats:YES]; 
} 

//helper method for stopMusicFile: 
-(void) handleTimer:(NSTimer*)timer { 
    AVAudioPlayerNode *player = (AVAudioPlayerNode*)timer.userInfo; 
    if (_timerCount > 0) { 
     _timerCount--; 
     _fadeVolume = _musicVolumePercent * (_timerCount/FADE_ITERATIONS); 
     [player setVolume:_fadeVolume]; 
    } 
    else { 
     [player stop]; 
     [player setVolume:_musicVolumePercent]; 
     [timer invalidate]; 
    } 
} 

-(void) pauseMusic:(NSString*)file { 
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER]; 
    if ([player isPlaying]) { 
     [player pause]; 
    } 
} 

-(void) unpauseMusic:(NSString*)file { 
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER]; 
    [player play]; 
} 

//sets the volume of the player based on user preferences in GameData class 
-(void) setVolumePercentages { 
    NSString *musicVolumeString = [[GameData sharedGameData].settings objectForKey:@"musicVolume"]; 
    _musicVolumePercent = [[[musicVolumeString componentsSeparatedByCharactersInSet: 
    [[NSCharacterSet decimalDigitCharacterSet] invertedSet]] 
    componentsJoinedByString:@""] floatValue]/100; 
    NSString *sfxVolumeString = [[GameData sharedGameData].settings objectForKey:@"sfxVolume"]; 
    _sfxVolumePercent = [[[sfxVolumeString componentsSeparatedByCharactersInSet: 
    [[NSCharacterSet decimalDigitCharacterSet] invertedSet]] 
    componentsJoinedByString:@""] floatValue]/100; 

    //immediately sets music to new volume 
    for (NSString *file in [_musicDict allKeys]) { 
     AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER]; 
     [player setVolume:_musicVolumePercent]; 
    } 
} 

-(bool) isPlayingMusic:(NSString *)file { 
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER]; 
    if ([player isPlaying]) 
     return YES; 
    return NO; 
} 

@end
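
As a rough illustration of how a scene drives this class (GameScene and the "jump" sound effect are placeholder names; "levelscenemusic" comes from the code above), an OS X SKScene might use it like this:

#import "AudioEngine.h" 

@implementation GameScene //hypothetical SKScene subclass 

-(void) didMoveToView:(SKView *)view { 
    [[AudioEngine sharedData] playMusicFile:@"levelscenemusic"]; //schedules all buffers and starts playback 
} 

-(void) mouseDown:(NSEvent *)theEvent { 
    [[AudioEngine sharedData] playSfxFile:@"jump"]; //one-shot sound effect 
} 

-(void) update:(NSTimeInterval)currentTime { 
    //restart the track once it has finished, as described in step 6 above 
    if (![[AudioEngine sharedData] isPlayingMusic:@"levelscenemusic"]) { 
     [[AudioEngine sharedData] playMusicFile:@"levelscenemusic"]; 
    } 
} 

@end 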