2015-11-17 22 views

I need to initiate an action the moment the device starts outputting audio. I'm using an AVPlayer and streaming an audio file from Parse. Approaches like waiting for (AVPlayer.currentTime() != nil) and (AVPlayer.rate > 0) aren't accurate enough; I need to know precisely when audio is actually being output from the device. I've tried using an AVAudioEngine, attaching an AVAudioNode with an AVAudioNodeBus, but I couldn't get it working. Any suggestions or techniques would be great, thank you! How can I monitor the device's audio output so I can tell when sound is coming out of the speaker/headphone jack?

Here is my AudioEngine code. I instantiate the AudioEngine at the instance level. When creating standardFormat, I don't know what to pass for the standardFormatWithSampleRate or channels arguments. When I tried installTapOnBus I didn't know how to use the block, so I passed nil, which also triggers an error. Any help would be appreciated — I'm very new to iOS development, I've read through Apple's documentation several times but I can't wrap my head around it, and I can't find any up-to-date examples online.

class TableViewController: UITableViewController, AVAudioPlayerDelegate { 

var iDArray = [String]() 
var NameArray = [String]() 


var durationInSeconds = Double() 

var currentSong = String() 




override func viewDidLoad() { 
    super.viewDidLoad() 



    let ObjectIDQuery = PFQuery(className: "Songs") 
    ObjectIDQuery.findObjectsInBackgroundWithBlock { 
     (objectsArray: [PFObject]?, error: NSError?) -> Void in 

     //objectsArray!.count != 0 
     var objectIDs = objectsArray! 

     for object in objectIDs { 
       self.iDArray.append(object.valueForKey("objectId") as! String) 
       self.NameArray.append(object.valueForKey("SongName") as! String) 
      } 
      // Reload once, after all rows are appended (and avoid crashing on an empty result). 
      self.tableView.reloadData() 

    } 

    do { 
     try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback) 
     print("AVAudioSession Category Playback OK") 
     do { 
      try AVAudioSession.sharedInstance().setActive(true) 
      print("AVAudioSession is Active") 
     } catch let error as NSError { 
      print(error.localizedDescription) 
     } 
    } catch let error as NSError { 
     print(error.localizedDescription) 
    } 



} 

func grabSong() { 


    let songQuery = PFQuery(className: "Songs") 
    songQuery.getObjectInBackgroundWithId(iDArray[SelectedSongNumber], block: { 
     (object: PFObject?, error : NSError?) -> Void in 


     if let audioFile = object?["SongFile"] as? PFFile { 
      let audioFileUrlString: String = audioFile.url! 
      let audioFileUrl = NSURL(string: audioFileUrlString)! 

      AudioPlayer = AVPlayer(URL: audioFileUrl) 
      AudioPlayer.play() 
      } // close the if-let (this brace was missing) 
    }) 

} 

func audioFunction() { 

    // AVAudioNode is abstract; use a concrete node such as AVAudioPlayerNode. 
    let audioPlayerNode = AVAudioPlayerNode() 
    let audioBus: AVAudioNodeBus = 0 

    // 44100 Hz stereo is a common default; match this to your source material. 
    let standardFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 2) 

    AudioEngine.attachNode(audioPlayerNode) 

    // The block must not be nil — it receives each rendered buffer. 
    audioPlayerNode.installTapOnBus(audioBus, bufferSize: 1024, format: standardFormat) { 
     (buffer: AVAudioPCMBuffer, time: AVAudioTime) in 
     print("tap received \(buffer.frameLength) frames") 
    } 

    if AudioEngine.running == true { 
     print("the audio engine is running") 
    } else { 
     print("the audio engine is NOTTT running") 
    } 

} 


func attachNode(audioNode : AVAudioNode) { 
    AudioEngine.attachNode(audioNode) 

    print(AudioEngine.outputNode.description) 

    if AudioEngine.running == true { 
     print("the audio engine is running") 
    } else { 
     print("the audio engine is NOTTT running") 
    } 
} 

override func tableView(tableView: UITableView, numberOfRowsInSection section: Int) -> Int { 

    return iDArray.count 
} 


override func tableView(tableView: UITableView, cellForRowAtIndexPath indexPath: NSIndexPath) -> UITableViewCell { 
    let cell = tableView.dequeueReusableCellWithIdentifier("Cell") 
    cell?.textLabel!.text = NameArray[indexPath.row] 

    return cell! 
} 



override func tableView(tableView: UITableView, didSelectRowAtIndexPath indexPath: NSIndexPath) { 

    SelectedSongNumber = indexPath.row 
    grabSong() 
} 

}
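
A note on the audioFunction above: an installed tap only produces callbacks when audio actually renders through the engine's graph, so the node must be connected to the engine's output and actively playing — attaching it alone does nothing. A minimal sketch of that wiring, assuming a bundled local file (the name "song.mp3" is a placeholder) and Swift 2 era APIs:

```swift
import AVFoundation

func playWithTap() throws {
    let engine = AVAudioEngine()
    let playerNode = AVAudioPlayerNode()

    // Attach AND connect the node; an unconnected node never renders,
    // so its tap never fires.
    engine.attachNode(playerNode)
    engine.connect(playerNode, to: engine.mainMixerNode, format: nil)

    // "song" is a placeholder resource name.
    let fileURL = NSBundle.mainBundle().URLForResource("song", withExtension: "mp3")!
    let file = try AVAudioFile(forReading: fileURL)

    // The tap block runs with each rendered buffer once playback starts.
    playerNode.installTapOnBus(0, bufferSize: 1024, format: file.processingFormat) {
        (buffer, time) in
        print("rendered \(buffer.frameLength) frames")
    }

    try engine.start()
    playerNode.scheduleFile(file, atTime: nil, completionHandler: nil)
    playerNode.play()
}
```

This plays through AVAudioEngine rather than AVPlayer, so it doesn't directly cover the Parse streaming case — it just shows the graph a tap needs in order to fire.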

Should I be using AVAudioSession for this? Or AVCaptureSession?


I'd be tempted to try adding a tap to an 'AVPlayer'. Why don't you show your 'AVAudioEngine' code? I'm curious why it isn't working. –


Sorry, just added it – MikeG


How are you playing the audio file with AVAudioEngine? –

Answer


I would use an audio tap on the AVPlayer to know when audio is playing / about to play. Essentially, you get the audio tap callback just before the audio plays out of the speaker/headphone jack.

One complication: I couldn't get the AVAsset's tracks for some stream types (PLS, icecast), but remote (and local) MP3 files work fine.

var player: AVPlayer? 

func doit() { 
    let url = NSURL(string: "URL TO YOUR POSSIBLY REMOTE AUDIO FILE")! 
    let asset = AVURLAsset(URL:url) 
    let playerItem = AVPlayerItem(asset: asset) 

    let tapProcess: @convention(c) (MTAudioProcessingTap, CMItemCount, MTAudioProcessingTapFlags, UnsafeMutablePointer<AudioBufferList>, UnsafeMutablePointer<CMItemCount>, UnsafeMutablePointer<MTAudioProcessingTapFlags>) -> Void = { 
     (tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut) -> Void in 

     // Audio coming out! 
     let status = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut) 
     print("get audio: \(status)\n") 
    } 

    var callbacks = MTAudioProcessingTapCallbacks(
     version: kMTAudioProcessingTapCallbacksVersion_0, 
     clientInfo: UnsafeMutablePointer(Unmanaged.passUnretained(self).toOpaque()), 
     `init`: nil, 
     finalize: nil, 
     prepare: nil, 
     unprepare: nil, 
     process: tapProcess) 

    var tap: Unmanaged<MTAudioProcessingTap>? 
    let err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks, kMTAudioProcessingTapCreationFlag_PostEffects, &tap) 

    if err != noErr { 
     // TODO: something 
    } 

    let audioTrack = playerItem.asset.tracksWithMediaType(AVMediaTypeAudio).first! 
    let inputParams = AVMutableAudioMixInputParameters(track: audioTrack) 
    inputParams.audioTapProcessor = tap?.takeUnretainedValue() 

    let audioMix = AVMutableAudioMix() 
    audioMix.inputParameters = [inputParams] 

    playerItem.audioMix = audioMix 

    player = AVPlayer(playerItem: playerItem) 
    player?.play() 
} 
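
To get the "act as soon as audio comes out" behavior the question asks for, the `process` callback above can flip a flag on its first invocation. Note that a `@convention(c)` closure cannot capture local state, so in this sketch the flag lives at file scope (it could also travel through the tap's `clientInfo`); `handleFirstAudio` is a hypothetical handler you would supply:

```swift
import AVFoundation
import MediaToolbox

// File-scope state: C-convention closures cannot capture locals.
var audioHasStarted = false

func handleFirstAudio() {
    // Hypothetical: whatever should run once sound starts leaving the device.
    print("first audio buffer rendered")
}

let firstAudioTap: @convention(c) (MTAudioProcessingTap, CMItemCount, MTAudioProcessingTapFlags,
    UnsafeMutablePointer<AudioBufferList>, UnsafeMutablePointer<CMItemCount>,
    UnsafeMutablePointer<MTAudioProcessingTapFlags>) -> Void = {
    (tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut) in

    // Pull the audio through so playback continues normally.
    MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut)

    if !audioHasStarted {
        audioHasStarted = true
        // The tap runs on a real-time audio thread; hop to main for UI work.
        dispatch_async(dispatch_get_main_queue()) { handleFirstAudio() }
    }
}
```

Pass `firstAudioTap` as the `process:` callback in the `MTAudioProcessingTapCallbacks` from the answer, in place of `tapProcess`.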

Thanks, I'll try this out and report back – MikeG
