2014-03-26

I am trying to develop a Windows application (WPF) that streams whatever is playing on the system over a TCP connection (local Wi-Fi). On the receiving end I have a Windows Phone 8 app that should play the audio in real time; in short, streaming music from the PC to the Windows Phone live.

What I have tried:

  1. Establishing the connection and transferring the data works successfully.

    void sMixListner_DataAvailable(object sender, NAudio.Wave.WaveInEventArgs e)
    {
        try
        {
            if (writer != null)
            {
                SocketHandler.SocketManager.sendDataToClients(e.Buffer);
            }
        }
        catch { }
    }

    public void sendDataToClients(byte[] music)
    {
        foreach (Socket client in clients)
        {
            try
            {
                client.Send(music);
            }
            catch { }
        }
    }
    

The code above sends whatever data the WasapiLoopbackCapture class delivers.
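For context, here is a minimal sketch of how such a loopback capture is typically wired up with NAudio (the handler name matches the snippet above; the rest of the setup is an assumption, not the original code):

    using NAudio.Wave;

    // Captures whatever the system is currently playing ("what you hear").
    WasapiLoopbackCapture sMixListner = new WasapiLoopbackCapture();

    // WASAPI loopback delivers audio in the device's shared mix format,
    // typically 32-bit IEEE float, stereo, at 44.1 or 48 kHz.
    Console.WriteLine(sMixListner.WaveFormat);

    sMixListner.DataAvailable += sMixListner_DataAvailable;
    sMixListner.StartRecording();

Note that in each DataAvailable callback only the first e.BytesRecorded bytes of e.Buffer contain valid audio.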

  2. On the phone side I am able to receive all the byte[] data correctly and write it into a MemoryStream:

    MemoryStream stream = new MemoryStream();

    public void ReceiveMessage()
    {
        var responseListener = new SocketAsyncEventArgs();
        responseListener.Completed += responseListener_Completed;

        var responseBuffer = new byte[MAX_BUFFER_SIZE];
        responseListener.SetBuffer(responseBuffer, 0, MAX_BUFFER_SIZE);

        _socket.ReceiveAsync(responseListener);
    }

    void responseListener_Completed(object sender, SocketAsyncEventArgs e)
    {
        stream.Write(e.Buffer, 0, e.BytesTransferred);
        if (!stopReceiving)
            ReceiveMessage();
    }
    

After receiving for a few seconds, I try to play it back like this:

    public void play()
    {
        SoundEffect soundE = new SoundEffect(stream.ToArray(), 48000, AudioChannels.Mono);
        soundE.CreateInstance();
        FrameworkDispatcher.Update();
        isPlaying = true;
        soundE.Play();
    }

All I hear is noise. I also tried writing a WAV header to the stream (taken from the Nokia developer site) in the following way:

    public void WriteWavHeader(Stream stream, int sampleRate)
    {
        const int bitsPerSample = 32;
        const int bytesPerSample = bitsPerSample / 8;
        var encoding = System.Text.Encoding.UTF8;

        // ChunkID: contains the letters "RIFF" in ASCII form (0x52494646 big-endian).
        stream.Write(encoding.GetBytes("RIFF"), 0, 4);

        // ChunkSize: NOTE this will be filled in later.
        stream.Write(BitConverter.GetBytes(0), 0, 4);

        // Format: contains the letters "WAVE" (0x57415645 big-endian).
        stream.Write(encoding.GetBytes("WAVE"), 0, 4);

        // Subchunk1ID: contains the letters "fmt " (0x666d7420 big-endian).
        stream.Write(encoding.GetBytes("fmt "), 0, 4);

        // Subchunk1Size: 16 for PCM. This is the size of the rest of the subchunk which follows this number.
        stream.Write(BitConverter.GetBytes(16), 0, 4);

        // AudioFormat: PCM = 1 (i.e. linear quantization). Values other than 1 indicate some form of compression.
        stream.Write(BitConverter.GetBytes((short)1), 0, 2);

        // NumChannels: Mono = 1, Stereo = 2, etc.
        stream.Write(BitConverter.GetBytes((short)1), 0, 2);

        // SampleRate: 8000, 44100, etc.
        stream.Write(BitConverter.GetBytes(sampleRate), 0, 4);

        // ByteRate = SampleRate * NumChannels * BitsPerSample/8
        stream.Write(BitConverter.GetBytes(sampleRate * bytesPerSample), 0, 4);

        // BlockAlign = NumChannels * BitsPerSample/8: the number of bytes for one sample including all channels.
        stream.Write(BitConverter.GetBytes((short)bytesPerSample), 0, 2);

        // BitsPerSample: 8 bits = 8, 16 bits = 16, etc.
        stream.Write(BitConverter.GetBytes((short)bitsPerSample), 0, 2);

        // Subchunk2ID: contains the letters "data" (0x64617461 big-endian).
        stream.Write(encoding.GetBytes("data"), 0, 4);

        // Subchunk2Size: NOTE to be filled in later.
        stream.Write(BitConverter.GetBytes(0), 0, 4);
    }

The recording part itself is correct, because if I write the capture to a .wav file using WavFileWriter, the sound plays back just fine.
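For reference, that verification step amounts to something like the following sketch (file name and wiring are illustrative, not the asker's actual code):

    // Illustrative check: dump the raw capture to disk with NAudio's WaveFileWriter.
    // If this file plays back correctly, the capture side is fine and the problem
    // lies in transport or playback.
    var capture = new NAudio.Wave.WasapiLoopbackCapture();
    var fileWriter = new NAudio.Wave.WaveFileWriter("loopback-check.wav", capture.WaveFormat);

    capture.DataAvailable += (s, a) => fileWriter.Write(a.Buffer, 0, a.BytesRecorded);
    capture.RecordingStopped += (s, a) => fileWriter.Dispose();
    capture.StartRecording();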

Can someone help me with this? I need to be able to play that sound on the phone.

Answer


WASAPI capture is 32-bit IEEE floating point, not 32-bit PCM, so the AudioFormat in your WAV header is wrong. The format code for IEEE float is 3, but in your case I would recommend converting to 16-bit before sending the audio over the network, to save bandwidth.
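A minimal sketch of the kind of float-to-16-bit conversion suggested here, applied to each captured buffer before it is sent (the names are illustrative):

    // Convert 32-bit IEEE float samples to 16-bit PCM.
    // 'buffer'/'bytesRecorded' stand in for e.Buffer/e.BytesRecorded from DataAvailable.
    static byte[] FloatTo16BitPcm(byte[] buffer, int bytesRecorded)
    {
        int sampleCount = bytesRecorded / 4;     // 4 bytes per 32-bit float sample
        byte[] pcm = new byte[sampleCount * 2];  // 2 bytes per 16-bit sample
        for (int i = 0; i < sampleCount; i++)
        {
            float sample = BitConverter.ToSingle(buffer, i * 4);
            // Clamp to [-1, 1], then scale to the 16-bit range.
            if (sample > 1.0f) sample = 1.0f;
            if (sample < -1.0f) sample = -1.0f;
            short pcmSample = (short)(sample * short.MaxValue);
            pcm[i * 2] = (byte)(pcmSample & 0xFF);
            pcm[i * 2 + 1] = (byte)((pcmSample >> 8) & 0xFF);
        }
        return pcm;
    }

The WAV header on the receiving side would then describe 16-bit PCM (AudioFormat = 1, BitsPerSample = 16) with the matching byte rate and block align.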


Actually, I have never worked with audio much.. can you give me some links that would help? – naqvitalha


I go into this in detail in my [two Pluralsight courses on audio](http://pluralsight.com/training/Authors/Details/mark-heath) (apologies for the shameless plug) –
