
Android voice call using OpenSL

I am building a VoIP application for my thesis, and I was hoping someone could help me with the following case. I have two threads, AudioThread and AudioSendThread. The first is the listener: it receives audio packets over a DatagramSocket and plays them on the phone. The second is the recorder: it captures 20 milliseconds of sound and sends it to the other device. I have already implemented this in Java, but it is very slow, so I decided to try OpenSL; however, I haven't found any documentation for this use case.

This is the AudioSendThread:

import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

import android.util.Log;

public class AudioSendThread implements Runnable {
    private final static String TAG = "AudioSndThread";
    private boolean createdAudioP = false;
    private DatagramSocket audioSndSocket;
    private String ipAddr;

    public AudioSendThread(Object o) {
        this.ipAddr = null; // getting IpAddress
        audioSndSocket = (DatagramSocket) o;
    }

    @Override
    public void run() {
        if (!createdAudioP)
            createdAudioP = createAudioRecorder();
        if (createdAudioP)
            startRecording();
        DatagramPacket packet = null;
        while (true) {
            // read 20 milliseconds of audio; this is the method I would like to implement in OpenSL
            byte[] buffer = readAudio(20);
            try {
                packet = new DatagramPacket(buffer, buffer.length,
                        InetAddress.getByName(this.ipAddr), PORT.AUDIO);
                audioSndSocket.send(packet);
            } catch (IOException e) {
                Log.e(TAG, e.getMessage());
                return;
            }
        }
    }

    public static native void startRecording();
    public static native boolean createAudioRecorder();
    public static native byte[] readAudio(int millis);

    static {
        System.loadLibrary("SoundUtils");
    }
}

And this one is the AudioThread:

import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.SocketException;

import android.util.Log;

public class AudioThread implements Runnable {
    private final static String TAG = "AudioThread";
    private DatagramSocket audioServSock;

    @Override
    public void run() {
        createBufferQueueAudioPlayer();
        DatagramPacket packet = null;
        Thread audioSndThread = null;
        try {
            this.audioServSock = new DatagramSocket(PORT.AUDIO);
        } catch (SocketException e1) {
            e1.printStackTrace();
        }
        audioSndThread = new Thread(new AudioSendThread(this.audioServSock));
        audioSndThread.start();

        byte[] buffer = new byte[1500]; // random size
        packet = new DatagramPacket(buffer, 1500);
        while (true) {
            try {
                audioServSock.receive(packet);
                // the other method I would like to implement in OpenSL
                playAudio(buffer, packet.getLength());
            } catch (IOException e) {
                Log.e(TAG, Log.getStackTraceString(e));
                return;
            }
        }
    }

    public static native void createBufferQueueAudioPlayer();
    public static native void playAudio(byte[] buffer, int length);

    /** Load the JNI .so on initialization */
    static {
        System.loadLibrary("native-audio-jni");
    }
}
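
And a matching sketch of the playback side, loosely modeled on the buffer-queue player from the native-audio sample; again the PCM format and the package name in the JNI symbols are assumptions and must match your actual project:

// native-audio-jni.c -- hypothetical playback side for AudioThread (a sketch)
#include <jni.h>
#include <SLES/OpenSLES.h>
#include <SLES/OpenSLES_Android.h>

static SLObjectItf engineObject, outputMixObject, playerObject;
static SLEngineItf engineEngine;
static SLPlayItf playerPlay;
static SLAndroidSimpleBufferQueueItf playerQueue;

static char playBuffer[1500];            /* holds the most recently received packet */

/* Fires when the previously enqueued buffer has been consumed; nothing to do here
   because playAudio() enqueues a new buffer for every received packet. */
static void playerCallback(SLAndroidSimpleBufferQueueItf bq, void *ctx) { }

JNIEXPORT void JNICALL
Java_com_example_voip_AudioThread_createBufferQueueAudioPlayer(JNIEnv *env, jclass clazz) {
    slCreateEngine(&engineObject, 0, NULL, 0, NULL, NULL);
    (*engineObject)->Realize(engineObject, SL_BOOLEAN_FALSE);
    (*engineObject)->GetInterface(engineObject, SL_IID_ENGINE, &engineEngine);

    (*engineEngine)->CreateOutputMix(engineEngine, &outputMixObject, 0, NULL, NULL);
    (*outputMixObject)->Realize(outputMixObject, SL_BOOLEAN_FALSE);

    /* Source: an Android simple buffer queue fed with 16 kHz mono 16-bit PCM. */
    SLDataLocator_AndroidSimpleBufferQueue locBq = {SL_DATALOCATOR_ANDROIDSIMPLEBUFFERQUEUE, 2};
    SLDataFormat_PCM pcm = {SL_DATAFORMAT_PCM, 1, SL_SAMPLINGRATE_16,
                            SL_PCMSAMPLEFORMAT_FIXED_16, SL_PCMSAMPLEFORMAT_FIXED_16,
                            SL_SPEAKER_FRONT_CENTER, SL_BYTEORDER_LITTLEENDIAN};
    SLDataSource audioSrc = {&locBq, &pcm};

    /* Sink: the output mix (device speaker/earpiece). */
    SLDataLocator_OutputMix locOutMix = {SL_DATALOCATOR_OUTPUTMIX, outputMixObject};
    SLDataSink audioSnk = {&locOutMix, NULL};

    const SLInterfaceID ids[1] = {SL_IID_BUFFERQUEUE};
    const SLboolean req[1] = {SL_BOOLEAN_TRUE};
    (*engineEngine)->CreateAudioPlayer(engineEngine, &playerObject, &audioSrc, &audioSnk,
                                       1, ids, req);
    (*playerObject)->Realize(playerObject, SL_BOOLEAN_FALSE);
    (*playerObject)->GetInterface(playerObject, SL_IID_PLAY, &playerPlay);
    (*playerObject)->GetInterface(playerObject, SL_IID_BUFFERQUEUE, &playerQueue);
    (*playerQueue)->RegisterCallback(playerQueue, playerCallback, NULL);
    (*playerPlay)->SetPlayState(playerPlay, SL_PLAYSTATE_PLAYING);
}

/* Copies `length` bytes of PCM out of the Java byte[] and enqueues them for playback. */
JNIEXPORT void JNICALL
Java_com_example_voip_AudioThread_playAudio(JNIEnv *env, jclass clazz,
                                            jbyteArray buffer, jint length) {
    if (length <= 0 || (size_t)length > sizeof(playBuffer)) return;
    (*env)->GetByteArrayRegion(env, buffer, 0, length, (jbyte *)playBuffer);
    (*playerQueue)->Enqueue(playerQueue, playBuffer, (SLuint32)length);
}

Reusing one static buffer like this is only safe as a sketch; a real player would rotate through a small pool of buffers (ideally refilled from the buffer-queue callback) so that data still being played is never overwritten.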

The other native methods are taken from the NDK's NativeAudio sample.

Thanks in advance for any suggestions!


The Android NDK includes a sample app called 'native-audio' that demonstrates how to use OpenSL ES for recording and playback. – Michael


I know about it, and I tried to use that example, but playback doesn't work.. only the MP3 playback part works. – Oxenarf


Here is the definitive article on low-latency audio streaming on Android: http://createdigitalmusic.com/2013/05/why-mobile-low-latency-is-hard-explained-by-google-galaxy-nexus-still-android-of-choice/ –

Answer


You have tried the native audio sample code that ships with the Android NDK, which means you are familiar with JNI calls. Here is a blog post by Victor Lazzarini that describes his approach to audio streaming for voice communication with OpenSL ES:

Android audio streaming with OpenSL ES and the NDK.

You can download the source code from here and follow the instructions to run it on your device.
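
If it helps, the code from that post is organized around a small opensl_io wrapper that exposes blocking audio read/write calls, so the readAudio side of your AudioSendThread could collapse to roughly the sketch below. The wrapper names and signatures (OPENSL_STREAM, android_OpenAudioDevice, android_AudioIn) are quoted from memory, and the 16 kHz sample rate is an assumption, so check them against the source you download:

/* Sketch of readAudio on top of the opensl_io wrapper described in that post.
   All wrapper names below are assumed -- verify against the downloaded source. */
#include <jni.h>
#include "opensl_io.h"

#define SR          16000          /* assumed sample rate          */
#define FRAMES_20MS (SR / 50)      /* 20 ms of mono audio          */

static OPENSL_STREAM *stream;
static float inBuf[FRAMES_20MS];

JNIEXPORT jboolean JNICALL
Java_com_example_voip_AudioSendThread_createAudioRecorder(JNIEnv *env, jclass clazz) {
    /* sample rate, input channels, output channels, buffer size in frames */
    stream = android_OpenAudioDevice(SR, 1, 1, FRAMES_20MS);
    return stream != NULL;
}

/* Reads 20 ms of microphone input and returns it to Java as 16-bit PCM bytes.
   The device is opened with a fixed 20 ms buffer, so `millis` is ignored here. */
JNIEXPORT jbyteArray JNICALL
Java_com_example_voip_AudioSendThread_readAudio(JNIEnv *env, jclass clazz, jint millis) {
    (void)millis;
    int frames = android_AudioIn(stream, inBuf, FRAMES_20MS);  /* blocking read */
    if (frames <= 0) return NULL;

    jshort pcm[FRAMES_20MS];
    for (int i = 0; i < frames; i++)            /* float [-1,1] -> 16-bit PCM */
        pcm[i] = (jshort)(inBuf[i] * 32767.0f);

    jbyteArray out = (*env)->NewByteArray(env, frames * 2);
    (*env)->SetByteArrayRegion(env, out, 0, frames * 2, (const jbyte *)pcm);
    return out;
}

The playback path would mirror this with the wrapper's blocking output call, feeding it the PCM received from the socket.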