I want to capture and stream audio over RTP using JMF 2.1.1e. I wrote a simple transmitter, and I can send and receive audio. But when I inspect the traffic in Wireshark, the packets show up as UDP. Can anyone point out what the problem is? Why does Wireshark show UDP rather than RTP when I stream with JMF?
Here is the method responsible for audio capture and transmission.
public void captureAudio() {
    // Get the list of capture devices for ULAW.
    Vector devices = captureDevices();
    CaptureDeviceInfo captureDeviceInfo = null;
    if (devices.size() > 0) {
        // Take the first device from the list and cast it to CaptureDeviceInfo.
        captureDeviceInfo = (CaptureDeviceInfo) devices.firstElement();
    } else {
        // Exit if we could not find a suitable capture device.
        System.out.println("No such device found");
        System.exit(-1);
    }
    Processor processor = null;
    try {
        // Create a Processor for the specified media.
        processor = Manager.createProcessor(captureDeviceInfo.getLocator());
    } catch (IOException ex) {
        System.err.println(ex);
    } catch (NoProcessorException ex) {
        System.err.println(ex);
    }
    // Prepare the Processor to be programmed:
    // this puts the Processor into the Configuring state.
    processor.configure();
    // Wait until the Processor reaches the Configured state.
    while (processor.getState() != Processor.Configured) {
        try {
            Thread.sleep(100);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
    // Set the output content type for this Processor.
    processor.setContentDescriptor(CONTENT_DESCRIPTOR);
    /*
     * ContentDescriptor CONTENT_DESCRIPTOR
     *     = new ContentDescriptor(ContentDescriptor.RAW_RTP);
     */
    // Get a TrackControl for each track in the media stream.
    TrackControl track[] = processor.getTrackControls();
    boolean encodingOk = false;
    // Search the tracks for one that supports the ULAW/RTP audio format.
    for (int i = 0; i < track.length; i++) {
        if (!encodingOk && track[i] instanceof FormatControl) {
            if (((FormatControl) track[i])
                    .setFormat(new AudioFormat(AudioFormat.ULAW_RTP, 8000, 8, 1)) == null) {
                track[i].setEnabled(false);
            } else {
                encodingOk = true;
                track[i].setEnabled(encodingOk);
                System.out.println("enc: " + i);
            }
        } else {
            // We could not set this track to ULAW, so disable it.
            track[i].setEnabled(false);
        }
    }
    // If we could set a track to ULAW, proceed.
    if (encodingOk) {
        processor.realize();
        // Wait until the Processor reaches the Realized state.
        while (processor.getState() != Processor.Realized) {
            try {
                Thread.sleep(100);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
        DataSource dataSource = null;
        try {
            dataSource = processor.getDataOutput();
        } catch (NotRealizedError e) {
            e.printStackTrace();
        }
        try {
            String url = "rtp://192.168.1.99:49150/audio/1";
            MediaLocator m = new MediaLocator(url);
            DataSink d = Manager.createDataSink(dataSource, m);
            d.open();
            d.start();
            System.out.println("transmitting...");
            processor.start();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
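As an aside, the sleep-poll loops above (waiting for Configured and Realized) can be replaced with a blocking wait; JMF offers ControllerListener callbacks for exactly this. The underlying wait/notify idea can be sketched in plain Java, with no JMF types (the StateWaiter class and its state constants are hypothetical, standing in for the Processor's real state machine):

```java
public class StateWaiter {
    // Stand-ins for JMF's Processor states, in transition order.
    public static final int CONFIGURING = 1, CONFIGURED = 2, REALIZED = 3;
    private int state = CONFIGURING;

    // Called when a state transition completes, mirroring the role of
    // ControllerListener.controllerUpdate in JMF.
    public synchronized void setState(int newState) {
        state = newState;
        notifyAll();                       // wake any thread blocked in waitForState
    }

    // Blocks until the target state is reached -- no sleep/poll loop needed.
    public synchronized void waitForState(int target) throws InterruptedException {
        while (state < target) {
            wait();
        }
    }

    public static void main(String[] args) throws Exception {
        StateWaiter w = new StateWaiter();
        Thread t = new Thread(() -> {
            try {
                Thread.sleep(50);          // simulate the configure() transition
            } catch (InterruptedException ignored) {
            }
            w.setState(CONFIGURED);
        });
        t.start();
        w.waitForState(CONFIGURED);        // blocks until setState fires
        System.out.println("configured");
        t.join();
    }
}
```

This avoids burning 100 ms per poll and reacts to the transition immediately.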
Please point out anything you find incorrect or unclear. Thanks in advance. :)
Clarification: I have a piece of C# code that streams RTP. When I capture its traffic with Wireshark, the packets are shown as RTP; but when I capture the stream from JMF, Wireshark shows them as UDP. My question is: why?
I know the difference between UDP and RTP.
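For reference, since RTP (RFC 3550) is carried inside UDP payloads, a sniffer always sees UDP at the transport layer; Wireshark only labels a packet "RTP" when it decides to dissect the payload as RTP (for example via accompanying signaling, a heuristic, or a manual "Decode As"). A minimal sketch of that kind of payload check, in plain Java with no JMF dependency (field offsets taken from the RFC 3550 fixed header; the packet bytes are hand-built for illustration):

```java
import java.nio.ByteBuffer;

public class RtpPeek {
    // The fixed RTP header (RFC 3550) is 12 bytes; the version field is the
    // top two bits of the first byte and must be 2 for current RTP.
    static boolean looksLikeRtp(byte[] udpPayload) {
        if (udpPayload.length < 12) {
            return false;                          // too short for an RTP header
        }
        int version = (udpPayload[0] & 0xC0) >>> 6;
        return version == 2;
    }

    public static void main(String[] args) {
        // Hand-built ULAW-style packet: V=2, PT=0 (PCMU), then sequence
        // number, timestamp, and SSRC.
        byte[] rtp = ByteBuffer.allocate(12)
                .put((byte) 0x80)                  // V=2, P=0, X=0, CC=0
                .put((byte) 0x00)                  // M=0, PT=0 (PCMU/ULAW)
                .putShort((short) 1)               // sequence number
                .putInt(160)                       // timestamp
                .putInt(0x12345678)                // SSRC
                .array();
        System.out.println(looksLikeRtp(rtp));                     // true
        System.out.println(looksLikeRtp(new byte[] {0x17, 0x03})); // false
    }
}
```

A check like this is roughly what a heuristic dissector does; it cannot be certain, which is why Wireshark often falls back to plain "UDP" unless signaling tells it the port carries RTP.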
I think the problem lies in CONTENT_DESCRIPTOR, which is raw-rtp. – shibli049 2012-04-05 04:40:40
I understand that your code runs, apart from the problem you are now facing. We would need to look at the JMF source to see how the Processor class behaves when CONTENT_DESCRIPTOR is used, as Osbcure said. Perhaps that is the difference between the C# code and the Java JMF code. Would you mind telling us which streaming library the C# version uses? – ecle 2012-04-11 04:42:12
@eee: The C# project uses pjsipDll. I only borrowed it from a friend to test the packets in Wireshark, and I'm not comfortable with C#, so I can't give you any more details about the C# side. – shibli049 2012-04-11 06:56:58