2014-07-02

I have a client .ipa file that I am testing on my iOS device, and I have successfully run the app on the iPhone using Adobe AIR for iOS (I am using Adobe Flash CC). The goal is to stream from the iPhone camera to an RTMP server.

When I launch the app on the iPhone, the NetConnection does not connect to the Red5 streaming server, so the camera stream cannot be broadcast to the server.
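To see why the connect fails, the NetConnection status codes can be traced (a small debugging sketch, using the same nc as in my code below; the codes shown in the comments are the standard NetConnection status codes):

    nc.addEventListener(NetStatusEvent.NET_STATUS,
        function(e:NetStatusEvent):void {
            // "NetConnection.Connect.Failed" usually means the RTMP URL/port
            // is unreachable from the device; "NetConnection.Connect.Rejected"
            // means the server refused the application name.
            trace("NetConnection status: " + e.info.code);
        });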

I am using StageVideo. When I launch the app on my local PC using a webcam, and launch another instance on iOS to receive the stream on the iPhone, I can see the live broadcast from my PC's camera.

But I want to test with the iPhone camera and receive the live stream back from the Red5 server.

How can I do this? My current code is below.

    import flash.display.Sprite;
    import flash.display.MovieClip;
    import flash.events.NetStatusEvent;
    import flash.events.StageVideoAvailabilityEvent;
    import flash.events.StageVideoEvent;
    import flash.geom.Rectangle;
    import flash.media.Camera;
    import flash.media.Microphone;
    import flash.media.StageVideo;
    import flash.media.StageVideoAvailability; // was missing; needed for the AVAILABLE constant
    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.net.Responder;

    var nc:NetConnection;
    var good:Boolean;
    var netOut:NetStream;
    var netIn:NetStream;
    var cam:Camera;
    var mic:Microphone;
    var vidIn:Video;
    var sv:StageVideo;
    var sva:Boolean;

    stage.addEventListener(StageVideoAvailabilityEvent.STAGE_VIDEO_AVAILABILITY, onAvail);

    function onAvail(e:StageVideoAvailabilityEvent):void
    {
     sva = (e.availability == StageVideoAvailability.AVAILABLE);
     trace("StageVideo available: " + sva);

     var rtmpNow:String = "rtmp://192.168.1.7/test1";
     nc = new NetConnection();    // was "new NetConnection" -- call the constructor
     nc.client = this;
     // Add the listener before connecting so no status event is missed
     nc.addEventListener(NetStatusEvent.NET_STATUS, getStream);
     nc.connect(rtmpNow, "trik");
    }

    function onRender(e:StageVideoEvent):void
    {
     sv.viewPort = new Rectangle(0, 0, 240, 180);
    }

    function getStream(e:NetStatusEvent):void
    {
     good = (e.info.code == "NetConnection.Connect.Success");
     if (good)
     {
      // Here we call functions in our Java Application

      setCam();
      //setMic();

      // Publish the local camera. This was inside the if(sva) branch,
      // so on devices without StageVideo nothing was ever published.
      netOut = new NetStream(nc);
      //netOut.attachAudio(mic);
      netOut.attachCamera(cam);
      netOut.publish("tester", "live");

      // Play the streamed video back
      netIn = new NetStream(nc);
      if (sva)
      {
       sv = stage.stageVideos[0];
       sv.addEventListener(StageVideoEvent.RENDER_STATE, onRender);
       sv.attachNetStream(netIn);
      }
      else
      {
       setVid();
       vidIn.attachNetStream(netIn);
      }
      netIn.play("tester");
     }
    }

    function streamNow(streamSelect:Object):void
    {
     trace("streamNow: " + streamSelect);
    }

    function setCam():void
    {
     cam = Camera.getCamera();
     cam.setMode(240, 180, 15);
     cam.setQuality(0, 85);
    }

    function setMic():void
    {
     mic = Microphone.getMicrophone();
     mic.rate = 11;
     //mic.setSilenceLevel(12, 2000);
    }

    function setVid():void
    {
     vidIn = new Video(240, 180);
     vidIn.x = 0;
     vidIn.y = 0;
     addChild(vidIn);
    }

Answer

Your code mostly looks fine, but I would separate the ns.publish and ns.play parts. IMHO you shouldn't try to play until the publish has succeeded. Also, if you aren't just testing the round trip to the server, I would simply attach the camera directly to the StageVideo, if iOS allows that.
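The separation I mean can be sketched like this (a minimal outline reusing the nc, netIn, netOut, cam, sva, and setVid from the question's code; "NetStream.Publish.Start" is the standard status code, but verify the codes your Red5 version actually sends):

    // Publish first; only start playing once the server confirms the publish.
    netOut = new NetStream(nc);
    netOut.addEventListener(NetStatusEvent.NET_STATUS, onPublishStatus);
    netOut.attachCamera(cam);
    netOut.publish("tester", "live");

    function onPublishStatus(e:NetStatusEvent):void
    {
     if (e.info.code == "NetStream.Publish.Start")
     {
      // The publish succeeded -- now it is safe to subscribe.
      netIn = new NetStream(nc);
      if (sva)
      {
       sv = stage.stageVideos[0];
       sv.attachNetStream(netIn);
      }
      else
      {
       setVid();
       vidIn.attachNetStream(netIn);
      }
      netIn.play("tester");
     }
    }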