2017-03-10

iOS Swift - WebRTC: switch from the front camera to the back camera

By default, WebRTC video uses the front camera, and that works fine. However, I need to switch to the back camera, and I have not been able to find any code to do this. Which part do I need to edit? Is it localView, localVideoTrack, or the capturer?

Which library are you using, the WebRTC library or openwebrtc? –

@DuraiAmuthan.H I'm using the libjingle_peerconnection pod – mrnobody

Try removing the existing AVCaptureDeviceInput from the AVCaptureSession, then adding a new AVCaptureDeviceInput created from a device with AVCaptureDevicePositionBack to the AVCaptureSession –
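The comment above can be sketched in Swift 3. This is a minimal illustration only; the `captureSession` parameter is an assumption here, since libjingle does not necessarily expose the underlying AVCaptureSession:

```swift
import AVFoundation

// Hypothetical sketch: swap the session's camera input to the back camera.
// `captureSession` is an assumed reference to the AVCaptureSession used by
// the WebRTC capturer; your libjingle build may not expose it.
func switchToBackCamera(captureSession: AVCaptureSession) {
    captureSession.beginConfiguration()

    // Remove the current camera input (the front camera by default).
    if let currentInput = captureSession.inputs.first as? AVCaptureDeviceInput {
        captureSession.removeInput(currentInput)
    }

    // Find the back camera and add it as the new input.
    let devices = AVCaptureDevice.devices(withMediaType: AVMediaTypeVideo) as? [AVCaptureDevice] ?? []
    if let backCamera = devices.first(where: { $0.position == .back }),
       let newInput = try? AVCaptureDeviceInput(device: backCamera),
       captureSession.canAddInput(newInput) {
        captureSession.addInput(newInput)
    }

    captureSession.commitConfiguration()
}
```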

Answers

I'm not sure which Chrome revision your WebRTC build is based on, but with v54 and above the RTCAVFoundationVideoSource class has a bool property called "useBackCamera". You can use this property to switch between the front and back cameras.

Header file: https://cs.chromium.org/chromium/src/third_party/webrtc/sdk/objc/Framework/Headers/WebRTC/RTCAVFoundationVideoSource.h?rcl=c0a0f8d5f2d5b837b4c6ea447b0cdce86723e0d6&l=46 –
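If your build exposes RTCAVFoundationVideoSource, toggling the property might look like this in Swift (a minimal sketch; `localVideoTrack` is assumed to come from your own setup code, and the `source` property is assumed to bridge to the AVFoundation subclass):

```swift
// Sketch: toggle between front and back camera via RTCAVFoundationVideoSource.
// Assumes `localVideoTrack` was created from an RTCAVFoundationVideoSource.
if let avSource = localVideoTrack.source as? RTCAVFoundationVideoSource {
    // Setting useBackCamera to true switches capture to the back camera;
    // false switches back to the front camera.
    avSource.useBackCamera = !avSource.useBackCamera
}
```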

Swift 3.0

A peer connection can have only one 'RTCVideoTrack' for sending the video stream.

First, to switch between the front and back camera, you have to remove the current video track from the peer connection. Then create a new 'RTCVideoTrack' for the camera you need and set it on the peer connection.

I used this approach:

func swapCameraToFront() {
    guard let localStream = peerConnection?.localStreams.first as? RTCMediaStream else { return }

    // Remove the current (back-camera) video track from the stream.
    if let currentTrack = localStream.videoTracks.first as? RTCVideoTrack {
        localStream.removeVideoTrack(currentTrack)
    }

    // Create a new track on the front camera and attach it.
    if let localVideoTrack = createLocalVideoTrack() {
        localStream.addVideoTrack(localVideoTrack)
        delegate?.appClient(self, didReceiveLocalVideoTrack: localVideoTrack)
    }

    // Re-add the stream so the peer connection picks up the new track.
    peerConnection?.remove(localStream)
    peerConnection?.add(localStream)
}

func swapCameraToBack() {
    guard let localStream = peerConnection?.localStreams.first as? RTCMediaStream else { return }

    // Remove the current (front-camera) video track from the stream.
    if let currentTrack = localStream.videoTracks.first as? RTCVideoTrack {
        localStream.removeVideoTrack(currentTrack)
    }

    // Create a new track on the back camera and attach it.
    if let localVideoTrack = createLocalVideoTrackBackCamera() {
        localStream.addVideoTrack(localVideoTrack)
        delegate?.appClient(self, didReceiveLocalVideoTrack: localVideoTrack)
    }

    // Re-add the stream so the peer connection picks up the new track.
    peerConnection?.remove(localStream)
    peerConnection?.add(localStream)
}
While this may answer the question, it would be better to explain the important parts of the answer and possibly what the problem with the OP's code was. – pirho

Could you add how to create the video track with the back camera, i.e. the implementation of the 'createLocalVideoTrackBackCamera' method? – Ankit

For now I only have an Objective-C answer to Ankit's comment below. I will convert the code to Swift and update this answer again in a while; until then you can use:

- (RTCVideoTrack *)createLocalVideoTrack {
    RTCVideoTrack *localVideoTrack = nil;
    NSString *cameraID = nil;
    for (AVCaptureDevice *captureDevice in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (captureDevice.position == AVCaptureDevicePositionFront) {
            cameraID = [captureDevice localizedName];
            break;
        }
    }

    RTCVideoCapturer *capturer = [RTCVideoCapturer capturerWithDeviceName:cameraID];
    RTCMediaConstraints *mediaConstraints = [self defaultMediaStreamConstraints];
    RTCVideoSource *videoSource = [_factory videoSourceWithCapturer:capturer constraints:mediaConstraints];
    localVideoTrack = [_factory videoTrackWithID:@"ARDAMSv0" source:videoSource];

    return localVideoTrack;
}

- (RTCVideoTrack *)createLocalVideoTrackBackCamera {
    RTCVideoTrack *localVideoTrack = nil;
    NSString *cameraID = nil;
    for (AVCaptureDevice *captureDevice in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (captureDevice.position == AVCaptureDevicePositionBack) {
            cameraID = [captureDevice localizedName];
            break;
        }
    }

    RTCVideoCapturer *capturer = [RTCVideoCapturer capturerWithDeviceName:cameraID];
    RTCMediaConstraints *mediaConstraints = [self defaultMediaStreamConstraints];
    RTCVideoSource *videoSource = [_factory videoSourceWithCapturer:capturer constraints:mediaConstraints];
    localVideoTrack = [_factory videoTrackWithID:@"ARDAMSv0" source:videoSource];

    return localVideoTrack;
}
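Until the Swift version of this answer lands, a rough Swift 3 translation of the back-camera method could look like the sketch below. The method names are the direct Swift bridging of the Objective-C calls above and may differ slightly in your pod version; `factory` and `defaultMediaStreamConstraints()` are assumed to exist in your class, mirroring `_factory` and `-defaultMediaStreamConstraints` in the Objective-C code:

```swift
func createLocalVideoTrackBackCamera() -> RTCVideoTrack? {
    // Find the localized name of the back camera.
    var cameraID: String?
    let devices = AVCaptureDevice.devices(withMediaType: AVMediaTypeVideo) as? [AVCaptureDevice] ?? []
    for captureDevice in devices where captureDevice.position == .back {
        cameraID = captureDevice.localizedName
        break
    }
    guard let backCameraID = cameraID else { return nil }

    // Build a capturer, source and track, mirroring the Objective-C version.
    let capturer = RTCVideoCapturer(deviceName: backCameraID)
    let mediaConstraints = defaultMediaStreamConstraints()
    let videoSource = factory.videoSource(with: capturer, constraints: mediaConstraints)
    return factory.videoTrack(withID: "ARDAMSv0", source: videoSource)
}
```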
This answer has some serious formatting problems. Please edit and format your answer, or I will have to flag it. –

OK, I will update it again –