I am getting an error where my TwilioVideo module, which expects a capturer (camera or microphone), never receives that input. The error started after we switched to CocoaPods to install the SDK and the PureLayout UI library; previously we installed all of these dependencies into Xcode manually. In short: the capturer (the iPhone camera) is not being provided to TVIVideoCapturer when using the TwilioVideo iOS SDK.
I am developing against React Native iOS 0.40.0 with react-native-cli 1.0.0, using Xcode 8.2.1 (8C1002) and running on an iPhone 6 simulator with iOS 10.2. I am using CocoaPods 1.2.0 and TwilioVideo SDK 1.0.0-beta5. There is also a 1.0.0-beta6 release, which I tried as well with the same result. Reverting to 1.0.0-beta4 does make the error go away, which suggests the problem lies in the way I register the audio and video tracks.
Here is my Podfile:
source 'https://github.com/CocoaPods/Specs'
source 'https://github.com/twilio/cocoapod-specs'
target 'MyApp' do
# Uncomment the next line if you're using Swift or would like to use dynamic frameworks
# use_frameworks!
# Pods for MyApp
pod 'TwilioVideo', '1.0.0-beta5'
pod 'PureLayout', '~> 3.0'
target 'MapleNativeProviderTests' do
inherit! :search_paths
# Pods for testing
end
end
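After changing the Podfile, I install the pods and build from the generated workspace (standard CocoaPods workflow; the paths below assume the usual React Native layout with the Podfile under ios/):

cd ios
pod install
open MyApp.xcworkspace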
I implemented the TwilioVideo module in Xcode based on this repository: react-native-twilio-video-webrtc. Its author recently updated the repository for React Native 0.40.0, which changed the Xcode import syntax. I have tried both the old and the new import syntax, and I continue to get the error described above (the capturer is never provided) when I try to mount my video component.
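For reference, the React Native 0.40 change moved the iOS headers behind a React/ umbrella, so the two import styles I tried look roughly like this (shown with the headers a native event-emitting module typically needs):

// Old import syntax (React Native < 0.40)
#import "RCTBridgeModule.h"
#import "RCTEventEmitter.h"

// New import syntax (React Native 0.40+)
#import <React/RCTBridgeModule.h>
#import <React/RCTEventEmitter.h>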
Here is the documentation for the TwilioVideo SDK, and here is the reference for TVIVideoCapturer.
I made some modifications to react-native-twilio-video-webrtc, which is essentially a thin wrapper around the TwilioVideo SDK that exposes its key API methods via RCT_EXPORT_METHOD. The library initializes the audio and video tracks inside its init method, which leads to some annoying behavior where event listeners receive no callbacks at app launch. So I moved the track setup into a custom, publicly exposed RCT_EXPORT_METHOD named initialize, which I call from the specific view in the app that mounts the video and initializes the camera/microphone input (a sketch of that call follows).
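Roughly, the calling view does something like this (simplified; the screen name, ref wiring, and lifecycle hook are illustrative rather than the exact code from my app):

import React, { Component } from 'react';
import TwilioVideoComponent from './TwilioVideoComponent';

class VideoCallScreen extends Component {
  componentDidMount() {
    // Set up the camera/microphone tracks once the hosting view has mounted.
    this.videoComponent.initializeVideo();
  }

  render() {
    return (
      <TwilioVideoComponent
        ref={component => { this.videoComponent = component; }}
        onCameraDidStart={() => console.log('camera started')}
      />
    );
  }
}

export default VideoCallScreen;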
My TWVideoModule.m implementation is as follows:
#import "TWVideoModule.h"
static NSString* roomDidConnect = @"roomDidConnect";
static NSString* roomDidDisconnect = @"roomDidDisconnect";
static NSString* roomDidFailToConnect = @"roomDidFailToConnect";
static NSString* roomParticipantDidConnect = @"roomParticipantDidConnect";
static NSString* roomParticipantDidDisconnect = @"roomParticipantDidDisconnect";
static NSString* participantAddedVideoTrack = @"participantAddedVideoTrack";
static NSString* participantRemovedVideoTrack = @"participantRemovedVideoTrack";
static NSString* participantAddedAudioTrack = @"participantAddedAudioTrack";
static NSString* participantRemovedAudioTrack = @"participantRemovedAudioTrack";
static NSString* participantEnabledTrack = @"participantEnabledTrack";
static NSString* participantDisabledTrack = @"participantDisabledTrack";
static NSString* cameraDidStart = @"cameraDidStart";
static NSString* cameraWasInterrupted = @"cameraWasInterrupted";
static NSString* cameraDidStopRunning = @"cameraDidStopRunning";
@interface TWVideoModule() <TVIParticipantDelegate, TVIRoomDelegate, TVIVideoTrackDelegate, TVICameraCapturerDelegate>
@end
@implementation TWVideoModule
@synthesize bridge = _bridge;
RCT_EXPORT_MODULE();
- (dispatch_queue_t)methodQueue
{
return dispatch_get_main_queue();
}
- (NSArray<NSString *> *)supportedEvents
{
return @[roomDidConnect,
roomDidDisconnect,
roomDidFailToConnect,
roomParticipantDidConnect,
roomParticipantDidDisconnect,
participantAddedVideoTrack,
participantRemovedVideoTrack,
participantAddedAudioTrack,
participantRemovedAudioTrack,
participantEnabledTrack,
participantDisabledTrack,
cameraDidStopRunning,
cameraDidStart,
cameraWasInterrupted];
}
- (instancetype)init
{
self = [super init];
if (self) {
UIView* remoteMediaView = [[UIView alloc] init];
//remoteMediaView.backgroundColor = [UIColor blueColor];
//remoteMediaView.translatesAutoresizingMaskIntoConstraints = NO;
self.remoteMediaView = remoteMediaView;
UIView* previewView = [[UIView alloc] init];
//previewView.backgroundColor = [UIColor yellowColor];
//previewView.translatesAutoresizingMaskIntoConstraints = NO;
self.previewView = previewView;
}
return self;
}
- (void)dealloc
{
[self.remoteMediaView removeFromSuperview];
self.remoteMediaView = nil;
[self.previewView removeFromSuperview];
self.previewView = nil;
self.participant = nil;
self.localMedia = nil;
self.camera = nil;
self.localVideoTrack = nil;
self.videoClient = nil;
self.room = nil;
}
RCT_EXPORT_METHOD(initialize) {
self.localMedia = [[TVILocalMedia alloc] init];
self.camera = [[TVICameraCapturer alloc] init];
NSLog(@"Camera %@", self.camera);
self.camera.delegate = self;
self.localVideoTrack = [self.localMedia addVideoTrack:YES
capturer:self.camera
constraints:[self videoConstraints]
error:nil];
self.localAudioTrack = [self.localMedia addAudioTrack:YES];
if (!self.localVideoTrack) {
NSLog(@"Failed to add video track");
} else {
// Attach view to video track for local preview
[self.localVideoTrack attach:self.previewView];
}
}
The rest of the file deals with adding and removing tracks and with connecting to and disconnecting from the Twilio room, so I have not included it. I also have TWVideoPreviewManager and TWRemotePreviewManager, which simply provide UIViews for the local and remote video streams' media objects.
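For context, both managers follow the standard RCTViewManager pattern and just hand React the UIView held by the video module. A minimal sketch of the local-preview manager (the moduleForClass: lookup is one way to reach the module instance and is shown here for illustration, not as the exact code in my project):

#import <React/RCTViewManager.h>
#import <React/RCTBridge.h>
#import "TWVideoModule.h"

@interface TWVideoPreviewManager : RCTViewManager
@end

@implementation TWVideoPreviewManager

RCT_EXPORT_MODULE();

// Expose the UIView that TWVideoModule attaches the local camera preview to.
- (UIView *)view
{
  TWVideoModule *videoModule = [self.bridge moduleForClass:[TWVideoModule class]];
  return videoModule.previewView;
}

@end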
My TwilioVideoComponent.js component is:
import React, { Component, PropTypes } from 'react'
import {
NativeModules,
NativeEventEmitter
} from 'react-native';
import {
View,
} from 'native-base';
const {TWVideoModule} = NativeModules;
class TwilioVideoComponent extends Component {
state = {};
static propTypes = {
onRoomDidConnect: PropTypes.func,
onRoomDidDisconnect: PropTypes.func,
onRoomDidFailToConnect: PropTypes.func,
onRoomParticipantDidConnect: PropTypes.func,
onRoomParticipantDidDisconnect: PropTypes.func,
onParticipantAddedVideoTrack: PropTypes.func,
onParticipantRemovedVideoTrack: PropTypes.func,
onParticipantAddedAudioTrack: PropTypes.func,
onParticipantRemovedAudioTrack: PropTypes.func,
onParticipantEnabledTrack: PropTypes.func,
onParticipantDisabledTrack: PropTypes.func,
onCameraDidStart: PropTypes.func,
onCameraWasInterrupted: PropTypes.func,
onCameraDidStopRunning: PropTypes.func,
...View.propTypes,
};
_subscriptions = [];
constructor(props) {
super(props);
this.flipCamera = this.flipCamera.bind(this);
this.startCall = this.startCall.bind(this);
this.endCall = this.endCall.bind(this);
this._eventEmitter = new NativeEventEmitter(TWVideoModule)
}
//
// Methods
/**
* Initializes camera and microphone tracks
*/
initializeVideo() {
TWVideoModule.initialize();
}
flipCamera() {
TWVideoModule.flipCamera();
}
startCall({roomName, accessToken}) {
TWVideoModule.startCallWithAccessToken(accessToken, roomName);
}
endCall() {
TWVideoModule.disconnect();
}
toggleVideo() {
TWVideoModule.toggleVideo();
}
toggleAudio() {
TWVideoModule.toggleAudio();
}
_unregisterEvents() {
this._subscriptions.forEach(e => e.remove());
this._subscriptions = []
}
_registerEvents() {
this._subscriptions = [
this._eventEmitter.addListener('roomDidConnect', (data) => {
if (this.props.onRoomDidConnect) {
this.props.onRoomDidConnect(data)
}
}),
this._eventEmitter.addListener('roomDidDisconnect', (data) => {
if (this.props.onRoomDidDisconnect) {
this.props.onRoomDidDisconnect(data)
}
}),
this._eventEmitter.addListener('roomDidFailToConnect', (data) => {
if (this.props.onRoomDidFailToConnect) {
this.props.onRoomDidFailToConnect(data)
}
}),
this._eventEmitter.addListener('roomParticipantDidConnect', (data) => {
if (this.props.onRoomParticipantDidConnect) {
this.props.onRoomParticipantDidConnect(data)
}
}),
this._eventEmitter.addListener('roomParticipantDidDisconnect', (data) => {
if (this.props.onRoomParticipantDidDisconnect) {
this.props.onRoomParticipantDidDisconnect(data)
}
}),
this._eventEmitter.addListener('participantAddedVideoTrack', (data) => {
if (this.props.onParticipantAddedVideoTrack) {
this.props.onParticipantAddedVideoTrack(data)
}
}),
this._eventEmitter.addListener('participantRemovedVideoTrack', (data) => {
if (this.props.onParticipantRemovedVideoTrack) {
this.props.onParticipantRemovedVideoTrack(data)
}
}),
this._eventEmitter.addListener('participantAddedAudioTrack', (data) => {
if (this.props.onParticipantAddedAudioTrack) {
this.props.onParticipantAddedAudioTrack(data)
}
}),
this._eventEmitter.addListener('participantRemovedAudioTrack', (data) => {
if (this.props.onParticipantRemovedAudioTrack) {
this.props.onParticipantRemovedAudioTrack(data)
}
}),
this._eventEmitter.addListener('participantEnabledTrack', (data) => {
if (this.props.onParticipantEnabledTrack) {
this.props.onParticipantEnabledTrack(data)
}
}),
this._eventEmitter.addListener('participantDisabledTrack', (data) => {
if (this.props.onParticipantDisabledTrack) {
this.props.onParticipantDisabledTrack(data)
}
}),
this._eventEmitter.addListener('cameraDidStart', (data) => {
if (this.props.onCameraDidStart) {
this.props.onCameraDidStart(data)
}
}),
this._eventEmitter.addListener('cameraWasInterrupted', (data) => {
if (this.props.onCameraWasInterrupted) {
this.props.onCameraWasInterrupted(data)
}
}),
this._eventEmitter.addListener('cameraDidStopRunning', (data) => {
if (this.props.onCameraDidStopRunning) {
this.props.onCameraDidStopRunning(data)
}
})
]
}
componentWillMount() {
// _registerEvents() already adds the cameraDidStart listener (and tracks it in
// _subscriptions for cleanup), so no second, untracked copy is added here.
this._registerEvents()
}
componentWillUnmount() {
this._unregisterEvents()
}
render() {
return this.props.children || null
}
}
export default TwilioVideoComponent;
I do not know what I need to change in Xcode (or in my module) to be compatible with the beta5 TwilioVideo API. Any help would be appreciated.