After updating to the latest WebRTC framework, I don't know how to display the local stream to the user, because the API has changed and there is no example in the repository's "iOS" folder.
In the old code...
RTCVideoCapturer *capturer = [RTCVideoCapturer capturerWithDeviceName:cameraID];
RTCMediaConstraints *mediaConstraints = [self defaultMediaStreamConstraints];
RTCVideoSource *videoSource = [_factory videoSourceWithCapturer:capturer constraints:mediaConstraints];
localVideoTrack = [_factory videoTrackWithID:@"ARDAMSv0" source:videoSource];
Here the RTCVideoCapturer object and the RTCVideoSource object are linked to each other.
But in the new code...
RTCVideoSource *source = [_factory videoSource];
RTCCameraVideoCapturer *capturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:source];
[_delegate appClient:self didCreateLocalCapturer:capturer];
localVideoTrack = [_factory videoTrackWithSource:source
trackId:kARDVideoTrackId];
...they are not linked to each other. So what does the delegate method `[_delegate appClient:self didCreateLocalCapturer:capturer];` actually do? I don't understand it. [Need help!]
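For context, the two objects are still linked in the new API, just implicitly: RTCVideoSource conforms to RTCVideoCapturerDelegate, so the capturer created with initWithDelegate:source pushes every captured frame into the source. Nothing is captured, however, until someone starts the capturer. A minimal sketch of starting capture manually (the device/format selection here is illustrative; in AppRTCMobile the ARDCaptureController does this for you):

```objc
// The source *is* the capturer's delegate, so frames flow
// capturer -> source -> video track once capture starts.
AVCaptureDevice *device = [RTCCameraVideoCapturer captureDevices].firstObject;
AVCaptureDeviceFormat *format =
    [RTCCameraVideoCapturer supportedFormatsForDevice:device].firstObject;
[capturer startCaptureWithDevice:device format:format fps:30];
```

This is what the delegate call hands off: the app client creates the capturer, and the view-controller side is given the chance to drive it.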
Best answer
Implement this delegate method in your video call view controller....
- (void)appClient:(ARDAppClient *)client didCreateLocalCapturer:(RTCCameraVideoCapturer *)localCapturer {
  NSLog(@"%s %@", __PRETTY_FUNCTION__, localCapturer);
  _captureController = [[ARDCaptureController alloc] initWithCapturer:localCapturer
                                                             settings:[[ARDSettingsModel alloc] init]];
  [_captureController startCapture];
}
Then.... this method is called to create the track...
- (RTCVideoTrack *)createLocalVideoTrack {
  RTCVideoTrack *localVideoTrack = nil;
  // The iOS simulator doesn't provide any sort of camera capture,
  // so don't bother trying to open a local stream.
#if !TARGET_IPHONE_SIMULATOR
  if (![_settings currentAudioOnlySettingFromStore]) {
    RTCVideoSource *source = [_factory videoSource];
    RTCCameraVideoCapturer *capturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:source];
    [_delegate appClient:self didCreateLocalCapturer:capturer];
    localVideoTrack = [_factory videoTrackWithSource:source
                                             trackId:kARDVideoTrackId];
    [_delegate appClient:self didReceiveLocalVideoTrack:localVideoTrack];
  }
#endif
  return localVideoTrack;
}
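Since the original question was how to *display* the local stream: once didReceiveLocalVideoTrack: fires, attach the track to a video view. A minimal sketch, assuming a _localVideoView ivar of type RTCEAGLVideoView that is already in your view hierarchy (the ivar name is illustrative):

```objc
- (void)appClient:(ARDAppClient *)client
    didReceiveLocalVideoTrack:(RTCVideoTrack *)localVideoTrack {
  // RTCEAGLVideoView conforms to RTCVideoRenderer, so the track
  // pushes its captured frames straight into the view.
  [localVideoTrack addRenderer:_localVideoView];
}
```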
Then call...
_localVideoTrack = [self createLocalVideoTrack];
...in your initialization method......
- (void)initCall {
NSLog(@"%s",__PRETTY_FUNCTION__);
if (!_isTurnComplete) {
return;
}
self.state = kARDAppClientStateConnected;
_localVideoTrack = [self createLocalVideoTrack];
// Create peer connection.
_constraints = [self defaultPeerConnectionConstraints];
}
This code made it work for me!
Original question: ios - How to display the local stream in a video view with the latest WebRTC framework <Anakros/WebRTC> (iOS)? A similar question was found on Stack Overflow: https://stackoverflow.com/questions/47534911/