iOS microphone not working / voice not sent over WebRTC when answering a call from the lock screen

Tags: ios swift webrtc voip callkit

I'm making calls with WebRTC and CallKit. Everything works fine while the app is in the foreground, but if the screen is locked and I answer the call, audio only works on my side (I can hear the other party, but my voice is not sent).

Everything starts working again once the user unlocks the phone and opens the app.

All background modes and capabilities are set correctly:

<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
    <string>fetch</string>
    <string>remote-notification</string>
    <string>voip</string>
</array>

I tried configuring audio with both RTCAudioSession and AVAudioSession, but it behaves the same either way.

SOLVED:
I was adding a media stream to the RTCPeerConnection; now I add RTCMediaStreamTracks instead.
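For illustration, the change amounts to something like the following (a minimal sketch; pc, factory, and audioTrack stand in for your own peer connection, factory, and local audio track):

// Before: wrapping the track in an RTCMediaStream and adding the stream
// let stream = factory.mediaStream(withStreamId: "stream")
// stream.addAudioTrack(audioTrack)
// pc.add(stream)

// After: adding the track directly, with the stream id attached to it
pc.add(audioTrack, streamIds: ["stream"])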

Best answer

Note that I'm sharing my code as it fits my own needs; it's provided for reference, and you'll need to adapt it to your requirements.
When you receive a VoIP notification, create a new instance of your WebRTC handling class and add these two lines to it, because enabling the audio session directly from a VoIP notification fails:

RTCAudioSession.sharedInstance().useManualAudio = true
RTCAudioSession.sharedInstance().isAudioEnabled = false

With useManualAudio set to true, WebRTC will not start audio I/O until isAudioEnabled becomes true, which happens only after CallKit activates the audio session in didActivate below.

The didReceive method:
func pushRegistry(_ registry: PKPushRegistry, didReceiveIncomingPushWith payload: PKPushPayload, for type: PKPushType, completion: @escaping () -> Void) {
    let state = UIApplication.shared.applicationState

    if payload.dictionaryPayload["hangup"] == nil && state != .active {

        // I pass parameters to the WebRTC handler via a Globals singleton to create the answer according to the SDP sent in the payload.
        Globals.voipPayload = payload.dictionaryPayload as! [String: Any]

        // Enabling the audio session directly from a VoIP notification fails, so defer it to didActivate.
        RTCAudioSession.sharedInstance().useManualAudio = true
        RTCAudioSession.sharedInstance().isAudioEnabled = false

        Globals.sipGateway = SipGateway() // my WebRTC and Janus gateway handler class

        // Check the Janus gateway credentials stored in shared preferences, open the websocket
        // connection, and create the peer connection to the Janus gateway (my signaling server).
        Globals.sipGateway?.configureCredentials(true)

        initProvider() // create the CallKit provider

        self.update.remoteHandle = CXHandle(type: .generic, value: String(describing: payload.dictionaryPayload["caller_id"]!))
        Globals.callId = UUID()

        Globals.provider.reportNewIncomingCall(with: Globals.callId, update: self.update, completion: { error in
            completion() // always invoke the PushKit completion handler
        })
    }
}
    
        
func initProvider() {
    let config = CXProviderConfiguration(localizedName: "ulakBEL")
    config.iconTemplateImageData = UIImage(named: "ulakbel")!.pngData()
    config.ringtoneSound = "ringtone.caf"
    // config.includesCallsInRecents = false
    config.supportsVideo = false

    Globals.provider = CXProvider(configuration: config)
    Globals.provider.setDelegate(self, queue: nil)
    update = CXCallUpdate()
    update.hasVideo = false
    update.supportsDTMF = true
}
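For completeness, the pushRegistry callback above assumes you have registered for VoIP pushes; a minimal sketch (typically placed in application(_:didFinishLaunchingWithOptions:)) would look like:

import PushKit

let voipRegistry = PKPushRegistry(queue: DispatchQueue.main)
voipRegistry.delegate = self // the object implementing PKPushRegistryDelegate
voipRegistry.desiredPushTypes = [.voIP]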
    
Modify your didActivate and didDeactivate delegate functions as follows:
func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
    print("CallManager didActivate")
    RTCAudioSession.sharedInstance().audioSessionDidActivate(audioSession)
    RTCAudioSession.sharedInstance().isAudioEnabled = true
    // self.callDelegate?.callIsAnswered()
}

func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
    print("CallManager didDeactivate")
    RTCAudioSession.sharedInstance().audioSessionDidDeactivate(audioSession)
    RTCAudioSession.sharedInstance().isAudioEnabled = false
}
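The answer and hang-up actions themselves are not shown in the original answer; a sketch of those CXProviderDelegate methods could look like this (the Globals.sipGateway calls are placeholders for your own signaling logic):

func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
    // Start your own answer flow here, e.g. create and send the SDP answer
    // via the signaling server (placeholder: Globals.sipGateway?.answerCall()).
    action.fulfill()
}

func provider(_ provider: CXProvider, perform action: CXEndCallAction) {
    // Tear down the call on your side (placeholder: Globals.sipGateway?.hangup()).
    action.fulfill()
}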
Configure the media senders and the audio session in your WebRTC handler class:
private func createPeerConnection(webRTCCallbacks: PluginHandleWebRTCCallbacksDelegate) {
    let rtcConfig = RTCConfiguration()
    rtcConfig.iceServers = server.iceServers
    rtcConfig.bundlePolicy = RTCBundlePolicy.maxBundle
    rtcConfig.rtcpMuxPolicy = RTCRtcpMuxPolicy.require
    rtcConfig.continualGatheringPolicy = .gatherContinually
    rtcConfig.sdpSemantics = .planB

    let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
                                          optionalConstraints: ["DtlsSrtpKeyAgreement": kRTCMediaConstraintsValueTrue])

    pc = sessionFactory.peerConnection(with: rtcConfig, constraints: constraints, delegate: nil)
    self.createMediaSenders()
    self.configureAudioSession()

    if webRTCCallbacks.getJsep() != nil {
        handleRemoteJsep(webrtcCallbacks: webRTCCallbacks)
    }
}
Media senders:
private func createMediaSenders() {
    let streamId = "stream"

    // Audio
    let audioTrack = self.createAudioTrack()
    self.pc.add(audioTrack, streamIds: [streamId])

    // Video
    /* let videoTrack = self.createVideoTrack()
       self.localVideoTrack = videoTrack
       self.peerConnection.add(videoTrack, streamIds: [streamId])
       self.remoteVideoTrack = self.peerConnection.transceivers.first { $0.mediaType == .video }?.receiver.track as? RTCVideoTrack

       // Data
       if let dataChannel = createDataChannel() {
           dataChannel.delegate = self
           self.localDataChannel = dataChannel
       } */
}

private func createAudioTrack() -> RTCAudioTrack {
    let audioConstraints = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
    let audioSource = sessionFactory.audioSource(with: audioConstraints)
    let audioTrack = sessionFactory.audioTrack(with: audioSource, trackId: "audio0")
    return audioTrack
}
Audio session:
private func configureAudioSession() {
    self.rtcAudioSession.lockForConfiguration()
    do {
        try self.rtcAudioSession.setCategory(AVAudioSession.Category.playAndRecord.rawValue)
        try self.rtcAudioSession.setMode(AVAudioSession.Mode.voiceChat.rawValue)
    } catch let error {
        debugPrint("Error changing AVAudioSession category: \(error)")
    }
    self.rtcAudioSession.unlockForConfiguration()
}
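Because useManualAudio is enabled, this only sets the category and mode; audio I/O actually starts when isAudioEnabled becomes true in didActivate. If you also need a speakerphone toggle, the same locked-configuration pattern applies (a sketch, assuming the rtcAudioSession property used above):

private func setSpeaker(_ enabled: Bool) {
    rtcAudioSession.lockForConfiguration()
    do {
        try rtcAudioSession.overrideOutputAudioPort(enabled ? .speaker : .none)
    } catch {
        debugPrint("Error overriding output port: \(error)")
    }
    rtcAudioSession.unlockForConfiguration()
}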
Keep in mind that because I use callbacks and delegates, the code includes delegate and callback blocks; you can ignore them as needed.
For reference, you can also look at the example at this link.

A similar question about the iOS microphone not working / voice not being sent over WebRTC when answering a call from the lock screen can be found on Stack Overflow: https://stackoverflow.com/questions/50568090/
