I'm trying to stream video at 640x480 over WebRTC from Chrome on Android. The video starts at that resolution, but then it drops to 320x240.
Here are the getUserMedia parameters that were sent:
"getUserMedia": [
{
"origin": "http://webrtc.example.com:3001",
"pid": 30062,
"rid": 15,
"video": "mandatory: {minWidth:640, maxWidth:640, minHeight:480, maxHeight:480}"
}
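For reference, the same request in the modern constraints syntax would look like the sketch below (`exact` makes getUserMedia fail outright rather than silently fall back if the camera cannot deliver 640x480):

```javascript
// Modern getUserMedia constraints equivalent to the legacy
// mandatory min/max 640x480 request above (sketch).
const constraints = {
  video: {
    width:  { exact: 640 },
    height: { exact: 480 },
  },
};

// In a browser you would then request the stream like this:
// navigator.mediaDevices.getUserMedia(constraints)
//   .then(stream => { videoElement.srcObject = stream; });
```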
My question is: why does the resolution drop? This doesn't happen when I try it from Chrome on my Mac. I'd like to adjust things so that the video resolution stays constant.
[Screenshot: video frames dumped using ffmpeg]
Best Answer
getUserMedia constraints only affect the media that the browser requests from the hardware and returns as a stream. They have no effect on what is done with that stream afterwards (i.e., when it is streamed over a connection). The downscaling you are seeing happens at the PeerConnection layer, not the getUserMedia layer. It is triggered by the WebRTC implementation when hardware and bandwidth statistics indicate poor performance, and it is negotiated by both peers.
[Hardware] <- getUserMedia -> [javascript client] <- PeerConnection -> [another client]
           <- 640x480 captured ->                 <- 320x240 sent ->
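To confirm that the capture side stays at 640x480 while the sent stream drops, you can compare the track's settings with the sender statistics. A minimal sketch (stat field names follow the W3C webrtc-stats spec; `pc` is assumed to be your RTCPeerConnection):

```javascript
// Extract the currently-sent video resolution from getStats() entries.
// Conforming browsers expose frameWidth/frameHeight on 'outbound-rtp'
// stats for video senders.
function sentResolution(statsEntries) {
  for (const s of statsEntries) {
    if (s.type === 'outbound-rtp' && s.kind === 'video' && s.frameWidth) {
      return { width: s.frameWidth, height: s.frameHeight };
    }
  }
  return null; // no video sender stats yet
}

// In a browser:
// const report = await pc.getStats();
// console.log('captured:', track.getSettings().width, 'x', track.getSettings().height);
// console.log('sent:', sentResolution([...report.values()]));
```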
You would have to dig into the source of each implementation for documentation and evidence of exactly how this is done, but here is a reference on the behavior:
The good news is that the WebRTC audio and video engines work together with the underlying network transport to probe the available bandwidth and optimize delivery of the media streams. However, DataChannel transfers require additional application logic: the application must monitor the amount of buffered data and be ready to adjust as needed.
...
WebRTC audio and video engines will dynamically adjust the bitrate of the media streams to match the conditions of the network link between the peers. The application can set and update the media constraints (e.g., video resolution, framerate, and so on), and the engines do the rest—this part is easy.
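If you want the engine to preserve resolution and sacrifice framerate instead when the link degrades, newer browsers expose `degradationPreference` in the sender's parameters. A hedged sketch (support varies, and unimplemented browsers may ignore the property):

```javascript
// Build an updated RTCRtpSendParameters object that asks the encoder
// to keep resolution and drop framerate under load (sketch).
function preferResolution(params) {
  return { ...params, degradationPreference: 'maintain-resolution' };
}

// In a browser, applied to the video sender of a peer connection `pc`:
// const sender = pc.getSenders().find(s => s.track && s.track.kind === 'video');
// await sender.setParameters(preferResolution(sender.getParameters()));
```

Setting `track.contentHint = 'detail'` on the video track nudges some implementations in the same direction.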
This question ("android - Why does the video resolution change when streaming from Android via WebRTC") originally appeared on Stack Overflow: https://stackoverflow.com/questions/30742963/