Section 3.2: Feedback related to WebRTC-NV Low Latency Streaming Use Case #103
AFAICT, the existing examples referenced in this use case are using DataChannels to re-distribute media initially received over non-WebRTC transports (DASH etc., necessarily "high" latency) - at least I couldn't find any references online to anything else. They show there's demand in the market for such P2P media redistribution, at least. Currently it doesn't seem possible without copying the data out of WebRTC and rolling your own jitter buffering, decoding, BWE, etc.: given an encoded frame coming from an RTCRtpReceiver encoded transform's readable, there's no way to send it to multiple other peers over encoded transforms, because RTCEncoded{Audio|Video}Frame lacks a constructor - likely this should be part of N39? Does anyone have contacts at the cited industry examples, e.g. Peer5, who we could pull in to clarify what's currently possible and what they'd like to be able to do but can't?
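To illustrate the fan-out limitation described above, here is a hypothetical worker-side sketch assuming the WebRTC Encoded Transform API (`onrtctransform` / `RTCRtpScriptTransform`); the commented-out construction step is exactly what is missing today, since RTCEncodedVideoFrame has no constructor:

```javascript
// Worker-side sketch (hypothetical): read encoded frames from an
// RTCRtpReceiver's encoded transform and try to fan them out to
// other peers.
onrtctransform = (event) => {
  const { readable, writable } = event.transformer;
  const writer = writable.getWriter();
  const reader = readable.getReader();

  (async () => {
    for (;;) {
      const { value: frame, done } = await reader.read();
      if (done) break;

      // We CAN copy the payload out of the RTCEncodedVideoFrame...
      const payload = frame.data.slice(0);

      // ...but there is no way to wrap it back up for other peers:
      // RTCEncodedVideoFrame has no constructor, so something like
      //   otherPeerWriter.write(new RTCEncodedVideoFrame(payload, meta));
      // is not possible today. The frame can only be written back
      // into the pipeline it arrived on:
      await writer.write(frame);
    }
  })();
};
```

The `otherPeerWriter` name above is purely illustrative; redistribution today instead requires copying `payload` out (e.g. over an RTCDataChannel) and rebuilding the media pipeline by hand on the receiving side.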
@tonyherre RTCDataChannel has been the primary vehicle for P2P distribution because the content has been containerized media, which is rendered via MSE. Shachar Zohar of Peer5 is in my group, so we can invite him to speak about current limitations. I believe these include the need for a worker-based pipeline (Chromium supports MSEv2 in workers but not RTCDataChannel in workers; Safari has the opposite situation). Shachar may also have comments about RTCDataChannel buffering and congestion control.
I think there is value in specifying use cases that have been met - especially if they use a combination of APIs that aren't otherwise combined. Perhaps this isn't the right document for them, but I do think they should be documented somewhere.
In the IETF, there is a distinction between a requirements document and an applicability statement. An applicability statement delineates the appropriate use of a technology. The goal of a requirements document, however, is to determine the requirements - and just because a use case isn't listed doesn't mean that it isn't "sanctioned".
This issue was mentioned in WEBRTCWG-2023-05-16 (Page 14)
Requirement N37 (codec performance and copies) seems to mix two very distinct problems. In both the WEBRTC WG and the MEDIA WG we have discussed issues relating to hardware acceleration (error handling in particular). But the requirement is not specific about what is needed, such as the ability to surface the cause of hardware-related errors (e.g. an error in the data provided to the hardware decoder versus resource exhaustion). That is something that has surfaced which the current API does not address.

As discussed at the June interim today, VideoFrame copying issues (such as color correction or conversion to WebGPU external textures) are important, but they are being worked on by the MEDIA WG and the WebGPU WG, so they aren't owned by the WebRTC WG. So there is a question about whether this is a requirement for this WG and, if so, whether we can be more specific (e.g. issues with VideoFrame ownership in transferable streams). My proposal would be to split requirement N37 into more specific requirements: one relating to hardware-acceleration errors, and another relating to efficient transfer of media to workers (relevant to transferable MediaStreamTracks as well as transferable streams of VideoFrames).

With respect to N38, I believe that for uncontainerized media it is addressed by jitterBufferTarget. I'm not sure that's a reason to remove it, though.

With respect to N39, I think we need to consider containerized and non-containerized media separately. For containerized media, I believe that N38 and N39 are addressed by RTCDataChannel in workers combined with the Managed Media Source API in workers (which addresses MSE power consumption). However, I'm not clear whether these APIs work well together. For non-containerized media, it isn't clear to me that requirement N39 is met by Encoded Transform, because there can be multiple
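For reference, the `jitterBufferTarget` mentioned above for N38 is a per-receiver latency hint. A minimal sketch, assuming an already-connected RTCPeerConnection `pc` (browser-only API):

```javascript
// Minimal sketch: ask each receiver to hold a larger jitter buffer,
// trading added latency for smoother playout. This is a hint; the
// user agent may clamp or ignore the requested value.
for (const receiver of pc.getReceivers()) {
  receiver.jitterBufferTarget = 500; // target, in milliseconds
}
```

Because it is a hint rather than a guarantee, applications that need a hard latency bound still have to measure actual playout delay themselves.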
Add the constraints to Cloud Gaming application that allow users to use the application without accessing any user media resources. Fixes w3c#103
Add the constraints to Cloud Gaming application that allow users to use the application without accessing any user media resources. Partial fix for w3c#103
Partial fixes for w3c#103
This issue was discussed in WebRTC July 2023 meeting – (Issue #103: Feedback related to WebRTC-NV Low Latency Streaming Use Case)
Discussed at TPAC 2023. One resolution was that N38 is satisfied; discussion on specific performance requirements is proceeding (see PR 118).
This issue was mentioned in WEBRTCWG-2023-09-12 (Page 27)
This issue was mentioned in WebRTC December 5 2023 meeting – 05 December 2023 (Game Streaming) |
Partial fixes for w3c#103. Updated requirements per the feedback from the First December WebRTC WG Virtual Interim (2023-12-05)
The various use cases presented in this section are already deployed, so I wonder whether they qualify as NV.
My understanding is that use cases should identify shortcomings (via new requirements) in the current WebRTC support.
It is difficult to assess whether these use cases identify such shortcomings.
The term "low latency" in particular is vague, even in the context of WebRTC.
Are we trying to go to ultra-low latency, where waiting for the proxy to assemble a full video frame from RTP packets is not good enough?
Looking at some of the requirements, I am unclear how we will be able to leverage them.