Further clarify cloud gaming #126

Closed · wants to merge 2 commits
50 changes: 45 additions & 5 deletions index.html
@@ -295,9 +295,29 @@ <h4>Game streaming</h4>
<tr>
<td>N38</td>
<td>The application must be able to control the jitter buffer and rendering
- delay. This requirement is addressed by jitterBufferTarget, defined in
+ delay as well as rendering delay variation speed. NOTE: The "control the jitter
+ buffer" part of this requirement is addressed by jitterBufferTarget, defined in
[[?WebRTC-Extensions]] Section 6.</td>
</tr>
+ <tr>
+ <td>N48</td>
+ <td>The application must be able to direct video decoding to continue after a
+ frame loss, without waiting for a key frame. This helps the application recover
+ faster from lossy network conditions.</td>
+ </tr>
+ <tr>
+ <td>N49</td>
+ <td>The application must be able to generate signals that indicate to the encoder
+ the loss of encoder-decoder synchronization (decoded picture buffer state) and the
+ sequence of lost frames, using platform-agnostic protocols. This helps the application
+ recover faster from lossy network conditions.</td>
+ </tr>
+ <tr>
+ <td>N50</td>
+ <td>The application must be able to configure the RTCP feedback transmission
+ interval (e.g., for the Transport-wide RTCP Feedback Message). This helps the application
+ adapt video quality to varying network conditions and maintain consistent latency.</td>
+ </tr>
</tbody>
</table>
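For the part of N38 that the note above says is already addressed, a minimal sketch of steering the receive-side jitter buffer through jitterBufferTarget ([[?WebRTC-Extensions]] Section 6) might look like the following. The applyLatencyProfile name, the lowLatencyMode flag, and the 40 ms / 150 ms targets are illustrative assumptions only, not anything defined by this document.

```ts
// Sketch only: set a jitter-buffer target for a cloud-gaming session.
// Assumes the user agent implements RTCRtpReceiver.jitterBufferTarget
// ([[?WebRTC-Extensions]] Section 6); older TypeScript DOM typings may not
// declare the attribute yet.
function applyLatencyProfile(pc: RTCPeerConnection, lowLatencyMode: boolean): void {
  // Milliseconds of buffering the receiver should aim for; values are illustrative.
  const targetMs = lowLatencyMode ? 40 : 150;
  for (const receiver of pc.getReceivers()) {
    // The target is a hint: the user agent may clamp or ignore values it cannot honour.
    receiver.jitterBufferTarget = targetMs;
  }
}
```

Per the NOTE in N38, this attribute covers only the "control the jitter buffer" part; the speed at which the rendering delay may vary is the part the requirement still leaves open.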
<p>Experience: Microsoft's Xbox Cloud Gaming and NVIDIA's GeForce NOW are examples of this use case, with media
@@ -1010,9 +1030,10 @@ <h3>Requirements Summary</h3>
<tr id="N38">
<td>N38</td>
<td>The application must be able to control the jitter buffer and rendering
- delay. This requirement is addressed by jitterBufferTarget, defined in
- [[?WebRTC-Extensions]] Section 6.</td>
- </tr>
+ delay as well as rendering delay variation speed. NOTE: The "control the jitter
+ buffer" part of this requirement is addressed by jitterBufferTarget, defined in
+ [[?WebRTC-Extensions]] Section 6.</td>
+ </tr>
<tr id="N39">
<td>N39</td>
<td>A user-agent must be able to forward media received from a peer
@@ -1065,9 +1086,28 @@ <h3>Requirements Summary</h3>
<td>The WebRTC connection can generate signals indicating demands
for keyframes, and surface those to the application.</td>
</tr>
<tr id="N48">
<td>N48</td>
<td>The application must be able to control video decoding to continue even
after a frame-loss without waiting for a key frame. This helps the application recover
faster from lossy network conditions.</td>
</tr>
<tr id="N49">
<td>N49</td>
<td>The application must be able to generate signals that indicate to the encoder
the loss of encoder-decoder synchronicity (DPB buffers) and the sequence
of frame loss using the platform-agnostic protocols. This helps the application recover
faster from lossy network conditions.</td>
</tr>
<tr id="N50">
<td>N50</td>
<td>The application must be able to configure RTCP feedback transmission
interval (e.g., Transport-wide RTCP Feedback Message). This helps the application adapt
the video quality to the varying network and maintain consistent latency.</td>
</tr>
</tbody>
</table>
<p class="note">Requirements N40-N47 have unresolved comments from a Call for Consensus (CfC).</p>
<p class="note">Requirements N40-N50 have unresolved comments from a Call for Consensus (CfC).</p>
</section>
</body>
</html>
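The Requirements Summary already notes that keyframe-demand signals can be surfaced to the application. As a point of reference for the N48/N49 discussion, here is a minimal sketch of the closest mechanism available today: a receiver-side worker asking the sender for a key frame via sendKeyFrameRequest() from the WebRTC Encoded Transform API. It assumes the page has attached an RTCRtpScriptTransform bound to this worker on the video receiver, and the "decode-stalled" message is a hypothetical application-level trigger; nothing here is the design this pull request proposes.

```ts
// worker.ts — sketch only. Assumes the main thread did:
//   receiver.transform = new RTCRtpScriptTransform(worker);
// sendKeyFrameRequest() comes from the WebRTC Encoded Transform API and may be
// unsupported or a no-op in some user agents.
(self as any).onrtctransform = (event: any) => {
  const transformer = event.transformer; // RTCRtpScriptTransformer

  // Identity transform: frames pass through untouched. The worker only needs
  // the transformer handle so it can issue key-frame requests.
  transformer.readable.pipeThrough(new TransformStream()).pipeTo(transformer.writable);

  // Hypothetical application signal: the page posts { type: "decode-stalled" }
  // when it believes the decoder has lost sync after frame loss.
  self.onmessage = (msg: MessageEvent) => {
    if (msg.data?.type === "decode-stalled") {
      transformer
        .sendKeyFrameRequest() // asks the sender for a key frame (surfaces as PLI/FIR)
        .catch(() => { /* request not possible for this receiver */ });
    }
  };
};
```

N48 and N49 ask for something finer-grained than this fallback: continuing to decode across the gap, and telling the encoder which reference state was lost, rather than always resynchronizing with a full key frame.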