
Add picture timing requirement to Ultra Low Latency Broadcast with Fanout use case #130

Open
murillo128 opened this issue Mar 5, 2024 · 0 comments

In traditional live streaming it is commonplace to send per-frame picture timing information, typically using the H.264 picture timing SEI messages (ITU-T Rec. H.264, Annex D.2.3).

This timing information is later used to synchronize out-of-band events like ad insertion using SCTE 35 splice events (SCTE 35 2023r1, section 6.3). Per-frame picture timing is also widely used in video editing tools.

Implementation/spec-wise, we can use the abs-capture-time header extension, which is exposed via the captureTimestamp attribute of RTCRtpContributingSource in the WebRTC Extensions spec.
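As a sketch of how a receiver could consume this today, assuming the captureTimestamp field from the WebRTC Extensions spec is populated when abs-capture-time is negotiated (the helper function name is illustrative, not part of any spec):

```javascript
// Hypothetical helper: pick the most recently reported capture timestamp
// from the array shape returned by RTCRtpReceiver.getSynchronizationSources().
// In the WebRTC Extensions spec each entry may carry `captureTimestamp`
// (milliseconds on the capture system's clock) when the abs-capture-time
// header extension has been negotiated.
function latestCaptureTimestamp(sources) {
  let latest = null;
  for (const s of sources) {
    if (s.captureTimestamp === undefined) continue; // extension not negotiated
    if (latest === null || s.timestamp > latest.timestamp) {
      latest = s;
    }
  }
  return latest ? latest.captureTimestamp : null;
}

// In a browser this would be driven roughly like (illustrative):
// const [receiver] = pc.getReceivers();
// const t = latestCaptureTimestamp(receiver.getSynchronizationSources());
```

Note that this is per-source, not per-frame: the value reflects the last RTP packet processed, which is exactly the gap this issue is about.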

Retrieving the picture timing information of the rendered frame is quite clumsy at the moment, as it requires interpolating values using the RTP timestamp of the frame, which is the only timing value exposed by the APIs that surface frames.
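The interpolation applications have to do today can be sketched as follows. This assumes a 90 kHz video RTP clock, no 32-bit timestamp wraparound, and two known (RTP timestamp, capture time) reference points; the function name and parameter shapes are illustrative:

```javascript
// Map a frame's RTP timestamp to an absolute capture time by linearly
// interpolating between two reference points observed from packets that
// carried the abs-capture-time extension.
// `a` and `b` are { rtpTs, captureMs } pairs with a.rtpTs < b.rtpTs.
function interpolateCaptureTime(frameRtpTs, a, b) {
  const rtpSpan = b.rtpTs - a.rtpTs;
  if (rtpSpan === 0) return a.captureMs; // degenerate: identical anchors
  const fraction = (frameRtpTs - a.rtpTs) / rtpSpan;
  return a.captureMs + fraction * (b.captureMs - a.captureMs);
}

// e.g. halfway between anchors one second apart on a 90 kHz clock:
// interpolateCaptureTime(90000, { rtpTs: 0, captureMs: 0 },
//                               { rtpTs: 180000, captureMs: 2000 })
```

Exposing the capture timestamp directly on the frame would remove the need for this bookkeeping entirely.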

Attempts to add the capture timestamp values are being made in several different APIs.
