In the recent editors' call, @aboba mentioned receiving questions on using MediaStreamTrackProcessor with WebCodecs. We have some code demonstrating this in the explainer. Maybe it's just a matter of sharing that more broadly? Or is there some aspect of the integration we need to highlight?
The explainer is written as pseudocode, so I've made a similar demo here that folks can actually run in Chrome Canary (92).

The code looks like this:
```js
// Set up a MediaStreamTrackProcessor to grab every VideoFrame from the camera.
let stream = await navigator.mediaDevices.getUserMedia({
  audio: false,
  video: true
});
let trackProcessor = new MediaStreamTrackProcessor(
  stream.getVideoTracks()[0]
);

// Log encoder outputs. A real app would do something like send them to a
// remote peer (see the sketch below).
function handleChunk(chunk, metadata) {
  console.log("handleChunk( timestamp=" + chunk.timestamp + " )");
}

// Set up the VideoEncoder.
let videoEncoder = new VideoEncoder({
  output: handleChunk,
  error: e => {
    console.log(e.message);
  }
});

// Read frames and encode()!
// The encoder is configured lazily, using the dimensions of the first frame.
const reader = trackProcessor.readable.getReader();
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  if (videoEncoder.state === "unconfigured") {
    videoEncoder.configure({
      codec: "vp8",
      width: value.cropWidth,
      height: value.cropHeight
    });
  }
  videoEncoder.encode(value);
  value.close();
}
```
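For the "send to a remote peer" part that `handleChunk` stubs out, here's a minimal sketch of one way it could look. `EncodedVideoChunk` exposes `copyTo()`, so the encoded bytes can be copied into an `ArrayBuffer` and pushed over whatever transport you have; the WebSocket endpoint and the ad-hoc header format below are illustrative assumptions, not part of the demo above.

```js
// Sketch only: the endpoint and wire format are made up for illustration.
const ws = new WebSocket("wss://example.invalid/video"); // hypothetical transport

function handleChunk(chunk, metadata) {
  // On the first output, metadata.decoderConfig (when present) should also
  // be forwarded so the receiver can configure its VideoDecoder.
  if (ws.readyState !== WebSocket.OPEN) return; // drop chunks until connected

  // Copy the encoded bytes out of the EncodedVideoChunk.
  const data = new ArrayBuffer(chunk.byteLength);
  chunk.copyTo(data);

  // Ship a small header first so the receiver can rebuild the chunk.
  // A real app would use a proper container or protocol instead.
  ws.send(JSON.stringify({
    type: chunk.type,            // "key" or "delta"
    timestamp: chunk.timestamp   // microseconds
  }));
  ws.send(data);
}
```

One other thing worth noting: `encode()` queues work internally, so a long-running capture loop like the one above would usually check `videoEncoder.encodeQueueSize` and drop (close) frames when the encoder can't keep up, rather than queueing unboundedly.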