
Piping writeable to readable #43

Open

asaphaaning opened this issue Mar 25, 2019 · 3 comments


asaphaaning commented Mar 25, 2019

I want to read from my stream in chunks of 1933 bytes. As I understand it, a ReadableStreamBuffer can be constructed to emit data once its internal buffer reaches a given chunk size; however, my ffmpeg call outputs its processed data into a writable stream, not a readable stream.

I tried piping the writable stream into the readable stream, both instances of the node-stream-buffer implementation, but that didn't work out.

Is this something that is possible?
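For reference, here's roughly how I expect the chunking side to behave (a minimal sketch against the stream-buffers API; the 1933-byte chunk size is the one mentioned above, and the data being put() here is just dummy bytes standing in for ffmpeg output):

var streamBuffers = require('stream-buffers');

// Emits a 'data' event roughly every `frequency` ms, as long as the internal
// buffer holds at least `chunkSize` bytes to hand out.
var readable = new streamBuffers.ReadableStreamBuffer({
	frequency: 10,
	chunkSize: 1933
});

readable.on('data', (chunk) => {
	console.log('got', chunk.length, 'bytes');
});

// Dummy data just to exercise the chunking; in my case this would be ffmpeg output.
readable.put(Buffer.alloc(1933 * 2));
readable.stop(); // flush whatever is left and end the stream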

samcday (Owner) commented Mar 28, 2019

Hey @asaphaaning, what you're describing should be possible. Can you share some code showing what you're trying to achieve and explain where it's not working?

asaphaaning (Author)

Hey @samcday, thanks for answering. I have trouble getting ffmpeg to emit any output events if I specify an output stream myself, but if I let it create a PassThrough stream by simply calling let stream = cmd.stream(), I can successfully listen for events on the stream variable.

That aside, here's what I wanted to achieve:

var ffmpeg = require('fluent-ffmpeg');
var command = ffmpeg();
var streamBuffers = require('stream-buffers');

const width = 24;
const height = 12;
const colors = 3;

var myWritableStreamBuffer = new streamBuffers.WritableStreamBuffer({
	initialSize: width * height * colors * 3, // triple buffering
	incrementAmount: width * height * colors * 3
});

var myReadableStreamBuffer = new streamBuffers.ReadableStreamBuffer({
	frequency: 40, // 25 FPS
	chunkSize: width * height * colors + 13 // 1 full frame + 13 bytes of PPM header metadata
});

let cmd = ffmpeg('/home/haaning/Videos/input.mkv')
	.size(`${width}x${height}`)
	.videoCodec('ppm')
	.outputFormat('image2pipe');

cmd.stream(myWritableStreamBuffer, { end: true });

myWritableStreamBuffer.pipe(myReadableStreamBuffer, { end: true });

myWritableStreamBuffer.on('data', (chunk) => {
	console.log(chunk);
});

myReadableStreamBuffer.on('data', (chunk) => {
	console.log(chunk);
});

This produces an error:

Error: Cannot pipe, not readable

Now, if I try to grab my PassThrough stream like I mentioned at the beginning and pipe that into a ReadableStreamBuffer, I get the error dest.write is not a function:

var ffmpeg = require('fluent-ffmpeg');
var command = ffmpeg();
var streamBuffers = require('stream-buffers');

const width = 24;
const height = 12;
const colors = 3;

var myWritableStreamBuffer = new streamBuffers.WritableStreamBuffer({
	initialSize: width * height * colors * 3, // triple buffering
	incrementAmount: width * height * colors * 3
});

var myReadableStreamBuffer = new streamBuffers.ReadableStreamBuffer({
	frequency: 40, // 25 FPS
	chunkSize: width * height * colors + 13 // 1 full frame + 13 bytes of PPM header metadata
});

let cmd = ffmpeg('/home/haaning/Videos/input.mkv')
	.size(`${width}x${height}`)
	.videoCodec('ppm')
	.outputFormat('image2pipe');

let stream = cmd.stream();

stream.pipe(myReadableStreamBuffer, { end: true });

myReadableStreamBuffer.on('data', (chunk) => {
	console.log(chunk);
});
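
My guess is that both errors come down to pipe direction: pipe() always goes from a readable source into a writable destination. A WritableStreamBuffer has no readable side, so it can't be piped out of ("Cannot pipe, not readable"), and a ReadableStreamBuffer has no write() method, so it can't be piped into ("dest.write is not a function"). A sketch of the direction I think should work, reusing the stream and buffer variables from the snippet above (piping into the writable buffer instead of the readable one) and reading frames back out with getContents() once ffmpeg is done:

// Readable (the PassThrough from ffmpeg) piped into the writable buffer,
// instead of into the ReadableStreamBuffer.
stream.pipe(myWritableStreamBuffer, { end: true });

stream.on('end', () => {
	// getContents(length) pulls up to `length` bytes back out of the writable
	// buffer, or returns false once it's empty.
	let frame;
	while ((frame = myWritableStreamBuffer.getContents(width * height * colors + 13)) !== false) {
		console.log(frame.length);
	}
});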

asaphaaning (Author)

I suppose something like this would work, though. Is this an anti-pattern?

var ffmpeg = require('fluent-ffmpeg');
var command = ffmpeg();
var streamBuffers = require('stream-buffers');

const width = 24;
const height = 12;
const colors = 3;

var myWritableStreamBuffer = new streamBuffers.WritableStreamBuffer({
	initialSize: width * height * colors * 3, // triple buffering
	incrementAmount: width * height * colors * 3
});

var myReadableStreamBuffer = new streamBuffers.ReadableStreamBuffer({
	frequency: 40, // 25 FPS
	chunkSize: width * height * colors + 13 // 1 full frame + 13 bytes of PPM header metadata
});

let cmd = ffmpeg('/home/haaning/Videos/input.mkv')
	.size(`${width}x${height}`)
	.videoCodec('ppm')
	.outputFormat('image2pipe');

let passthrough = cmd.stream();

passthrough.on('data', (chunk) => {
	myReadableStreamBuffer.put(chunk);
});

myReadableStreamBuffer.on('data', (chunk) => {
	console.log(chunk);
});
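
One thing I think this is still missing is ending the readable buffer when ffmpeg finishes; otherwise the ReadableStreamBuffer keeps its internal timer running. A small addition, assuming the same variable names as above:

// When ffmpeg's PassThrough ends, tell the readable buffer to flush whatever
// is left in its internal buffer and then end itself.
passthrough.on('end', () => {
	myReadableStreamBuffer.stop();
});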
