Hello,
let me start off by saying I am a big fan of this project. I discovered motion when, a week before going on a month-long trip, I bought a webcam to monitor our indoor pet visiting the front window every now and again. It would upload pics and videos to GDrive, and it worked like a charm right off the bat.
Nowadays I am struggling to plug two webcams into the same USB bus due to what I understand to be a bandwidth constraint. I then experimented with other software and found that mjpg-streamer does not suffer from the same restriction. I managed to capture the two cameras using two distinct mjpg_streamer processes that hook into the v4l2 UVC API, stream them through their HTTP output module, and have them captured by motion, which dramatically reduces the CPU usage. It works, but it is all very cumbersome to manage, so I am curious whether this capture method could be folded into motion itself. For one thing, mjpg_streamer no longer seems to be actively maintained, and I believe this project and the community would benefit in the long run.
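For reference, this is roughly the setup I am running now. It is only a sketch: the device paths, ports, resolution, and frame rate are placeholders for my own values, and the per-camera config file names are just how I happened to organize things.

```shell
# One mjpg_streamer instance per camera. Each one grabs frames from the
# UVC device via input_uvc.so and re-serves them over HTTP via
# output_http.so on its own port.
mjpg_streamer -i "input_uvc.so -d /dev/video0 -r 640x480 -f 15" \
              -o "output_http.so -p 8081" &
mjpg_streamer -i "input_uvc.so -d /dev/video1 -r 640x480 -f 15" \
              -o "output_http.so -p 8082" &

# motion then consumes the HTTP streams instead of opening the V4L2
# devices directly. In each camera's config file:
#
# camera1.conf:
#   netcam_url http://localhost:8081/?action=stream
# camera2.conf:
#   netcam_url http://localhost:8082/?action=stream
```

My understanding (possibly wrong) is that this helps because the camera delivers MJPEG rather than uncompressed frames, which is what relieves both the USB bandwidth pressure and the CPU load.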
Any additional thoughts, or am I missing a configuration option I'm not aware of?
Thank you