feature: support dumping jobs to disk. #11
Conversation
```js
 * @param {Object} [filter={}] The sift-compatible filter
 */
async dump(jobType, filter = {}) {
  const matches = await this._find(jobType, filter);
```
I think it'd make sense to add a `forEach` method so that we could write the jobs out without needing to have them all in memory.
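Something like this might work (a rough sketch only: `_scanBatches` is a hypothetical helper that would page parsed jobs out of Redis in chunks, and `fn` is the per-job callback):

```js
// Rough sketch. _scanBatches is hypothetical: it would page through the
// Redis-backed job store, yielding arrays of parsed jobs per iteration.
async forEach(jobType, filter, fn) {
  const matchesFilter = sift(filter); // sift compiles the query into a predicate
  for await (const batch of this._scanBatches(jobType)) {
    for (const job of batch) {
      if (matchesFilter(job)) await fn(job);
    }
  }
}
```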
Ah yeah, makes sense
Agreed, there could be a ton of jobs here - so paginating would be ideal. It might make sense to add a batch processor to handle matches as we find them here.
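As a sketch of what that could look like, assuming the `forEach` above and `fs` already required at the top of the file (the filename here is a placeholder, and this skips the cleaning step that produces `cleanedMatches` in the current version):

```js
async dump(jobType, filter = {}) {
  const filename = `${jobType}.json`; // hypothetical; use whatever the PR already derives
  const jsonWriteStream = fs.createWriteStream(filename);
  let count = 0;
  // Stream each match as it's found instead of buffering all of them,
  // writing delimiters manually so the output stays a valid JSON array.
  // (Write-stream backpressure is ignored here for brevity.)
  jsonWriteStream.write('[');
  await this.forEach(jobType, filter, (job) => {
    if (count++ > 0) jsonWriteStream.write(',');
    jsonWriteStream.write(JSON.stringify(job));
  });
  jsonWriteStream.end(']');
  await waitOn(jsonWriteStream, 'finish', { waitError: true });
  console.log(`Dumped ${count} jobs to ${filename}`);
}
```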
On that - #14
Agreed with @shils, we need to handle the case where there could be a very large number of jobs. Otherwise, the rest of this PR LGTM.
```diff
@@ -7,6 +7,9 @@ const redis = require('redis');
 const repl = require('repl');
 const sift = require('sift').default;
 const getValue = require('get-value');
+const waitOn = require('promise-callbacks').waitOn;
```
Suggested change:

```diff
-const waitOn = require('promise-callbacks').waitOn;
+const { waitOn } = require('promise-callbacks');
```
```js
  await waitOn(jsonWriteStream, 'finish', { waitError: true });
  console.log(`Dumped ${cleanedMatches.length} jobs to ${filename}`);
} catch (err) {
  console.error('Failed to write to fail', err);
```
Suggested change:

```diff
-console.error('Failed to write to fail', err);
+console.error('Failed to write to file', err);
```
Resolves: #2