Can we assign a task to a group of workers? #29
Comments
So, you want to dynamically resolve a worker name/type (rather than statically)? If so, this is something that could be achieved by the FTP mechanism: a "supplemental" web application which handles a request will make a decision (when writing up a trigger file) about what worker a build has to be run on ...
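A minimal sketch of that idea, not Sparky's actual trigger-file format: a supplemental service picks a worker and records the routing decision in a file that something on the build side would later consume. The file name, payload keys, and worker names below are placeholders for illustration only.

```raku
# Hypothetical illustration only: pick a worker and write the routing
# decision to a trigger file. Names and layout are made up.
my @workers = <worker-1 worker-2 worker-3>;
my $worker  = @workers.pick;                     # naive selection policy

my %trigger =
    worker  => $worker,
    project => 'my-app',
    queued  => DateTime.now.Str;

'my-app.trigger'.IO.spurt(%trigger.raku);        # serialise as Raku source
say "build for my-app routed to $worker";
```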
Not dynamically, more dumbly, like a round robin. But the question is more: is there a feature in Sparrow6 to run a job:
It's possible to run a sparrowdo scenario on a group of hosts by using a Sparky backend with the hosts file syntax. It's even possible to have tag semantics allowing sub-grouping within groups and assigning named attributes to hosts (à la Chef node attributes) ... We gradually uncover all the Sparky/Sparrowdo/Sparrow treasures through these tickets, ha-ha 😄
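Purely as an illustration of the "groups + tags + per-host attributes" idea (the real Sparrowdo/Sparky hosts-file syntax is documented in their READMEs and may differ from this layout), such a description could look roughly like a list of host entries:

```raku
# Illustrative sketch, not the documented hosts-file format: hosts carrying
# tags (for grouping) and named attributes (à la Chef node attributes).
# Host names and attribute keys are placeholders.
[
    %( host => 'node1.example.com', tags => 'big_nodes,linux',   ram_gb => 64  ),
    %( host => 'node2.example.com', tags => 'big_nodes,linux',   ram_gb => 128 ),
    %( host => 'node3.example.com', tags => 'small_nodes,linux', ram_gb => 8   ),
]
```

Filtering such a list by a tag like `big_nodes` is then just ordinary list manipulation on the consumer side.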
Ahah yes! This is great
It's of course not a pure Sparky solution, but it still does, more or less, asynchronous deployment to groups of hosts. It's quite simple right now, not even a round robin, just a Raku ...
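A rough sketch of that plain-Raku approach, assuming sparrowdo is on PATH and that a per-host invocation fits your setup (host names and flags here are placeholders, not taken from the actual solution mentioned above): each host gets its own sparrowdo run wrapped in a Promise so the group deploys concurrently.

```raku
# The shape of "deploy to a group of hosts asynchronously with plain Raku".
my @hosts = <node1.example.com node2.example.com node3.example.com>;

my @deploys = @hosts.map: -> $host {
    start {
        # shell out to sparrowdo for this host; adjust flags to your setup
        my $proc = run 'sparrowdo', "--host=$host";
        %( host => $host, ok => $proc.exitcode == 0 );
    }
};

.say for await @deploys;   # e.g. {host => node1.example.com, ok => True}
```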
Now, with the Sparky Job API in cluster mode, you can run certain tasks on certain workers; please see https://github.com/melezhik/sparky#cluster-jobs
Looks good to me 😃
More on the Sparrow side, but can we assign a project to a group of workers, let's say "big nodes"... and some workers defined elsewhere as being "big nodes" can handle the build?
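To pin down the "big nodes" idea in plain Raku (this is not Sparky's API; see the cluster-jobs link above for the real mechanism): keep a registry of workers with tags, filter by the group a project asks for, and hand out builds round robin within that group. Worker names, tags, and the dispatch policy are all hypothetical.

```raku
# Hypothetical sketch: none of these names come from Sparky/Sparrowdo.
my @workers =
    %( name => 'w1', tags => <big_nodes linux> ),
    %( name => 'w2', tags => <big_nodes linux> ),
    %( name => 'w3', tags => <small_nodes> );

sub dispatcher-for (Str $group) {
    my @pool = @workers.grep({ $group (elem) $_<tags> });
    die "no workers in group $group" unless @pool;
    my $i = 0;
    return sub { @pool[$i++ % @pool]<name> }   # round robin within the group
}

my &next-big-node = dispatcher-for('big_nodes');
say next-big-node() for ^4;                    # w1 w2 w1 w2
```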