
Argo for model serving #1127

Answered by agilgur5
sachinruk asked this question in Q&A
Jul 13, 2024 · 1 comment · 1 reply

For reference, this was cross-posted to the #argo-workflows Slack, where I responded that while Argo can quite easily handle batch model serving, KServe/Seldon/etc. are a better fit for API-driven real-time serving; I have used them for batch and real-time inference respectively.
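For instance, a minimal sketch of the batch side using Hera: a CronWorkflow that runs a scoring script on a schedule. The image, schedule, and script body are placeholders rather than a recommendation.

```python
from hera.workflows import CronWorkflow, Steps, script


@script(image="python:3.11")
def batch_inference():
    # Placeholder for the real work: load the model, score the latest batch
    # of data, and write the predictions out to storage.
    print("running batch inference")


with CronWorkflow(
    generate_name="batch-inference-",
    entrypoint="run",
    schedule="0 2 * * *",  # nightly at 02:00
) as cw:
    with Steps(name="run"):
        batch_inference()

# cw.create()  # would submit it to a cluster running Argo Workflows
```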

You can also use Workflows to create Deployments or InferenceServices (i.e. as part of your MLOps pipelines), but CD may suffice for that too.
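As a rough illustration, a Workflow can carry a resource template that applies a serving manifest as the last step of a pipeline. The Deployment below and all its names/images are hypothetical; a KServe InferenceService manifest could be applied the same way.

```python
from hera.workflows import Resource, Workflow

# Hypothetical Deployment manifest for a model server.
MANIFEST = """
apiVersion: apps/v1
kind: Deployment
metadata:
  name: model-server
spec:
  replicas: 1
  selector:
    matchLabels:
      app: model-server
  template:
    metadata:
      labels:
        app: model-server
    spec:
      containers:
        - name: server
          image: registry.example.com/model-server:latest  # hypothetical image
          ports:
            - containerPort: 8080
"""

with Workflow(generate_name="deploy-model-", entrypoint="deploy") as w:
    Resource(name="deploy", action="apply", manifest=MANIFEST)

# w.create()  # e.g. as the final step of an MLOps pipeline
```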

In short, there are purpose-built tool stacks for each of these use cases, although you can certainly mix some parts together.

Also, this sounds like it should've been a Discussion rather than an issue. EDIT: this has now been converted into a Discussion.

Replies: 1 comment · 1 reply

1 reply
@elliotgunton

Answer selected by elliotgunton
Category: Q&A
Labels: None yet
3 participants
Converted from issue

This discussion was converted from issue #1126 on July 15, 2024 09:02.