[FEATURE] Deepseek R1 as a CrewAI assistant #2004
Comments
You can use the DeepSeek R1 model. In my country, I can't purchase a DeepSeek API key, so I am using it through OpenRouter. If you want to use DeepSeek-R1-Distill-Llama-70b hosted by Groq, you can do that too; check out LiteLLM and LiteLLM Groq. And if you can purchase a DeepSeek API key, you can use it directly without OpenRouter; check out LiteLLM DeepSeek. Hope this helps.
Yes, but the crewai requirements (0.100.0) don't allow installing the litellm version that adds support for DeepSeek: https://docs.litellm.ai/release_notes/v1.59.8-stable Any plan to upgrade this package?
@chiora93 For the time being, I would suggest using DeepSeek through OpenRouter. I have tried that and it's working for me.
As of the latest version of crewAI, liteLLM has been updated to v1.59.8, as @chiora93 mentioned.
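The routing advice in the comments above (OpenRouter, Groq, or the DeepSeek API directly, all via LiteLLM model strings) can be sketched as follows. This is a minimal, hypothetical sketch: the exact model identifiers and environment-variable names are assumptions, so verify them against each provider's catalog and the LiteLLM docs before relying on them.

```python
import os

# Assumed LiteLLM-style model strings for reaching DeepSeek R1 through
# different providers (verify against each provider's current catalog):
DEEPSEEK_R1_ROUTES = {
    # Via OpenRouter, useful where the DeepSeek API is unavailable
    "openrouter": "openrouter/deepseek/deepseek-r1",
    # The distilled Llama-70B variant hosted by Groq
    "groq": "groq/deepseek-r1-distill-llama-70b",
    # The DeepSeek API directly, if you can obtain a key
    "deepseek": "deepseek/deepseek-reasoner",
}


def build_llm_kwargs(provider: str) -> dict:
    """Return keyword arguments for constructing a crewai LLM for the
    chosen provider, reading the API key from an environment variable
    such as OPENROUTER_API_KEY (naming convention assumed here)."""
    env_var = f"{provider.upper()}_API_KEY"
    return {
        "model": DEEPSEEK_R1_ROUTES[provider],
        "api_key": os.environ.get(env_var, ""),
    }


print(build_llm_kwargs("openrouter")["model"])
```

With crewAI installed and the relevant key exported, the resulting kwargs could then be passed to an agent, e.g. `llm = LLM(**build_llm_kwargs("openrouter"))` and `Agent(..., llm=llm)`.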
Feature Area
Core functionality
Is your feature request related to an existing bug? Please link it here.
Can a CrewAI assistant be powered by the Deepseek R1 model? And can Deepseek R1 be supported without any issues for agents?
Describe the solution you'd like
I want to be able to switch to Deepseek R1 as a CrewAI assistant, and also to use it as the LLM for any agent.
Describe alternatives you've considered
It would probably be better if the LLM were DeepSeek-R1-Distill-Llama-70b hosted by Groq.
Additional context
No response
Willingness to Contribute
I can test the feature once it's implemented