Replies: 2 comments
-
Yes, see https://docs.litellm.ai/docs/providers — all of those are supported. Is anything missing in the docs?
-
I resolved this issue in Discord; closing now :-)
-
I'm asking because I wanted to connect Groq with Flowise, and there is currently no option to integrate them directly. My idea was to run a local LiteLLM proxy server and use Flowise's local AI functionality to connect to it: give the proxy my Groq API key and chat through Groq that way. That's why I wanted to ask.
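For anyone trying the same setup, here is a minimal sketch of a LiteLLM proxy config for Groq. The model name and alias below are examples, not a recommendation; check the LiteLLM providers page for current Groq model identifiers:

```yaml
# config.yaml for the LiteLLM proxy (sketch; model name is an example)
model_list:
  - model_name: groq-chat              # alias the client (e.g. Flowise) will request
    litellm_params:
      model: groq/llama3-8b-8192       # provider/model; pick a current Groq model
      api_key: os.environ/GROQ_API_KEY # read the key from the environment
```

Start the proxy with `litellm --config config.yaml` (it serves an OpenAI-compatible API, on port 4000 by default), then point Flowise's OpenAI-compatible chat node at `http://localhost:4000` using the alias above as the model name.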