- You can adapt it to not use function calling; it's just a bit more work. You would have to write output parsers that work with your prompt.
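To illustrate the suggestion about output parsers: instead of relying on provider-side function calling, you can prompt the model to emit JSON and parse it yourself. Below is a minimal, hedged sketch in plain Python (no LangChain dependency); the function name `parse_reflexion_output` and the field names `answer`/`reflection` are made up for the example, not part of any library API.

```python
import json
import re

def parse_reflexion_output(text: str) -> dict:
    """Extract the first JSON object from raw LLM text.

    A stand-in for function-calling-based structured output: the prompt
    instructs the model to include a JSON object, and this parser pulls
    it out of whatever surrounding prose the model produces.
    """
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in model output")
    return json.loads(match.group(0))

# Example: a model without function-calling support returns prose plus JSON.
raw = 'Here is my revised answer:\n{"answer": "42", "reflection": "too terse"}'
parsed = parse_reflexion_output(raw)
```

A parser along these lines could be wrapped in a LangChain `BaseOutputParser` subclass so it slots into the reflexion graph where the function-calling parser would normally sit; error handling (e.g. retrying on malformed JSON) is left out for brevity.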
- Hi, I was wondering whether it is possible to implement LangChain's reflexion example (https://github.com/langchain-ai/langgraph/blob/main/examples/reflexion/reflexion.ipynb) with a custom LLM adapter (Predibase). I have searched the documentation but found nothing about this or how it could be done. According to the LangChain documentation (https://python.langchain.com/docs/modules/model_io/chat/function_calling/), function calling is only supported for these LLMs: OpenAI, Anthropic, Google, Cohere, FireworksAI, MistralAI, TogetherAI.
  Does anyone have an idea of how this can be done?
  ty