How to access other HF Inference tasks like summarization, textToImage ... #2070
-
Answer generated by a 🤖

I understand that you're trying to use the "facebook/bart-large-cnn" model from HuggingFace with LangChain, but you're encountering an issue because the `_call` method of `HuggingFaceInference` always invokes `hf.textGeneration`, which doesn't fit a summarization model. To address this, you could modify `_call` along these lines:

```typescript
async _call(
  prompt: string,
  options: this["ParsedCallOptions"]
): Promise<string> {
  const { HfInference } = await HuggingFaceInference.imports();
  const hf = new HfInference(this.apiKey);
  const res = await this.caller.callWithOptions(
    { signal: options.signal },
    hf.textSummarization.bind(hf), // Change this line
    {
      model: this.model,
      parameters: {
        // make it behave similar to openai, returning only the generated text
        return_full_text: false,
        temperature: this.temperature,
        max_new_tokens: this.maxTokens,
        top_p: this.topP,
        top_k: this.topK,
        repetition_penalty: this.frequencyPenalty,
      },
      inputs: prompt,
    }
  );
  return res.generated_text;
}
```

Please note that this is just one potential solution, and it may not work perfectly out of the box. The parameters and return values for different HuggingFace Inference API methods can vary, so you'll need to adjust the code based on the specific requirements of the model you're using. Also, as an AI I can only see text, not images or links, so I'm unable to view the screenshot you've attached.

If you find this solution helpful and believe it's not covered by the existing documentation, consider making a pull request to update the docs; it could benefit other users who run into the same issue.
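For reference, here is a minimal standalone sketch of the summarization call against `@huggingface/inference` directly, without LangChain. It assumes the client exposes the task as `hf.summarization` and returns `summary_text` (rather than `generated_text`, which is what `textGeneration` returns); double-check the method name and fields against the version of the library you have installed:

```typescript
import { HfInference } from "@huggingface/inference";

// Minimal sketch: call the summarization task directly.
// Assumes the client exposes `summarization`, which returns `summary_text`
// (not `generated_text` as `textGeneration` does).
const hf = new HfInference(process.env.HUGGINGFACEHUB_API_KEY);

const res = await hf.summarization({
  model: "facebook/bart-large-cnn",
  inputs: "LangChain is a framework for developing applications powered by language models ...",
  parameters: {
    // Summarization takes max_length/min_length rather than max_new_tokens.
    max_length: 100,
  },
});

console.log(res.summary_text);
```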
-
Quickest fix would be to just allow passing in e.g.
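Until a built-in option like that exists, one user-side workaround that avoids editing `dist` is to subclass the existing wrapper and override `_call`. This is only a sketch: the field names (`apiKey`, `caller`, `model`) and the `imports()` helper are taken from the snippet in the answer above, and it assumes `hf.summarization` / `summary_text` from `@huggingface/inference`, so verify both against your installed versions:

```typescript
import { HuggingFaceInference } from "langchain/llms/hf";

// Sketch of a user-side workaround: subclass the existing wrapper and point
// `_call` at the summarization task instead of textGeneration.
class HuggingFaceSummarization extends HuggingFaceInference {
  async _call(
    prompt: string,
    options: this["ParsedCallOptions"]
  ): Promise<string> {
    const { HfInference } = await HuggingFaceInference.imports();
    const hf = new HfInference(this.apiKey);
    const res = await this.caller.callWithOptions(
      { signal: options.signal },
      hf.summarization.bind(hf),
      {
        model: this.model,
        inputs: prompt,
      }
    );
    return res.summary_text;
  }
}

const model = new HuggingFaceSummarization({ model: "facebook/bart-large-cnn" });
console.log(await model.call("LangChain is a framework for developing applications ..."));
```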
-
Looks like Python has separated things out: https://github.com/langchain-ai/langchain/blob/88aff6f319c611723889e7d473b61aa20fe73ae4/libs/langchain/langchain/llms/huggingface_text_gen_inference.py

Seems like a reasonable approach to me.
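On the JS side, that kind of separation could look like a small task-specific LLM class instead of one catch-all wrapper. A rough sketch (the class name and shape are illustrative, not an existing langchain export), using the custom-LLM pattern of extending `LLM` from `langchain/llms/base` and again assuming `hf.summarization` / `summary_text`:

```typescript
import { LLM, BaseLLMParams } from "langchain/llms/base";
import { HfInference } from "@huggingface/inference";

// Illustrative sketch of a task-specific wrapper, mirroring how the Python
// side splits integrations into dedicated classes. Not an existing export.
interface HFSummarizationParams extends BaseLLMParams {
  model?: string;
  apiKey?: string;
}

class HuggingFaceSummarizationLLM extends LLM {
  model: string;
  apiKey?: string;

  constructor(fields?: HFSummarizationParams) {
    super(fields ?? {});
    this.model = fields?.model ?? "facebook/bart-large-cnn";
    this.apiKey = fields?.apiKey ?? process.env.HUGGINGFACEHUB_API_KEY;
  }

  _llmType(): string {
    return "hf_summarization";
  }

  async _call(prompt: string): Promise<string> {
    const hf = new HfInference(this.apiKey);
    const res = await hf.summarization({
      model: this.model,
      inputs: prompt,
    });
    return res.summary_text;
  }
}
```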
-
Hey @hwchase17!
After being on my bucket list for a while, I am finally trying out LangChain and HuggingFace!
I was playing around with both, and that is when I wanted to try the "facebook/bart-large-cnn" HF model.
While following the LC docs for HF, I was constantly getting an error along the lines of
"modelKwargs doesn't contain return_full_text".
After digging through the code, I noticed that the .call() method for HF always does hf.textGeneration.
I went through HF's docs - https://huggingface.co/docs/huggingface.js/inference/README#usage -
and made the changes in dist to suit my case, which worked. My question is: this definitely isn't the right way of doing it.
What am I missing? Going through the docs and the web didn't help me much here.
Could you please guide me toward the right approach?