Commit 3189676
Write documentation for the HuggingFace LLM
1 parent 9c91db7
Showing 5 changed files with 308 additions and 0 deletions.
@@ -0,0 +1,100 @@
= Hugging Face

include::./includes/attributes.adoc[]

https://huggingface.co/[Hugging Face] is a leading platform in the field of natural language processing (NLP) that provides a comprehensive collection of pre-trained language models. Hugging Face facilitates easy access to a wide range of state-of-the-art models for various NLP tasks.
Its focus on democratizing access to cutting-edge NLP capabilities has made Hugging Face a pivotal player in the advancement of language technology.

== Using Hugging Face models

To use Hugging Face LLMs, add the following dependency to your project:

[source,xml,subs=attributes+]
----
<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-hugging-face</artifactId>
    <version>{project-version}</version>
</dependency>
----

If no other LLM extension is installed, link:../ai-services.adoc[AI Services] will automatically use the configured Hugging Face model.
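
For example, an AI service declared as follows would be backed by the configured Hugging Face chat model. This is a minimal sketch: the `Assistant` interface and its `chat` method are illustrative names, not part of the extension.

[source,java]
----
import io.quarkiverse.langchain4j.RegisterAiService;

// Illustrative AI service; with only the Hugging Face extension installed,
// calls to this interface are served by the configured Hugging Face chat model.
@RegisterAiService
public interface Assistant {

    // Sends the user message to the model and returns the generated text
    String chat(String userMessage);
}
----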

IMPORTANT: Hugging Face provides multiple kinds of models. Only **text-to-text** models are supported, that is, models that take text as input and return text as output.

By default, the extension uses:

- https://huggingface.co/tiiuae/falcon-7b-instruct[tiiuae/falcon-7b-instruct] as the chat model (inference endpoint: _https://api-inference.huggingface.co/models/tiiuae/falcon-7b-instruct_)
- https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2[sentence-transformers/all-MiniLM-L6-v2] as the embedding model (inference endpoint: _https://api-inference.huggingface.co/pipeline/feature-extraction/sentence-transformers/all-MiniLM-L6-v2_)

=== Configuration

Configuring Hugging Face models requires an API key, which you can obtain by creating an account on the Hugging Face platform.

The API key can be set in the `application.properties` file:

[source,properties,subs=attributes+]
----
quarkus.langchain4j.huggingface.api-key=hf-...
----

TIP: Alternatively, you can use the `QUARKUS_LANGCHAIN4J_HUGGINGFACE_API_KEY` environment variable.

Several configuration properties are available:

include::includes/quarkus-langchain4j-huggingface.adoc[leveloffset=+1,opts=optional]

== Configuring the chat model

You can change the chat model by setting the `quarkus.langchain4j.huggingface.chat-model.inference-endpoint-url` property.
When using a model hosted on Hugging Face, set the property to `https://api-inference.huggingface.co/models/<model-id>`.

For example, to use the `google/flan-t5-small` model, set:

[source,properties]
----
quarkus.langchain4j.huggingface.chat-model.inference-endpoint-url=https://api-inference.huggingface.co/models/google/flan-t5-small
----

Remember that only text-to-text models are supported.
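
To quickly try the configured model outside of an AI service, you can also call it programmatically. The sketch below assumes the extension exposes the configured chat model as an injectable `ChatLanguageModel` bean (mirroring the `EmbeddingModel` injection shown later); the `ModelCheck` class name is illustrative.

[source,java]
----
import dev.langchain4j.model.chat.ChatLanguageModel;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;

@ApplicationScoped
public class ModelCheck {

    // Assumption: the configured Hugging Face chat model is exposed as a CDI bean
    @Inject
    ChatLanguageModel chatModel;

    public String ask(String question) {
        // Sends the prompt to the configured inference endpoint and returns the generated text
        return chatModel.generate(question);
    }
}
----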

== Using inference endpoints and local models

Hugging Face models can be deployed to provide inference endpoints.
In this case, configure the `quarkus.langchain4j.huggingface.chat-model.inference-endpoint-url` property to point to the endpoint URL:

[source,properties,subs=attributes+]
----
quarkus.langchain4j.huggingface.chat-model.inference-endpoint-url=https://j9dkyuliy170f3ia.us-east-1.aws.endpoints.huggingface.cloud
----

If you run a model locally, adapt the URL accordingly:

[source,properties,subs=attributes+]
----
quarkus.langchain4j.huggingface.chat-model.inference-endpoint-url=http://localhost:8085
----

== Document Retriever and Embedding

When using Hugging Face models, the recommended practice is to use the `EmbeddingModel` provided by Hugging Face.

If no other LLM extension is installed, retrieve the embedding model as follows:

[source,java]
----
@Inject EmbeddingModel model; // Injects the embedding model
----
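
Once injected, the model can be used to embed text, for example to prepare documents for a retriever. Below is a minimal usage sketch; the `Embedder` class is illustrative, while `embed(String)`, `content()`, and `vector()` come from the LangChain4j embedding API.

[source,java]
----
import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.model.embedding.EmbeddingModel;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;

@ApplicationScoped
public class Embedder {

    @Inject
    EmbeddingModel embeddingModel;

    public float[] embed(String text) {
        // Calls the configured Hugging Face feature-extraction endpoint
        Embedding embedding = embeddingModel.embed(text).content();
        return embedding.vector();
    }
}
----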

You can configure the model using:

[source,properties]
----
quarkus.langchain4j.huggingface.embedding-model.inference-endpoint-url=https://api-inference.huggingface.co/pipeline/feature-extraction/sentence-transformers/all-MiniLM-L6-v2
----

WARNING: Not every sentence-transformers model is supported by the embedding model. If you want to use a custom sentence-transformers model, you need to create your own embedding model.

== Tools

Hugging Face LLMs do not support tools.
docs/modules/ROOT/pages/includes/quarkus-langchain4j-huggingface.adoc
195 changes: 195 additions & 0 deletions
@@ -0,0 +1,195 @@
:summaryTableId: quarkus-langchain4j-huggingface
[.configuration-legend]
icon:lock[title=Fixed at build time] Configuration property fixed at build time - All other configuration properties are overridable at runtime
[.configuration-reference.searchable, cols="80,.^10,.^10"]
|===

h|[[quarkus-langchain4j-huggingface_configuration]]link:#quarkus-langchain4j-huggingface_configuration[Configuration property]

h|Type
h|Default

a| [[quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.api-key]]`link:#quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.api-key[quarkus.langchain4j.huggingface.api-key]`


[.description]
--
HuggingFace API key

ifdef::add-copy-button-to-env-var[]
Environment variable: env_var_with_copy_button:+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_API_KEY+++[]
endif::add-copy-button-to-env-var[]
ifndef::add-copy-button-to-env-var[]
Environment variable: `+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_API_KEY+++`
endif::add-copy-button-to-env-var[]
--|string
|

a| [[quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.timeout]]`link:#quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.timeout[quarkus.langchain4j.huggingface.timeout]`


[.description]
--
Timeout for HuggingFace calls

ifdef::add-copy-button-to-env-var[]
Environment variable: env_var_with_copy_button:+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_TIMEOUT+++[]
endif::add-copy-button-to-env-var[]
ifndef::add-copy-button-to-env-var[]
Environment variable: `+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_TIMEOUT+++`
endif::add-copy-button-to-env-var[]
--|link:https://docs.oracle.com/javase/8/docs/api/java/time/Duration.html[Duration]
link:#duration-note-anchor-{summaryTableId}[icon:question-circle[], title=More information about the Duration format]
|`10S`

a| [[quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.chat-model.inference-endpoint-url]]`link:#quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.chat-model.inference-endpoint-url[quarkus.langchain4j.huggingface.chat-model.inference-endpoint-url]`


[.description]
--
The URL of the inference endpoint for the chat model.

When using Hugging Face with the inference API, the URL is `https://api-inference.huggingface.co/models/<model-id>`, for example `https://api-inference.huggingface.co/models/google/flan-t5-small`.

When using a deployed inference endpoint, use the URL of that endpoint. When using a local Hugging Face model, use the URL of the local model.

ifdef::add-copy-button-to-env-var[]
Environment variable: env_var_with_copy_button:+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_CHAT_MODEL_INFERENCE_ENDPOINT_URL+++[]
endif::add-copy-button-to-env-var[]
ifndef::add-copy-button-to-env-var[]
Environment variable: `+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_CHAT_MODEL_INFERENCE_ENDPOINT_URL+++`
endif::add-copy-button-to-env-var[]
--|link:https://docs.oracle.com/javase/8/docs/api/java/net/URL.html[URL]

|`https://api-inference.huggingface.co/models/tiiuae/falcon-7b-instruct`

a| [[quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.chat-model.temperature]]`link:#quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.chat-model.temperature[quarkus.langchain4j.huggingface.chat-model.temperature]`


[.description]
--
Float (0.0-100.0). The temperature of the sampling operation. 1 means regular sampling, 0 means always take the highest score, and 100.0 gets closer to a uniform probability distribution.

ifdef::add-copy-button-to-env-var[]
Environment variable: env_var_with_copy_button:+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_CHAT_MODEL_TEMPERATURE+++[]
endif::add-copy-button-to-env-var[]
ifndef::add-copy-button-to-env-var[]
Environment variable: `+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_CHAT_MODEL_TEMPERATURE+++`
endif::add-copy-button-to-env-var[]
--|double
|`1.0`


a| [[quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.chat-model.max-new-tokens]]`link:#quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.chat-model.max-new-tokens[quarkus.langchain4j.huggingface.chat-model.max-new-tokens]`


[.description]
--
Int (0-250). The number of new tokens to generate. This does not include the input length; it is an estimate of the size of the generated text you want. Each new token slows down the request, so look for a balance between response time and length of the generated text.

ifdef::add-copy-button-to-env-var[]
Environment variable: env_var_with_copy_button:+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_CHAT_MODEL_MAX_NEW_TOKENS+++[]
endif::add-copy-button-to-env-var[]
ifndef::add-copy-button-to-env-var[]
Environment variable: `+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_CHAT_MODEL_MAX_NEW_TOKENS+++`
endif::add-copy-button-to-env-var[]
--|int
|

a| [[quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.chat-model.return-full-text]]`link:#quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.chat-model.return-full-text[quarkus.langchain4j.huggingface.chat-model.return-full-text]`


[.description]
--
If set to `false`, the returned results will not contain the original query, making it easier for prompting

ifdef::add-copy-button-to-env-var[]
Environment variable: env_var_with_copy_button:+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_CHAT_MODEL_RETURN_FULL_TEXT+++[]
endif::add-copy-button-to-env-var[]
ifndef::add-copy-button-to-env-var[]
Environment variable: `+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_CHAT_MODEL_RETURN_FULL_TEXT+++`
endif::add-copy-button-to-env-var[]
--|boolean
|


a| [[quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.chat-model.wait-for-model]]`link:#quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.chat-model.wait-for-model[quarkus.langchain4j.huggingface.chat-model.wait-for-model]`


[.description]
--
If the model is not ready, wait for it instead of receiving a 503 error. This limits the number of requests required to get your inference done. It is advised to only set this flag to `true` after receiving a 503 error, as it will limit hanging in your application to known places

ifdef::add-copy-button-to-env-var[]
Environment variable: env_var_with_copy_button:+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_CHAT_MODEL_WAIT_FOR_MODEL+++[]
endif::add-copy-button-to-env-var[]
ifndef::add-copy-button-to-env-var[]
Environment variable: `+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_CHAT_MODEL_WAIT_FOR_MODEL+++`
endif::add-copy-button-to-env-var[]
--|boolean
|`true`

a| [[quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.embedding-model.inference-endpoint-url]]`link:#quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.embedding-model.inference-endpoint-url[quarkus.langchain4j.huggingface.embedding-model.inference-endpoint-url]`


[.description]
--
The URL of the inference endpoint for the embedding model.

When using Hugging Face with the inference API, the URL is `https://api-inference.huggingface.co/pipeline/feature-extraction/<model-id>`, for example `https://api-inference.huggingface.co/pipeline/feature-extraction/sentence-transformers/all-mpnet-base-v2`.

When using a deployed inference endpoint, use the URL of that endpoint. When using a local Hugging Face model, use the URL of the local model.

ifdef::add-copy-button-to-env-var[]
Environment variable: env_var_with_copy_button:+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_EMBEDDING_MODEL_INFERENCE_ENDPOINT_URL+++[]
endif::add-copy-button-to-env-var[]
ifndef::add-copy-button-to-env-var[]
Environment variable: `+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_EMBEDDING_MODEL_INFERENCE_ENDPOINT_URL+++`
endif::add-copy-button-to-env-var[]
--|link:https://docs.oracle.com/javase/8/docs/api/java/net/URL.html[URL]

|`https://api-inference.huggingface.co/pipeline/feature-extraction/sentence-transformers/all-MiniLM-L6-v2`


a| [[quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.embedding-model.wait-for-model]]`link:#quarkus-langchain4j-huggingface_quarkus.langchain4j.huggingface.embedding-model.wait-for-model[quarkus.langchain4j.huggingface.embedding-model.wait-for-model]`


[.description]
--
If the model is not ready, wait for it instead of receiving a 503 error. This limits the number of requests required to get your inference done. It is advised to only set this flag to `true` after receiving a 503 error, as it will limit hanging in your application to known places

ifdef::add-copy-button-to-env-var[]
Environment variable: env_var_with_copy_button:+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_EMBEDDING_MODEL_WAIT_FOR_MODEL+++[]
endif::add-copy-button-to-env-var[]
ifndef::add-copy-button-to-env-var[]
Environment variable: `+++QUARKUS_LANGCHAIN4J_HUGGINGFACE_EMBEDDING_MODEL_WAIT_FOR_MODEL+++`
endif::add-copy-button-to-env-var[]
--|boolean
|`true`

|===
ifndef::no-duration-note[]
[NOTE]
[id='duration-note-anchor-{summaryTableId}']
.About the Duration format
====
To write duration values, use the standard `java.time.Duration` format.
See the link:https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/time/Duration.html#parse(java.lang.CharSequence)[Duration#parse() javadoc] for more information.

You can also use a simplified format, starting with a number:

* If the value is only a number, it represents time in seconds.
* If the value is a number followed by `ms`, it represents time in milliseconds.

In other cases, the simplified format is translated to the `java.time.Duration` format for parsing:

* If the value is a number followed by `h`, `m`, or `s`, it is prefixed with `PT`.
* If the value is a number followed by `d`, it is prefixed with `P`.
====
endif::no-duration-note[]