Add hugging face hub disclaimer and pick smaller model for local example
dylanholmes committed May 22, 2024
1 parent 8dba0d4 commit e99b82a
Showing 1 changed file with 10 additions and 1 deletion.
11 changes: 10 additions & 1 deletion docs/griptape-framework/drivers/prompt-drivers.md
@@ -242,6 +242,15 @@ The [HuggingFaceHubPromptDriver](../../reference/griptape/drivers/prompt/hugging
- text2text-generation
- text-generation

!!! warning
    Not every model on the Hugging Face Hub works with this driver: only models available through
    [Hugging Face serverless inference](https://huggingface.co/docs/api-inference/en/index) are supported,
    and serverless inference is limited to models smaller than 10GB.

!!! info
    The `prompt_stack_to_string_converter` function converts a `PromptStack` into model-specific input. Consult
    the model's documentation to determine the correct prompt format.

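To illustrate the idea behind such a converter (a minimal sketch only: the `Turn` dataclass and the chat-tag template below are hypothetical stand-ins, not griptape's actual `PromptStack` API or any specific model's template), a converter of this shape might look like:

```python
from dataclasses import dataclass


# Hypothetical stand-in for griptape's PromptStack contents; the real
# class's fields may differ, so consult the framework's reference docs.
@dataclass
class Turn:
    role: str  # "system", "user", or "assistant"
    content: str


def prompt_stack_to_string_converter(turns: list[Turn]) -> str:
    """Render chat turns into a single model-specific prompt string.

    The tag format below is illustrative only; each model documents its
    own expected template (usually on its Hugging Face model card).
    """
    parts = [f"<|{turn.role}|>\n{turn.content}</s>" for turn in turns]
    parts.append("<|assistant|>\n")  # cue the model to respond
    return "\n".join(parts)


prompt = prompt_stack_to_string_converter(
    [Turn("system", "You are terse."), Turn("user", "Hi!")]
)
print(prompt)
```

The key design point is that the converter, not the driver, owns the prompt template, so swapping models only requires swapping this one function.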
Let's recreate the [Falcon-7B-Instruct](https://huggingface.co/tiiuae/falcon-7b-instruct) example using Griptape:

```python
@@ -358,7 +367,7 @@ def prompt_stack_to_string_converter(prompt_stack: PromptStack) -> str:
 agent = Agent(
     config=StructureConfig(
         prompt_driver=HuggingFacePipelinePromptDriver(
-            model="tiiuae/falcon-7b-instruct",
+            model="TinyLlama/TinyLlama-1.1B-Chat-v0.6",
             prompt_stack_to_string=prompt_stack_to_string_converter,
         )
     ),
