# Optional Packages

Most of the examples included within `trulens_eval` require additional packages
that are not installed alongside `trulens_eval`. You may be prompted to install
them (with pip). The requirements file `trulens_eval/requirements.optional.txt`
lists the optional packages and what each is used for, in case you'd like to
install them all in one go.
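
For example, to install all of them at once (the path below is the one
mentioned above; adjust it to wherever the file sits in your checkout):

```
pip install -r trulens_eval/requirements.optional.txt
```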

## Dev Notes

To handle optional packages and provide clearer instructions to the user, we
employ a context-manager-based scheme (see `utils/imports.py`) for importing
packages that may not be installed. The basic form of such imports can be seen
in `__init__.py`:

```python
with OptionalImports(messages=REQUIREMENT_LLAMA):
    from trulens_eval.tru_llama import TruLlama
```

This makes it so that `TruLlama` still gets defined even if the import fails
(because `tru_llama` imports `llama_index`, which may not be installed).
However, if the user imports `TruLlama` (via `__init__.py`) and then tries to
use it (call it, look up an attribute, etc.), they will be presented with a
message telling them that `llama-index` is optional and how to install it:

```
ModuleNotFoundError:
llama-index package is required for instrumenting llama_index apps.
You should be able to install it with pip:
pip install "llama-index>=v0.9.14.post3"
```
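
The real mechanism lives in `utils/imports.py`; the following is only a
minimal, self-contained sketch of how such a context-manager-based scheme can
work. The `Dummy` class, the hypothetical `not_installed_pkg`, and the message
text below are illustrative stand-ins, not the actual trulens_eval internals.

```python
# Minimal sketch (not the trulens_eval implementation): while the context is
# active, a failing import is replaced by a Dummy placeholder. Names therefore
# still get bound; the helpful message is raised only when they are used.
import builtins

REQUIREMENT_LLAMA = (  # illustrative message text
    "llama-index package is required for instrumenting llama_index apps.\n"
    "You should be able to install it with pip:\n"
    'pip install "llama-index>=v0.9.14.post3"'
)


class Dummy:
    """Stands in for a module or class that could not be imported."""

    def __init__(self, message: str):
        self.__dict__["_message"] = message

    def __call__(self, *args, **kwargs):
        raise ModuleNotFoundError(self._message)

    def __getattr__(self, name):
        # Attribute access returns another Dummy so `from mod import Name`
        # succeeds; the error surfaces only when the object is actually used.
        return Dummy(self._message)


class OptionalImports:
    """Context manager that substitutes Dummy placeholders for failed imports."""

    def __init__(self, messages: str):
        self.messages = messages
        self._original_import = None

    def _import(self, name, *args, **kwargs):
        try:
            return self._original_import(name, *args, **kwargs)
        except ImportError:
            return Dummy(self.messages)

    def __enter__(self):
        self._original_import = builtins.__import__
        builtins.__import__ = self._import
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        builtins.__import__ = self._original_import
        return False  # do not suppress unrelated exceptions


# Usage: the import itself does not fail...
with OptionalImports(messages=REQUIREMENT_LLAMA):
    from not_installed_pkg import SomeClass  # hypothetical missing package

# ...but using the placeholder raises the message shown earlier.
try:
    SomeClass()
except ModuleNotFoundError as error:
    print(error)
```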

If a user instead imports `TruLlama` directly from `tru_llama` (not by way of
`__init__.py`), they will get that message immediately rather than upon use,
due to this line inside `tru_llama.py`:

```python
OptionalImports(messages=REQUIREMENT_LLAMA).assert_installed(llama_index)
```

This checks that the optional import system did not return a replacement for
`llama_index` (which was imported under a context manager earlier in that
file).
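
Continuing the sketch above (and again only as an illustration, not the real
implementation), such a check only needs to detect whether the name it was
handed is one of the placeholders:

```python
class OptionalImports:
    ...  # __init__/__enter__/__exit__ as in the sketch above

    def assert_installed(self, mod):
        # Raise right away if `mod` is a Dummy placeholder rather than a real
        # module; otherwise the import succeeded and nothing happens.
        if isinstance(mod, Dummy):
            raise ModuleNotFoundError(self.messages)
        return self
```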

### When to Fail

As implied above, imports from a module that does not itself imply an optional
package (like `from trulens_eval ...`) should not produce the error
immediately, whereas imports from modules that do imply the use of an optional
package (like `tru_llama.py`) should.
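
For example, assuming `llama-index` is not installed, the expected behavior is:

```python
# Expected behavior when llama-index is NOT installed.

# Should not raise: the optional-package error is deferred until TruLlama is
# actually used.
from trulens_eval import TruLlama

# Should raise ModuleNotFoundError immediately: tru_llama implies llama_index.
try:
    from trulens_eval.tru_llama import TruLlama
except ModuleNotFoundError as error:
    print(error)
```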