# Releases · rgbkrk/chatlab
## v0.14.1

### Fixed

- One line in the changelog was off
- Testing the release workflow
## v0.14.0

### Added

- 🏷️ New `expose_exception_to_llm` decorator allows function exceptions to be exposed to the language model. This is useful when the model is running code via IPython's `run_cell` or any other interpreter where the model needs feedback on exceptions.
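The idea behind the decorator can be sketched with a minimal, stdlib-only stand-in. This is an illustration of the pattern, not chatlab's actual implementation, which may format the error differently:

```python
import functools
import traceback

def expose_exception_to_llm(func):
    """Sketch: instead of letting an exception propagate, return the
    formatted traceback so the model sees the error text as the
    function's result and can react to it."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception:
            # The traceback text becomes the "function result" the LLM reads.
            return traceback.format_exc()
    return wrapper

@expose_exception_to_llm
def divide(a: float, b: float) -> float:
    return a / b

print(divide(1, 0))  # prints the ZeroDivisionError traceback as a string
```

The key design point is that the model only ever sees message content, so turning the exception into a return value is what makes it visible to the LLM at all.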
### Changed

- 🔄 Package name changed from `murkrow` to `chatlab`! 💬🔬
- 🤓 Simplified the `register` methods of the `Conversation` and `FunctionRegistry` classes. The parameters `parameters_model` and `json_schema` are replaced by a single parameter, `parameter_schema`, which can be a pydantic model or a JSON schema. Accepting both in one argument instead of two reduces ambiguity and streamlines function registration.
- 💪🏻 Improved typing for messaging
- 📝 Documentation improvements
- 📜 When outputs and inputs are too big, allow scrolling instead of overflowing
- 🔐 Check for `OPENAI_API_KEY` on `Conversation` creation
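Accepting either form in one `parameter_schema` argument comes down to a small normalization step. A duck-typed sketch of that idea (the names here are hypothetical; chatlab's internals may differ):

```python
from typing import Any, Dict

def normalize_parameter_schema(parameter_schema: Any) -> Dict[str, Any]:
    """Sketch: accept either a JSON schema dict or a pydantic-style
    model class (anything exposing .schema()) and return a plain
    JSON schema dict for function registration."""
    if isinstance(parameter_schema, dict):
        return parameter_schema           # already a JSON schema
    if hasattr(parameter_schema, "schema"):
        return parameter_schema.schema()  # pydantic v1-style model class
    raise TypeError("parameter_schema must be a dict or a pydantic model")

# Stand-in for a pydantic model, to keep this sketch stdlib-only:
class FakeModel:
    @classmethod
    def schema(cls) -> Dict[str, Any]:
        return {"type": "object", "properties": {"x": {"type": "integer"}}}
```

Either `register(fn, FakeModel)` or `register(fn, {"type": "object", ...})` would then reach the registry as the same dict shape.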
### Fixed

- 🐛 Fixed the `run_cell` builtin to actually return the result. This reintroduces side effects of display output, meaning outputs from `run_cell` will now appear in the notebook and be visible to the language model as part of the run.
- ✅ The type for `parameters_model` is now correctly `Optional[Type["BaseModel"]]`, so you can extend a model for parameters in your own typed Python code. This is now mypy compliant.
### Removed

- 🚗 Took out the `auto_continue` option, since it only applied to function calls and should generally be `True` for function call responses
## v0.13.0

### Added

- 🐍 Include a builtin `python` chat function to handle the model's hallucination of `python` being an available chat function. Enable it by passing `allow_hallucinated_python` to the `Conversation` or the `FunctionRegistry`. NOTE: it runs in the same runtime as the `Conversation` and will be used to execute arbitrary code. Use with caution. Or delight.
- 🤩 Auto infer schemas for functions. Run `session.register(function)`. This is a great way to get started quickly. Note: you may still get better results out of using pydantic models, since those can carry descriptions and other metadata into the resulting JSON schema.
- 🆕 Accept functions with a JSON Schema for function calling. This should make functions portable across any other libraries that accept the OpenAI standard for function calling.
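Schema auto-inference of this kind can be sketched from a function's signature alone. This stdlib-only version maps a few annotations to JSON Schema types; chatlab's actual inference is likely richer (defaults, docstring parsing, pydantic support):

```python
import inspect

# Map a handful of Python annotations to JSON Schema type names.
_TYPE_MAP = {int: "integer", float: "number", str: "string", bool: "boolean"}

def infer_schema(func):
    """Sketch: build a minimal OpenAI-style function schema from a
    function's signature, roughly what session.register(function)
    could infer automatically."""
    props, required = {}, []
    for name, param in inspect.signature(func).parameters.items():
        props[name] = {"type": _TYPE_MAP.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # no default value => required
    return {
        "name": func.__name__,
        "description": func.__doc__ or "",
        "parameters": {
            "type": "object",
            "properties": props,
            "required": required,
        },
    }

def what_time(tz: str, twenty_four_hour: bool = False):
    """Get the current time."""
    ...

schema = infer_schema(what_time)
```

This also shows why pydantic models can do better: a bare signature carries types and required-ness, but no per-parameter descriptions.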
### Changed

- `Session` has been renamed to `Conversation` to be more understandable. `Session` will have a deprecation warning until it is removed in 1.0.0.
- `chat` has been renamed to `submit` to better reflect that it's sending the current batch of messages on. `chat` will have a deprecation warning until it is removed in 1.0.0.
- Shifted some errors that bubbled up as exceptions to the end user to instead be `system` messages for the LLM
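Renames staged this way are commonly implemented as thin aliases that emit `DeprecationWarning`. A sketch of that pattern (not chatlab's actual code; the method bodies here are stubs):

```python
import warnings

class Conversation:
    def submit(self, *messages):
        """Send the current batch of messages on (stub for illustration)."""
        return messages

    def chat(self, *messages):
        """Deprecated alias for submit()."""
        warnings.warn(
            "chat() is deprecated; use submit(). It will be removed in 1.0.0.",
            DeprecationWarning,
            stacklevel=2,
        )
        return self.submit(*messages)

class Session(Conversation):
    """Deprecated alias for Conversation."""
    def __init__(self, *args, **kwargs):
        warnings.warn(
            "Session is deprecated; use Conversation. It will be removed in 1.0.0.",
            DeprecationWarning,
            stacklevel=2,
        )
        super().__init__(*args, **kwargs)
```

Keeping the old names callable (while warning) lets existing code keep working across the rename until 1.0.0.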
### Removed

- Removed the `deltas` iterator for `StreamCompletion`, favoring the new conversations API instead.
## v0.12.3

### Fixed

- Don't emit empty assistant messages
## v0.12.2

### Added

- Updated README with more documentation
## v0.12.1

### Fixed

- Fixed a bug where zero functions would create an `InvalidRequestError: [] is too short - 'functions'`
## v0.12.0

### Added

- A little chat function displayer
## v0.11.4

### Added

- Created a simple OpenAI chat interface for use in interactive computing environments