2.11.0
This release notably includes a significant UI improvement to the chat side panel. The chat UI now uses JupyterLab's native rendering pipeline for Markdown, code blocks, and TeX markup instead of a third-party package. Thank you to @andrii-i for building this feature!
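For context, the sketch below shows roughly how @jupyterlab/rendermime can render a Markdown string outside of a notebook cell; the helper name and message text are illustrative only, and TeX output additionally requires a LaTeX typesetter to be configured on the registry.

```ts
import {
  RenderMimeRegistry,
  standardRendererFactories
} from '@jupyterlab/rendermime';

// Registry wired up with JupyterLab's standard renderers (Markdown, HTML, images, ...).
const rendermime = new RenderMimeRegistry({
  initialFactories: standardRendererFactories
});

// Hypothetical helper: turn a raw Markdown string into a rendered DOM node.
async function renderChatMessage(source: string): Promise<HTMLElement> {
  const model = rendermime.createModel({
    data: { 'text/markdown': source }
  });
  const renderer = rendermime.createRenderer('text/markdown');
  await renderer.renderModel(model);
  return renderer.node; // attach this element to the chat message's container
}
```

See PR #564 for the actual change in the chat UI.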
Enhancements made
- Fix cookiecutter template #637 (@dlqqq)
- Add OpenAI text-embedding-3-small, -large models #628 (@JasonWeill)
- Add new OpenAI models #625 (@EduardDurech)
- Use @jupyterlab/rendermime for in-chat markdown rendering #564 (@andrii-i)
Bugs fixed
- Unifies parameters to instantiate llm while incorporating model params #632 (@JasonWeill)
Documentation improvements
- Add `nodejs=20` to the contributing docs #645 (@jtpio)
- Update docs to mention `langchain_community.llms` #642 (@jtpio)
- Fix cookiecutter template #637 (@dlqqq)
- Fix conda-forge typo in readme #626 (@droumis)
Contributors to this release
(GitHub contributors page for this release)
@andrii-i | @dlqqq | @droumis | @EduardDurech | @JasonWeill | @jtpio | @krassowski | @lalanikarim | @lumberbot-app | @welcome | @Wzixiao