[version bump] v0.10.6 (#10930)
logan-markewich authored Feb 18, 2024
1 parent 8118a45 commit 8b8199f
Showing 15 changed files with 968 additions and 807 deletions.
25 changes: 25 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,30 @@
# ChangeLog

## [0.10.6] - 2024-02-17

First, apologies for missing the changelog the last few versions. We're still figuring out the best process with 400+ packages.

At some point, each package will have a dedicated changelog.

But for now, onto the "master" changelog.

### New Features

- Added `NomicHFEmbedding` (#10762)
- Added `MinioReader` (#10744)

### Bug Fixes / Nits

- Various fixes for clickhouse vector store (#10799)
- Fix index name in neo4j vector store (#10749)
- Fixes to sagemaker embeddings (#10778)
- Fixed performance issues when splitting nodes (#10766)
- Fix non-float values in reranker + bm25 (#10930)
- OpenAI-agent should be a dep of openai program (#10930)
- Add missing shortcut imports for query pipeline components (#10930)
- Fix NLTK and tiktoken not being bundled properly with core (#10930)
- Add back `llama_index.core.__version__` (#10930)

## [0.10.3] - 2024-02-13

### Bug Fixes / Nits
1 change: 1 addition & 0 deletions llama-index-core/llama_index/core/__init__.py
@@ -1,4 +1,5 @@
"""Init file of LlamaIndex."""
__version__ = "0.10.6"

import logging
from logging import NullHandler
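
The restored attribute can be checked right after upgrading; a minimal sketch:

import llama_index.core

print(llama_index.core.__version__)  # "0.10.6" for this release
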
@@ -0,0 +1,31 @@
from llama_index.core.query_pipeline.components.agent import (
    AgentFnComponent,
    AgentInputComponent,
    BaseAgentComponent,
    CustomAgentComponent,
)
from llama_index.core.query_pipeline.components.argpacks import ArgPackComponent
from llama_index.core.query_pipeline.components.function import (
    FnComponent,
    FunctionComponent,
)
from llama_index.core.query_pipeline.components.input import InputComponent
from llama_index.core.query_pipeline.components.router import (
    RouterComponent,
    SelectorComponent,
)
from llama_index.core.query_pipeline.components.tool_runner import ToolRunnerComponent

__all__ = [
    "AgentFnComponent",
    "AgentInputComponent",
    "BaseAgentComponent",
    "CustomAgentComponent",
    "ArgPackComponent",
    "FnComponent",
    "FunctionComponent",
    "InputComponent",
    "RouterComponent",
    "SelectorComponent",
    "ToolRunnerComponent",
]
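
These shortcut imports let pipeline components be pulled from one place instead of their individual modules. The snippet below is a minimal usage sketch, not part of this commit; it assumes the standard `QueryPipeline` chaining API and wraps plain functions with `FnComponent`:

from llama_index.core.query_pipeline import QueryPipeline
from llama_index.core.query_pipeline.components import FnComponent, InputComponent

# wrap ordinary Python callables as pipeline components
upper = FnComponent(fn=lambda text: text.upper())
exclaim = FnComponent(fn=lambda text: f"{text}!")

# chain input -> upper -> exclaim and run it
pipeline = QueryPipeline(chain=[InputComponent(), upper, exclaim])
print(pipeline.run(input="hello query pipelines"))
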
4 changes: 2 additions & 2 deletions llama-index-core/pyproject.toml
@@ -27,7 +27,7 @@ classifiers = [
description = "Interface between LLMs and your data"
documentation = "https://docs.llamaindex.ai/en/stable/"
homepage = "https://llamaindex.ai"
include = ["llama_index/core/_static"]
include = ["llama_index/core/_static/nltk_cache/corpora/stopwords/*", "llama_index/core/_static/nltk_cache/tokenizers/punkt/*", "llama_index/core/_static/nltk_cache/tokenizers/punkt/PY3/*", "llama_index/core/_static/tiktoken_cache/*"]
keywords = ["LLM", "NLP", "RAG", "data", "devtools", "index", "retrieval"]
license = "MIT"
maintainers = [
@@ -42,7 +42,7 @@ name = "llama-index-core"
packages = [{include = "llama_index"}]
readme = "README.md"
repository = "https://github.com/run-llama/llama_index"
version = "0.10.5"
version = "0.10.6"

[tool.poetry.dependencies]
SQLAlchemy = {extras = ["asyncio"], version = ">=1.4.49"}
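
The narrowed `include` globs above enumerate the exact NLTK (stopwords, punkt) and tiktoken cache files so they are actually shipped inside the wheel. How the bundled caches are wired up at runtime is not shown in this diff; the snippet below is only a hedged sketch of the usual approach, with a hypothetical path to the packaged `_static` directory:

import os
from pathlib import Path

import nltk

# hypothetical location of the caches packaged under llama_index/core/_static
static_dir = Path(__file__).parent / "_static"

# let NLTK resolve stopwords and punkt from the packaged cache
nltk.data.path.append(str(static_dir / "nltk_cache"))

# point tiktoken at the packaged BPE files so it does not need to download them
os.environ["TIKTOKEN_CACHE_DIR"] = str(static_dir / "tiktoken_cache")
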
@@ -24,7 +24,7 @@ description = "llama-index embeddings nomic integration"
license = "MIT"
name = "llama-index-embeddings-nomic"
readme = "README.md"
version = "0.1.3"
version = "0.1.4"

[tool.poetry.dependencies]
python = ">=3.8.1,<3.12"
@@ -24,7 +24,7 @@ description = "llama-index embeddings sagemaker endpoint integration"
license = "MIT"
name = "llama-index-embeddings-sagemaker-endpoint"
readme = "README.md"
version = "0.1.1"
version = "0.1.2"

[tool.poetry.dependencies]
python = ">=3.8.1,<3.12"
@@ -86,7 +86,7 @@ def _postprocess_nodes(
if self.keep_retrieval_score:
# keep the retrieval score in metadata
node.node.metadata["retrieval_score"] = node.score
node.score = score
node.score = float(score)

new_nodes = sorted(nodes, key=lambda x: -x.score if x.score else 0)[
: self.top_n
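
The cast to `float(score)` above matters because reranker backends (for example a sentence-transformers cross-encoder) often return numpy scalars, which trip up downstream code expecting plain Python floats. A small illustrative sketch, not from this commit:

import json

import numpy as np

score = np.float32(0.87)  # the kind of scalar a cross-encoder predict() call returns
# json.dumps({"score": score}) raises TypeError: Object of type float32 is not JSON serializable
print(json.dumps({"score": float(score)}))  # casting first keeps serialization working
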
@@ -24,7 +24,7 @@ description = "llama-index postprocessor sbert rerank integration"
license = "MIT"
name = "llama-index-postprocessor-sbert-rerank"
readme = "README.md"
version = "0.1.1"
version = "0.1.2"

[tool.poetry.dependencies]
python = ">=3.8.1,<3.12"