v0.10.49 (#14328)
logan-markewich authored Jun 23, 2024
1 parent 35dbacc commit 3ca7e9d
Showing 9 changed files with 99 additions and 21 deletions.
31 changes: 31 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,36 @@
# ChangeLog

## [2024-06-23]

### `llama-index-core` [0.10.49]

- Improvements to `llama-cloud` and client dependencies (#14254)

### `llama-index-indices-managed-llama-cloud` [0.2.1]

- Improve the interface and client interactions in `LlamaCloudIndex` (#14254)

### `llama-index-llms-bedrock-converse` [0.1.3]

- add claude sonnet 3.5 to bedrock converse (#14306)

### `llama-index-llms-upstage` [0.1.2]

- set default context size (#14293)
- add api_key alias on upstage llm and embeddings (#14233)

### `llama-index-storage-kvstore-azure` [0.1.2]

- Optimized inserts (#14321)

### `llama-index-utils-azure` [0.1.1]

- azure_table_storage params bug (#14182)

### `llama-index-vector-stores-neo4jvector` [0.1.6]

- Add neo4j client method (#14314)

## [2024-06-21]

### `llama-index-core` [0.10.48]
31 changes: 31 additions & 0 deletions docs/docs/CHANGELOG.md
@@ -1,5 +1,36 @@
# ChangeLog

## [2024-06-23]

### `llama-index-core` [0.10.49]

- Improvements to `llama-cloud` and client dependencies (#14254)

### `llama-index-indices-managed-llama-cloud` [0.2.1]

- Improve the interface and client interactions in `LlamaCloudIndex` (#14254)

### `llama-index-llms-bedrock-converse` [0.1.3]

- add claude sonnet 3.5 to bedrock converse (#14306)

### `llama-index-llms-upstage` [0.1.2]

- set default context size (#14293)
- add api_key alias on upstage llm and embeddings (#14233)

### `llama-index-storage-kvstore-azure` [0.1.2]

- Optimized inserts (#14321)

### `llama-index-utils-azure` [0.1.1]

- azure_table_storage params bug (#14182)

### `llama-index-vector-stores-neo4jvector` [0.1.6]

- Add neo4j client method (#14314)

## [2024-06-21]

### `llama-index-core` [0.10.48]
2 changes: 1 addition & 1 deletion docs/docs/module_guides/indexing/lpg_index_guide.md
@@ -214,7 +214,7 @@ Explicitly declaring the retriever allows you to customize several options. Here
```python
from llama_index.core.indices.property_graph import LLMSynonymRetriever

-DEFAULT_SYNONYM_EXPAND_TEMPLATE = (
+prompt = (
"Given some initial query, generate synonyms or related keywords up to {max_keywords} in total, "
"considering possible cases of capitalization, pluralization, common expressions, etc.\n"
"Provide all synonyms/keywords separated by '^' symbols: 'keyword1^keyword2^...'\n"
```
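For orientation, the renamed `prompt` variable above is the template that ends up on the retriever. Below is a minimal sketch of how such a customized template is typically wired into `LLMSynonymRetriever`; the surrounding objects (`index`, `llm`) and the keyword arguments shown (`synonym_prompt`, `max_keywords`, `path_depth`) are assumptions drawn from the guide being edited, not part of this diff.

```python
from llama_index.core.indices.property_graph import LLMSynonymRetriever

# Assumes an existing PropertyGraphIndex (`index`) and LLM (`llm`);
# `prompt` is the customized template shown above.
synonym_retriever = LLMSynonymRetriever(
    index.property_graph_store,
    llm=llm,
    include_text=False,      # do not attach source chunk text to retrieved paths
    synonym_prompt=prompt,   # custom template used instead of the default
    max_keywords=10,         # upper bound on generated synonyms/keywords
    path_depth=1,            # relation hops to follow after node retrieval
)

retriever = index.as_retriever(sub_retrievers=[synonym_retriever])
nodes = retriever.retrieve("example query")
```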
4 changes: 2 additions & 2 deletions docs/docs/understanding/putting_it_all_together/apps/index.md
@@ -4,6 +4,6 @@ LlamaIndex can be integrated into a downstream full-stack web application. It ca

We provide tutorials and resources to help you get started in this area:

-- [Fullstack Application Guide](apps/fullstack_app_guide.md) shows you how to build an app with LlamaIndex as an API and a TypeScript+React frontend
-- [Fullstack Application with Delphic](apps/fullstack_with_delphic.md) walks you through using LlamaIndex with a production-ready web app starter template called Delphic.
+- [Fullstack Application Guide](./fullstack_app_guide.md) shows you how to build an app with LlamaIndex as an API and a TypeScript+React frontend
+- [Fullstack Application with Delphic](./fullstack_with_delphic.md) walks you through using LlamaIndex with a production-ready web app starter template called Delphic.
- The [LlamaIndex Starter Pack](https://github.com/logan-markewich/llama_index_starter_pack) provides very basic flask, streamlit, and docker examples for LlamaIndex.
1 change: 1 addition & 0 deletions docs/mkdocs.yml
@@ -872,6 +872,7 @@ nav:
- ./api_reference/llms/huggingface.md
- ./api_reference/llms/huggingface_api.md
- ./api_reference/llms/ibm.md
+- ./api_reference/llms/ibm_watsonx.md
- ./api_reference/llms/index.md
- ./api_reference/llms/ipex_llm.md
- ./api_reference/llms/konko.md
2 changes: 1 addition & 1 deletion llama-index-core/llama_index/core/__init__.py
@@ -1,6 +1,6 @@
"""Init file of LlamaIndex."""

__version__ = "0.10.48"
__version__ = "0.10.49"

import logging
from logging import NullHandler
2 changes: 1 addition & 1 deletion llama-index-core/pyproject.toml
@@ -43,7 +43,7 @@ name = "llama-index-core"
packages = [{include = "llama_index"}]
readme = "README.md"
repository = "https://github.com/run-llama/llama_index"
version = "0.10.48.post1"
version = "0.10.49"

[tool.poetry.dependencies]
SQLAlchemy = {extras = ["asyncio"], version = ">=1.4.49"}
43 changes: 29 additions & 14 deletions poetry.lock

Some generated files are not rendered by default.

4 changes: 2 additions & 2 deletions pyproject.toml
@@ -44,7 +44,7 @@ name = "llama-index"
packages = [{from = "_llama-index", include = "llama_index"}]
readme = "README.md"
repository = "https://github.com/run-llama/llama_index"
version = "0.10.48.post1"
version = "0.10.49"

[tool.poetry.dependencies]
python = ">=3.8.1,<4.0"
@@ -57,7 +57,7 @@ llama-index-agent-openai = ">=0.1.4,<0.3.0"
llama-index-readers-file = "^0.1.4"
llama-index-readers-llama-parse = "^0.1.2"
llama-index-indices-managed-llama-cloud = "^0.1.2"
llama-index-core = "0.10.48"
llama-index-core = "0.10.49"
llama-index-multi-modal-llms-openai = "^0.1.3"
llama-index-cli = "^0.1.2"
