feat: Added LlamaIndex integration with Neo4j #2325
Conversation
@mahimairaja Just two quick things:
- Could you add a demo video of running the notebook in the Naas lab environment? The notebook flow is correct and works as expected.
- Also make sure to add `import requests` in the Import libraries section (a sketch of the updated cell follows below).
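For reference, a minimal sketch of what the updated "Import libraries" cell could look like, assuming the notebook follows the usual try/except install pattern for Naas notebooks (everything here is illustrative, not the notebook's exact code):

```python
# Hypothetical "Import libraries" cell: install llama-index on the fly if it is
# missing (the usual Naas notebook pattern), then import everything needed later.
try:
    import llama_index
except ModuleNotFoundError:
    !pip install -q llama-index
    import llama_index

import requests  # added per the review comment above
```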
Thank you for the response @srini047, will make it ASAP.
Great, looking forward to it.
Hi @srini047, I just tested the notebook with the Google Colab free version and it runs as expected there. But I am facing some conflicts while running the same in the Naas environment: I am not able to install llama_index. Kindly help me to proceed further. Thanks.

Error: `TypeError: multiple bases have instance lay-out conflict`

Full error message:
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-6-b6c670793068> in <module>
1 try:
----> 2 import llama_index
3 except ModuleNotFoundError:
4 get_ipython().system('pip install -q llama-index')
5 import llama_index
/opt/conda/lib/python3.9/site-packages/llama_index/__init__.py in <module>
15
16 # embeddings
---> 17 from llama_index.embeddings.langchain import LangchainEmbedding
18 from llama_index.embeddings.openai import OpenAIEmbedding
19
/opt/conda/lib/python3.9/site-packages/llama_index/embeddings/__init__.py in <module>
14 from llama_index.embeddings.huggingface_utils import DEFAULT_HUGGINGFACE_EMBEDDING_MODEL
15 from llama_index.embeddings.instructor import InstructorEmbedding
---> 16 from llama_index.embeddings.langchain import LangchainEmbedding
17 from llama_index.embeddings.llm_rails import LLMRailsEmbeddings
18 from llama_index.embeddings.openai import OpenAIEmbedding
/opt/conda/lib/python3.9/site-packages/llama_index/embeddings/langchain.py in <module>
3 from typing import List, Optional
4
----> 5 from llama_index.bridge.langchain import Embeddings as LCEmbeddings
6 from llama_index.bridge.pydantic import PrivateAttr
7 from llama_index.callbacks import CallbackManager
/opt/conda/lib/python3.9/site-packages/llama_index/bridge/langchain.py in <module>
1 import langchain
----> 2 from langchain.agents import AgentExecutor, AgentType, initialize_agent
3
4 # agents and tools
5 from langchain.agents.agent_toolkits.base import BaseToolkit
~/.local/lib/python3.9/site-packages/langchain/agents/__init__.py in <module>
29
30 """ # noqa: E501
---> 31 from langchain.agents.agent import (
32 Agent,
33 AgentExecutor,
~/.local/lib/python3.9/site-packages/langchain/agents/agent.py in <module>
21 import yaml
22
---> 23 from langchain.agents.agent_iterator import AgentExecutorIterator
24 from langchain.agents.agent_types import AgentType
25 from langchain.agents.tools import InvalidTool
~/.local/lib/python3.9/site-packages/langchain/agents/agent_iterator.py in <module>
19 )
20
---> 21 from langchain.callbacks.manager import (
22 AsyncCallbackManager,
23 AsyncCallbackManagerForChainRun,
~/.local/lib/python3.9/site-packages/langchain/callbacks/__init__.py in <module>
8 """
9
---> 10 from langchain.callbacks.aim_callback import AimCallbackHandler
11 from langchain.callbacks.argilla_callback import ArgillaCallbackHandler
12 from langchain.callbacks.arize_callback import ArizeCallbackHandler
~/.local/lib/python3.9/site-packages/langchain/callbacks/aim_callback.py in <module>
3
4 from langchain.callbacks.base import BaseCallbackHandler
----> 5 from langchain.schema import AgentAction, AgentFinish, LLMResult
6
7
~/.local/lib/python3.9/site-packages/langchain/schema/__init__.py in <module>
1 """**Schemas** are the LangChain Base Classes and Interfaces."""
----> 2 from langchain.schema.agent import AgentAction, AgentFinish
3 from langchain.schema.cache import BaseCache
4 from langchain.schema.chat_history import BaseChatMessageHistory
5 from langchain.schema.document import BaseDocumentTransformer, Document
~/.local/lib/python3.9/site-packages/langchain/schema/agent.py in <module>
4
5 from langchain.load.serializable import Serializable
----> 6 from langchain.schema.messages import BaseMessage
7
8
~/.local/lib/python3.9/site-packages/langchain/schema/messages.py in <module>
155
156
--> 157 class HumanMessageChunk(HumanMessage, BaseMessageChunk):
158 """A Human Message chunk."""
159
/opt/conda/lib/python3.9/site-packages/pydantic/main.cpython-39-x86_64-linux-gnu.so in pydantic.main.ModelMetaclass.__new__()
/opt/conda/lib/python3.9/abc.py in __new__(mcls, name, bases, namespace, **kwargs)
104 """
105 def __new__(mcls, name, bases, namespace, **kwargs):
--> 106 cls = super().__new__(mcls, name, bases, namespace, **kwargs)
107 _abc_init(cls)
108 return cls
TypeError: multiple bases have instance lay-out conflict
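The traceback above shows llama_index resolving from /opt/conda while langchain resolves from ~/.local, so two differently installed stacks are being mixed. One possible workaround is to force-reinstall the packages into a single location and restart the kernel; the sketch below is an untested assumption, and the flags and versions may need adjusting:

```python
# Untested workaround sketch: the traceback mixes packages from /opt/conda and
# ~/.local, so force-reinstall the stack into one place so that llama-index,
# langchain and pydantic resolve to mutually compatible versions.
!pip install -q --user --upgrade --force-reinstall llama-index langchain pydantic
# Restart the kernel after the install finishes, then re-run the import cell.
```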
@mahimairaja Just add the code showing how to generate the Neo4j graph instance.
Thanks for letting me know, I have added it now, please check.
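For reference, a cell creating the Neo4j graph instance might look roughly like the sketch below (import paths follow the llama_index API of that period; the credentials and URL are placeholders, not values from the notebook):

```python
# Sketch of creating the Neo4j graph instance (connection details are placeholders).
from llama_index import StorageContext
from llama_index.graph_stores import Neo4jGraphStore

graph_store = Neo4jGraphStore(
    username="neo4j",               # placeholder credentials
    password="<your-password>",
    url="bolt://localhost:7687",    # local instance or a Neo4j Aura URL
    database="neo4j",
)

# Wrap the graph store in a storage context so an index can write to it.
storage_context = StorageContext.from_defaults(graph_store=graph_store)
```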
LGTM🚀
Nice work @mahimairaja. @FlorentLvr could have a final look.
Kindly share your valuable comments, @FlorentLvr.
The template is now available on the master branch at this link:
Thank you for your contribution @mahimairaja, your PR has been merged into the master branch of awesome-notebooks.
Fixes
This PR resolves #2268
What does this PR do?
This PR adds a notebook demonstrating the integration of LlamaIndex with Neo4j.
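For context, a minimal sketch of the flow the notebook demonstrates, assuming the KnowledgeGraphIndex API available in llama_index at the time (names, paths, and parameters below are illustrative, not the notebook's exact code):

```python
# Load documents, extract triplets into a Neo4j-backed knowledge graph,
# then query the graph through a standard query engine.
from llama_index import KnowledgeGraphIndex, SimpleDirectoryReader, StorageContext
from llama_index.graph_stores import Neo4jGraphStore

# Neo4j-backed storage (placeholders; see the graph-store sketch earlier in the thread).
graph_store = Neo4jGraphStore(
    username="neo4j", password="<your-password>",
    url="bolt://localhost:7687", database="neo4j",
)
storage_context = StorageContext.from_defaults(graph_store=graph_store)

documents = SimpleDirectoryReader("data").load_data()  # "data" is a placeholder folder

# Extract triplets from the documents and persist them into the Neo4j graph.
index = KnowledgeGraphIndex.from_documents(
    documents,
    storage_context=storage_context,
    max_triplets_per_chunk=2,  # illustrative setting
)

# Query the knowledge graph.
response = index.as_query_engine().query("What does the source data say about X?")
print(response)
```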
Screenshots
Overall Architecture
Building - Knowledge Graph
Limitation:
Building the knowledge graph for varied data sources can be difficult, and each data connector needs a unique approach (see the sketch below).
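To illustrate this limitation, here is a hedged sketch of two different connectors with different calling conventions (loader names and arguments are assumptions, not part of this PR):

```python
# Each data source needs its own connector: local files and web pages are loaded
# through different interfaces, so triplet extraction may need per-source tuning.
from llama_index import SimpleDirectoryReader, download_loader

# Local files: one connector, one calling convention...
local_docs = SimpleDirectoryReader("data").load_data()

# ...web pages: a different connector obtained from llama-hub with its own arguments.
SimpleWebPageReader = download_loader("SimpleWebPageReader")
web_docs = SimpleWebPageReader(html_to_text=True).load_data(urls=["https://example.com"])
```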