I am getting the following error when running the `graph_documents = llm_transformer.convert_to_graph_documents(documents)` line.
/home/archaudh/anaconda3/envs/graphrag-tutorial/lib/python3.10/site-packages/IPython/core/interactiveshell.py:3577: LangChainDeprecationWarning: As of langchain-core 0.3.0, LangChain uses pydantic v2 internally. The langchain_core.pydantic_v1 module was a compatibility shim for pydantic v1, and should no longer be used. Please update the code to import from Pydantic directly.
For example, replace imports like: `from langchain_core.pydantic_v1 import BaseModel`
with: `from pydantic import BaseModel`
or, if you are working in a code base that has not been fully upgraded to pydantic 2 yet, use the v1 compatibility namespace: `from pydantic.v1 import BaseModel`
exec(code_obj, self.user_global_ns, self.user_ns)
True
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
Cell In[5], line 8
5 llm = ChatOpenAI(temperature=0, model="gpt-4o-mini")
6 llm_transformer = LLMGraphTransformer(llm=llm)
----> 8 graph_documents = llm_transformer.convert_to_graph_documents(documents)
File ~/anaconda3/envs/graphrag-tutorial/lib/python3.10/site-packages/langchain_experimental/graph_transformers/llm.py:809, in LLMGraphTransformer.convert_to_graph_documents(self, documents, config)
797 def convert_to_graph_documents(
798 self, documents: Sequence[Document], config: Optional[RunnableConfig] = None
799 ) -> List[GraphDocument]:
800 """Convert a sequence of documents into graph documents.
801
802 Args:
(...)
807 Sequence[GraphDocument]: The transformed documents as graphs.
808 """
--> 809 return [self.process_response(document, config) for document in documents]
File ~/anaconda3/envs/graphrag-tutorial/lib/python3.10/site-packages/langchain_experimental/graph_transformers/llm.py:809, in <listcomp>(.0)
797 def convert_to_graph_documents(
798 self, documents: Sequence[Document], config: Optional[RunnableConfig] = None
799 ) -> List[GraphDocument]:
800 """Convert a sequence of documents into graph documents.
801
802 Args:
(...)
807 Sequence[GraphDocument]: The transformed documents as graphs.
808 """
--> 809 return [self.process_response(document, config) for document in documents]
File ~/anaconda3/envs/graphrag-tutorial/lib/python3.10/site-packages/langchain_experimental/graph_transformers/llm.py:754, in LLMGraphTransformer.process_response(self, document, config)
750 parsed_json = [parsed_json]
751 for rel in parsed_json:
752 # Check if mandatory properties are there
753 if (
--> 754 not rel.get("head")
755 or not rel.get("tail")
756 or not rel.get("relation")
757 ):
758 continue
759 # Nodes need to be deduplicated using a set
760 # Use default Node label for nodes if `missing`
AttributeError: 'list' object has no attribute 'get'
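From the traceback, `process_response` iterates `parsed_json` and calls `rel.get(...)` on each entry, so the error means at least one entry is a list rather than a dict: the model (here `gpt-4o-mini`) returned JSON with an extra level of nesting that the parser did not flatten. As a minimal sketch of the kind of guard that would avoid this (this is not the library's code; `extract_relations` and the sample data are hypothetical, for illustration only), you can flatten one nesting level and keep only dict entries that have the mandatory `head`/`tail`/`relation` keys:

```python
def extract_relations(parsed_json):
    """Keep only dict entries with the mandatory keys; flatten one nesting level."""
    flat = []
    for item in parsed_json:
        # The model sometimes wraps a relation (or several) in an extra list
        flat.extend(item if isinstance(item, list) else [item])
    return [
        rel for rel in flat
        if isinstance(rel, dict)
        and rel.get("head") and rel.get("tail") and rel.get("relation")
    ]

# Hypothetical sample output: one well-formed relation, one nested, one malformed
raw = [
    {"head": "Alice", "relation": "KNOWS", "tail": "Bob"},
    [{"head": "Bob", "relation": "WORKS_AT", "tail": "Acme"}],
    ["unexpected", "list"],
]
print(extract_relations(raw))  # only the two valid relation dicts survive
```

In practice the fix would have to land inside `langchain_experimental/graph_transformers/llm.py` (or upstream), so upgrading `langchain-experimental` to a version where `process_response` type-checks the parsed entries, or monkey-patching a guard like the one above, are the likely options.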