
Commit
Add initialization of llm parameter to None in ContextChatEngine constructor
Thomas-AH-Heller committed Feb 27, 2024
1 parent 24768bf commit 74f1d5a
Showing 1 changed file with 1 addition and 0 deletions.
1 change: 1 addition & 0 deletions llama-index-core/llama_index/core/chat_engine/context.py
@@ -69,6 +69,7 @@ def from_defaults(
     prefix_messages: Optional[List[ChatMessage]] = None,
     node_postprocessors: Optional[List[BaseNodePostprocessor]] = None,
     context_template: Optional[str] = None,
+    llm: Optional[LLM] = None,
     **kwargs: Any,
 ) -> "ContextChatEngine":
     """Initialize a ContextChatEngine from default parameters."""
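The change follows a common factory-method pattern: accept an optional dependency (here, the LLM) as a keyword argument defaulting to `None`, and fall back to a default instance when the caller omits it. A minimal self-contained sketch of the pattern, using stand-in classes rather than the real llama-index APIs:

```python
from typing import Any, List, Optional


class LLM:
    """Stand-in for an LLM interface (illustrative, not the llama-index class)."""

    def __init__(self, name: str = "default-llm") -> None:
        self.name = name


class ContextChatEngine:
    """Sketch of an engine constructed via a from_defaults classmethod."""

    def __init__(self, llm: LLM, prefix_messages: List[str]) -> None:
        self.llm = llm
        self.prefix_messages = prefix_messages

    @classmethod
    def from_defaults(
        cls,
        prefix_messages: Optional[List[str]] = None,
        llm: Optional[LLM] = None,  # the optional parameter this commit adds
        **kwargs: Any,
    ) -> "ContextChatEngine":
        # Fall back to a default LLM when the caller does not supply one,
        # instead of raising on a missing keyword argument.
        return cls(
            llm=llm or LLM(),
            prefix_messages=prefix_messages or [],
        )


engine = ContextChatEngine.from_defaults(llm=LLM("my-llm"))
print(engine.llm.name)  # -> my-llm

default_engine = ContextChatEngine.from_defaults()
print(default_engine.llm.name)  # -> default-llm
```

Defaulting to `None` (rather than a mutable or constructed default) keeps the signature cheap to evaluate and lets callers distinguish "not provided" from an explicit value.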
