Class: ContextChatEngine
ContextChatEngine uses the Index to get the appropriate context for each query. The context is stored in the system prompt, and the chat history is preserved, ideally allowing the appropriate context to be surfaced for each query.
Extends
PromptMixin
Implements
Constructors
new ContextChatEngine()
new ContextChatEngine(init): ContextChatEngine
Parameters
• init
• init.chatHistory?: ChatMessage[]
• init.chatModel?: LLM<object, object>
• init.contextRole?: MessageType
• init.contextSystemPrompt?
• init.nodePostprocessors?: BaseNodePostprocessor[]
• init.retriever: BaseRetriever
• init.systemPrompt?: string
Returns
ContextChatEngine
Overrides
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:35
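The constructor's `retriever` and `contextSystemPrompt` options drive the engine's core behavior: for each query, retrieved node text is folded into a system message. A minimal self-contained sketch of that pattern follows — the types and the `retrieve`/`buildContextMessage` helpers are simplified stand-ins for illustration, not the real llamaindex implementations.

```typescript
// Simplified stand-in types; the real NodeWithScore and ChatMessage
// come from llamaindex.
interface NodeWithScore {
  text: string;
  score: number;
}

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// A toy retriever. The real engine accepts any BaseRetriever
// (e.g. obtained from a VectorStoreIndex) via init.retriever.
const retrieve = (_query: string): NodeWithScore[] => [
  { text: "LlamaIndexTS is a data framework for LLM apps.", score: 0.9 },
];

// Mirrors what init.contextSystemPrompt controls: how retrieved node
// text is assembled into the system message for each user query.
function buildContextMessage(query: string): ChatMessage {
  const context = retrieve(query)
    .map((n) => n.text)
    .join("\n");
  return {
    role: "system",
    content: `Context information is below.\n---------------------\n${context}\n---------------------`,
  };
}
```

The real engine also threads the accumulated chat history into each LLM call, which this sketch omits.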
Properties
chatHistory
chatHistory: ChatHistory<object>
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:31
chatModel
chatModel: LLM<object, object>
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:30
contextGenerator
contextGenerator: ContextGenerator
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:32
systemPrompt?
optional systemPrompt: string
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:33
Methods
_getPromptModules()
protected
_getPromptModules(): Record<string, ContextGenerator>
Returns
Record<string, ContextGenerator>
Overrides
PromptMixin._getPromptModules
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:56
_getPrompts()
protected
_getPrompts(): PromptsDict
Returns
PromptsDict
Inherited from
Source
packages/llamaindex/src/prompts/Mixin.ts:78
_updatePrompts()
protected
_updatePrompts(promptsDict): void
Parameters
• promptsDict: PromptsDict
Returns
void
Inherited from
Source
packages/llamaindex/src/prompts/Mixin.ts:86
chat()
chat(params)
chat(params): Promise<AsyncIterable<EngineResponse>>
Send a message along with the engine's current chat history to the LLM.
Parameters
• params: ChatEngineParamsStreaming
Returns
Promise<AsyncIterable<EngineResponse>>
Implementation of
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:62
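The streaming overload resolves to an AsyncIterable, which is typically consumed with `for await`. A minimal self-contained sketch — `EngineResponseLike`, `mockStream`, and `collect` are illustrative stand-ins; `mockStream()` takes the place of a real `await engine.chat({ message, stream: true })` call.

```typescript
// Stand-in for the chunks a streaming chat() yields; the real
// type is EngineResponse from llamaindex.
interface EngineResponseLike {
  delta: string;
}

// A mock stream standing in for the AsyncIterable returned by
// the streaming chat() overload.
async function* mockStream(): AsyncIterable<EngineResponseLike> {
  for (const delta of ["Hello", ", ", "world"]) {
    yield { delta };
  }
}

// Accumulate streamed deltas into the full response text.
async function collect(
  stream: AsyncIterable<EngineResponseLike>,
): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    text += chunk.delta;
  }
  return text;
}
```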
chat(params)
chat(params): Promise<EngineResponse>
Parameters
• params: ChatEngineParamsNonStreaming
Returns
Promise<EngineResponse>
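The two chat() signatures above are TypeScript overloads selected by the params type: a `stream: true` field picks the AsyncIterable variant, otherwise a single response is returned. A self-contained sketch of that overload pattern — the types and the echoing body are illustrative stand-ins, not the engine's real logic.

```typescript
interface EngineResponseLike {
  message: string;
}

interface ParamsNonStreaming {
  message: string;
}

interface ParamsStreaming extends ParamsNonStreaming {
  stream: true;
}

// Mirrors the two chat() overloads: `stream: true` selects the
// AsyncIterable overload; otherwise a single response is returned.
function chat(params: ParamsStreaming): Promise<AsyncIterable<EngineResponseLike>>;
function chat(params: ParamsNonStreaming): Promise<EngineResponseLike>;
async function chat(
  params: ParamsNonStreaming & { stream?: true },
): Promise<EngineResponseLike | AsyncIterable<EngineResponseLike>> {
  // Toy body: echo the message back instead of calling an LLM.
  const reply = { message: `echo: ${params.message}` };
  if (params.stream) {
    return (async function* () {
      yield reply;
    })();
  }
  return reply;
}
```

With this shape, callers get the correct return type at compile time without casting, which is the point of the dual signature.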