Class: ContextChatEngine
ContextChatEngine uses the Index to get the appropriate context for each query. The context is stored in the system prompt, and the chat history is preserved, ideally allowing the appropriate context to be surfaced for each query.
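A minimal usage sketch, assuming the public llamaindex package exports; the sample document text and question are illustrative:

```typescript
import { ContextChatEngine, Document, VectorStoreIndex } from "llamaindex";

// Build a small index and use its retriever to drive the chat engine.
const document = new Document({
  text: "Abraham Lincoln was the 16th president of the United States.",
});
const index = await VectorStoreIndex.fromDocuments([document]);

const chatEngine = new ContextChatEngine({ retriever: index.asRetriever() });

// Each call retrieves context for the message and keeps the chat history.
const response = await chatEngine.chat({ message: "Who was Lincoln?" });
console.log(response.response);
```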
Extends
PromptMixin
Implements
ChatEngine
Constructors
new ContextChatEngine()
new ContextChatEngine(init): ContextChatEngine
Parameters
• init
• init.chatHistory?: ChatMessage[]
• init.chatModel?: LLM<object, object>
• init.contextRole?: MessageType
• init.contextSystemPrompt?
• init.nodePostprocessors?: BaseNodePostprocessor[]
• init.retriever: BaseRetriever
• init.systemPrompt?: string
Returns
ContextChatEngine
Overrides
PromptMixin.constructor
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:35
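For illustration, a sketch passing the optional fields; the prompt strings are placeholders, and `retriever` is assumed to come from an index as in the example above:

```typescript
const engine = new ContextChatEngine({
  retriever,                            // required: supplies context nodes per query
  systemPrompt: "You are a terse assistant.",
  contextRole: "system",                // MessageType used for the injected context message
  contextSystemPrompt: ({ context }) => // template wrapping the retrieved context
    `Context:\n${context}\nAnswer from the context only.`,
  chatHistory: [],                      // seed messages, if resuming a conversation
});
```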
Properties
chatHistory
chatHistory: ChatHistory<object>
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:31
chatModel
chatModel: LLM<object, object>
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:30
contextGenerator
contextGenerator: ContextGenerator
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:32
systemPrompt?
optional systemPrompt: string
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:33
Methods
_getPromptModules()
protected _getPromptModules(): Record<string, ContextGenerator>
Returns
Record<string, ContextGenerator>
Overrides
PromptMixin._getPromptModules
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:56
_getPrompts()
protected _getPrompts(): PromptsDict
Returns
PromptsDict
Inherited from
PromptMixin._getPrompts
Source
packages/llamaindex/src/prompts/Mixin.ts:78
_updatePrompts()
protected _updatePrompts(promptsDict): void
Parameters
• promptsDict: PromptsDict
Returns
void
Inherited from
PromptMixin._updatePrompts
Source
packages/llamaindex/src/prompts/Mixin.ts:86
chat()
chat(params)
chat(params): Promise<AsyncIterable<EngineResponse>>
Send message along with the class's current chat history to the LLM.
Parameters
• params: ChatEngineParamsStreaming
Returns
Promise<AsyncIterable<EngineResponse>>
Implementation of
ChatEngine.chat
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:62
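A streaming sketch, assuming a `chatEngine` instance as above; it is assumed here that each yielded chunk exposes its delta text on `response`:

```typescript
const stream = await chatEngine.chat({
  message: "What does the context say about Lincoln?",
  stream: true, // selects the streaming overload
});
for await (const chunk of stream) {
  process.stdout.write(chunk.response); // print tokens as they arrive
}
```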
chat(params)
chat(params): Promise<EngineResponse>
Parameters
• params: ChatEngineParamsNonStreaming
Returns
Promise<EngineResponse>
Implementation of
ChatEngine.chat
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:65
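The non-streaming overload resolves to a single EngineResponse. A sketch, assuming the response carries the retrieved nodes on `sourceNodes`:

```typescript
const response = await chatEngine.chat({
  message: "Summarize the context in one sentence.",
});
console.log(response.response);            // final answer text
console.log(response.sourceNodes?.length); // NodeWithScore[] used as context
```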
getPrompts()
getPrompts(): PromptsDict
Returns all prompts from the mixin and its modules
Returns
PromptsDict
Inherited from
PromptMixin.getPrompts
Source
packages/llamaindex/src/prompts/Mixin.ts:27
prepareRequestMessages()
private prepareRequestMessages(message, chatHistory): Promise<object>
Parameters
• message: MessageContent
• chatHistory: ChatHistory<object>
Returns
Promise<object>
messages
messages: ChatMessage<object>[]
nodes
nodes: NodeWithScore<Metadata>[] = context.nodes
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:106
prependSystemPrompt()
private prependSystemPrompt(message): ChatMessage
Parameters
• message: ChatMessage
Returns
ChatMessage
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:121
reset()
reset(): void
Resets the chat history so that it's empty.
Returns
void
Implementation of
ChatEngine.reset
Source
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:102
updatePrompts()
updatePrompts(promptsDict): void
Updates the prompts in the mixin and its modules
Parameters
• promptsDict: PromptsDict
Returns
void
Inherited from
PromptMixin.updatePrompts
Source
packages/llamaindex/src/prompts/Mixin.ts:48
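A sketch of swapping the context prompt through the mixin API. The `contextGenerator:contextSystemPrompt` key is an assumption based on how the mixin namespaces module prompts; inspect `getPrompts()` for the actual keys:

```typescript
console.log(Object.keys(chatEngine.getPrompts())); // discover available prompt keys

chatEngine.updatePrompts({
  // hypothetical key: module name + ":" + prompt name
  "contextGenerator:contextSystemPrompt": ({ context }) =>
    `Use only this context:\n${context}`,
});
```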
validatePrompts()
validatePrompts(promptsDict, moduleDict): void
Validates the prompt keys and module keys
Parameters
• promptsDict: PromptsDict
• moduleDict: ModuleDict
Returns
void