Class: CondenseQuestionChatEngine
CondenseQuestionChatEngine is used in conjunction with an Index (for example, VectorStoreIndex). Given a user's chat message, it performs two steps: first, it condenses the message together with the previous chat history into a standalone question that carries the full context; then it queries the underlying Index with that condensed question and returns the response. CondenseQuestionChatEngine performs well when the input consists primarily of questions about the underlying data. It performs less well when the chat messages are not questions about the data, or refer heavily to previous conversational context.
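A minimal end-to-end sketch (run inside an async context; the document text and question are illustrative, and it assumes EngineResponse stringifies to its response text):

```typescript
import {
  CondenseQuestionChatEngine,
  Document,
  VectorStoreIndex,
} from "llamaindex";

// Index some data, then expose it through a query engine.
const index = await VectorStoreIndex.fromDocuments([
  new Document({ text: "Alice joined the company in 2019 as a data engineer." }),
]);

const chatEngine = new CondenseQuestionChatEngine({
  queryEngine: index.asQueryEngine(),
  chatHistory: [],
});

// Step 1 happens internally: the message plus chat history is condensed into
// a standalone question. Step 2: that question is sent to the query engine.
const response = await chatEngine.chat({ message: "When did Alice join?" });
console.log(response.toString());
```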
Extends
PromptMixin
Implements
ChatEngine
Constructors
new CondenseQuestionChatEngine()
new CondenseQuestionChatEngine(init): CondenseQuestionChatEngine
Parameters
• init
• init.chatHistory: ChatMessage[]
• init.condenseMessagePrompt?
• init.queryEngine: QueryEngine
• init.serviceContext?: ServiceContext
Returns
CondenseQuestionChatEngine
Overrides
PromptMixin.constructor
Source
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:41
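A sketch of constructing the engine with a pre-seeded chat history (assuming ChatMessage and QueryEngine are exported types and that ChatMessage accepts plain role/content literals):

```typescript
import { CondenseQuestionChatEngine } from "llamaindex";
import type { ChatMessage, QueryEngine } from "llamaindex";

declare const queryEngine: QueryEngine; // e.g. from index.asQueryEngine()

// Prior turns the engine can condense follow-up questions against
// (contents are illustrative).
const priorTurns: ChatMessage[] = [
  { role: "user", content: "Who wrote the design doc?" },
  { role: "assistant", content: "The design doc was written by Alice." },
];

const chatEngine = new CondenseQuestionChatEngine({
  queryEngine,
  chatHistory: priorTurns,
});
```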
Properties
chatHistory
chatHistory: ChatHistory<object>
Source
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:37
condenseMessagePrompt()
condenseMessagePrompt: (__namedParameters) => string
Parameters
• __namedParameters
• __namedParameters.chatHistory: undefined | string = ""
• __namedParameters.question: undefined | string = ""
Returns
string
Source
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:39
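Because the prompt is a plain function of { chatHistory, question }, a custom one can be sketched as below (the template wording is illustrative, not the library default):

```typescript
// Both fields default to "" when omitted, mirroring the signature above.
const myCondensePrompt = ({
  chatHistory = "",
  question = "",
}: {
  chatHistory?: string;
  question?: string;
}): string => `Given the conversation below, rewrite the follow-up message
as a standalone question that carries all required context.

<Chat History>
${chatHistory}

<Follow Up Message>
${question}

<Standalone Question>
`;
```

Pass it as init.condenseMessagePrompt at construction time, or swap it in later with updatePrompts (see below).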
llm
llm: LLM<object, object>
Source
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:38
queryEngine
queryEngine: QueryEngine
Source
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:36
Methods
_getPromptModules()
protected _getPromptModules(): Record<string, any>
Returns
Record<string, any>
Inherited from
PromptMixin._getPromptModules
Source
packages/llamaindex/src/prompts/Mixin.ts:82
_getPrompts()
protected _getPrompts(): object
Returns
object
condenseMessagePrompt()
condenseMessagePrompt: (__namedParameters) => string
Parameters
• __namedParameters
• __namedParameters.chatHistory: undefined | string = ""
• __namedParameters.question: undefined | string = ""
Returns
string
Overrides
PromptMixin._getPrompts
Source
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:56
_updatePrompts()
protected _updatePrompts(promptsDict): void
Parameters
• promptsDict
• promptsDict.condenseMessagePrompt
Returns
void
Overrides
PromptMixin._updatePrompts
Source
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:62
chat()
chat(params)
chat(params): Promise<AsyncIterable<EngineResponse>>
Sends the message, along with the engine's current chat history, to the LLM.
Parameters
• params: ChatEngineParamsStreaming
Returns
Promise<AsyncIterable<EngineResponse>>
Implementation of
ChatEngine.chat
Source
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:83
chat(params)
chat(params): Promise<EngineResponse>
Parameters
• params: ChatEngineParamsNonStreaming
Returns
Promise<EngineResponse>
Implementation of
ChatEngine.chat
Source
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:86
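A sketch of both overloads, reusing chatEngine from the examples above. It assumes stream: true selects the streaming overload (ChatEngineParamsStreaming) and that each streamed chunk exposes its incremental text via a delta field; check both against your installed version:

```typescript
// Streaming overload: resolves to an async iterable of EngineResponse chunks.
const stream = await chatEngine.chat({
  message: "What else did she work on?", // follow-up, condensed internally
  stream: true,
});
for await (const chunk of stream) {
  process.stdout.write(chunk.delta); // assumed incremental-text field
}

// Non-streaming overload: resolves to a single EngineResponse.
const answer = await chatEngine.chat({ message: "Summarize what we know." });
console.log(answer.toString());
```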
condenseQuestion()
private condenseQuestion(chatHistory, question): Promise<CompletionResponse>
Parameters
• chatHistory: ChatHistory<object>
• question: string
Returns
Promise<CompletionResponse>
Source
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:70
getPrompts()
getPrompts(): PromptsDict
Returns all prompts from the mixin and its modules
Returns
PromptsDict
Inherited from
PromptMixin.getPrompts
Source
packages/llamaindex/src/prompts/Mixin.ts:27
reset()
reset(): void
Resets the chat history so that it's empty.
Returns
void
Implementation of
ChatEngine.reset
Source
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:123
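A brief sketch: clear the history before starting an unrelated conversation so stale turns don't leak into the condensed question:

```typescript
chatEngine.reset(); // chat history is now empty

// This question is condensed against an empty history, so it reaches the
// query engine essentially as-is.
const fresh = await chatEngine.chat({ message: "Who reviewed the design doc?" });
```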
updatePrompts()
updatePrompts(promptsDict): void
Updates the prompts in the mixin and its modules
Parameters
• promptsDict: PromptsDict
Returns
void
Inherited from
PromptMixin.updatePrompts
Source
packages/llamaindex/src/prompts/Mixin.ts:48
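A sketch of inspecting and replacing the prompt after construction, reusing myCondensePrompt from the condenseMessagePrompt example; the condenseMessagePrompt key matches the prompt dictionary shown under _updatePrompts:

```typescript
// Inspect the current prompt dictionary (includes prompts from sub-modules).
const prompts = chatEngine.getPrompts();
console.log(Object.keys(prompts)); // the engine's own key is condenseMessagePrompt

// Swap in the custom prompt; keys are checked against the engine's
// prompt dictionary (see validatePrompts below).
chatEngine.updatePrompts({ condenseMessagePrompt: myCondensePrompt });
```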
validatePrompts()
validatePrompts(promptsDict, moduleDict): void
Validates the prompt keys and module keys
Parameters
• promptsDict: PromptsDict
• moduleDict: ModuleDict
Returns
void