Class: CondenseQuestionChatEngine
CondenseQuestionChatEngine is used in conjunction with an Index (for example, VectorStoreIndex). On each user chat message it performs two steps: first, it condenses the message together with the previous chat history into a standalone question with more context. Then, it queries the underlying Index using that standalone question and returns the response. CondenseQuestionChatEngine performs well when the input is primarily questions about the underlying data. It performs less well when the chat messages are not questions about the data, or refer heavily to previous conversational context.
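A minimal usage sketch of the two-step flow described above, assuming the `llamaindex` package with an LLM configured via environment variables (the sample document text and question are illustrative, and the exact `chat` call shape may vary between versions):

```typescript
import {
  CondenseQuestionChatEngine,
  Document,
  VectorStoreIndex,
} from "llamaindex";

// Build an index over some documents, then wrap its query engine
// in a CondenseQuestionChatEngine.
const index = await VectorStoreIndex.fromDocuments([
  new Document({ text: "Alice joined the team in 2021 as an engineer." }),
]);

const chatEngine = new CondenseQuestionChatEngine({
  queryEngine: index.asQueryEngine(),
});

// Each chat turn is first condensed (with the running chat history)
// into a standalone question, then run against the query engine.
const response = await chatEngine.chat({ message: "When did she join?" });
console.log(response.toString());
```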
Constructors
new CondenseQuestionChatEngine()
new CondenseQuestionChatEngine(init): CondenseQuestionChatEngine
Parameters
• init
• init.chatHistory: ChatMessage[]
• init.condenseMessagePrompt?
• init.queryEngine: QueryEngine
• init.serviceContext?: ServiceContext
Returns
CondenseQuestionChatEngine
Source
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:41
Properties
chatHistory
chatHistory: ChatHistory<object>
Source
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:37
condenseMessagePrompt()
condenseMessagePrompt: (__namedParameters) => string
Parameters
• __namedParameters
• __namedParameters.chatHistory: undefined | string = ""
• __namedParameters.question: undefined | string = ""
Returns
string
Source
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:39
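As a sketch, a custom replacement matching this `({ chatHistory, question }) => string` shape might look like the following; the function name and prompt wording are illustrative, not the library's default:

```typescript
// Hypothetical custom condense prompt. It must accept an object with
// optional `chatHistory` and `question` strings (both defaulting to "")
// and return the full prompt text sent to the LLM.
const customCondenseMessagePrompt = ({
  chatHistory = "",
  question = "",
}: {
  chatHistory?: string;
  question?: string;
}): string =>
  `Given the following conversation and a follow-up question, rephrase the
follow-up question to be a standalone question that carries all needed context.

Conversation:
${chatHistory}

Follow-up question: ${question}

Standalone question:`;
```

Such a function can then be passed as `init.condenseMessagePrompt` when constructing the engine.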
llm
llm: LLM<object, object>
Source
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:38