Class: SummaryChatHistory
A ChatHistory is used to keep the state of back-and-forth chat messages.
Extends
• ChatHistory
Constructors
new SummaryChatHistory()
new SummaryChatHistory(init?): SummaryChatHistory
Parameters
• init?: Partial<SummaryChatHistory>
Returns
SummaryChatHistory
Overrides
Source
packages/llamaindex/src/ChatHistory.ts:80
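A minimal construction sketch, assuming SummaryChatHistory and an OpenAI LLM are both exported from the llamaindex package; the model name and token threshold below are placeholder values:

```ts
import { OpenAI, SummaryChatHistory } from "llamaindex";

// init is a Partial<SummaryChatHistory>, so any of the documented
// properties (llm, tokensToSummarize, summaryPrompt, ...) can be preset.
const chatHistory = new SummaryChatHistory({
  llm: new OpenAI({ model: "gpt-4o-mini" }), // placeholder model
  tokensToSummarize: 2048, // assumed: token budget before summarization kicks in
});
```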
Properties
llm
llm: LLM<object, object>
Source
packages/llamaindex/src/ChatHistory.ts:77
messages
messages: ChatMessage[]
Overrides
Source
packages/llamaindex/src/ChatHistory.ts:75
messagesBefore
private messagesBefore: number
Source
packages/llamaindex/src/ChatHistory.ts:78
summaryPrompt()
summaryPrompt: (__namedParameters) => string
Parameters
• __namedParameters
• __namedParameters.context: undefined | string = ""
Returns
string
Source
packages/llamaindex/src/ChatHistory.ts:76
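Because summaryPrompt is just a function from a named context argument to a string, a custom prompt can be supplied through the constructor's init object; the wording below is only an illustration, not the library's default prompt:

```ts
import { SummaryChatHistory } from "llamaindex";

const chatHistory = new SummaryChatHistory({
  // context receives the conversation text to condense (defaults to "")
  summaryPrompt: ({ context = "" }) =>
    `Summarize the following conversation in a few sentences:\n\n${context}`,
});
```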
tokenizer
tokenizer: Tokenizer
Tokenizer function that converts text to tokens; this is used to calculate the number of tokens in a message.
Source
packages/llamaindex/src/ChatHistory.ts:73
tokensToSummarize
tokensToSummarize: number
Source
packages/llamaindex/src/ChatHistory.ts:74
Accessors
nonSystemMessages
private get nonSystemMessages(): ChatMessage[]
Returns
ChatMessage[]
Source
packages/llamaindex/src/ChatHistory.ts:155
systemMessages
private get systemMessages(): ChatMessage[]
Returns
ChatMessage[]
Source
packages/llamaindex/src/ChatHistory.ts:150
Methods
addMessage()
addMessage(message): void
Adds a message to the chat history.
Parameters
• message: ChatMessage
Returns
void
Overrides
Source
packages/llamaindex/src/ChatHistory.ts:129
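A short usage sketch, assuming ChatMessage is the plain role/content message shape used elsewhere in llamaindex:

```ts
chatHistory.addMessage({ role: "user", content: "What did we decide about the schema?" });
chatHistory.addMessage({ role: "assistant", content: "We agreed to keep it flat for now." });
```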
calcConversationMessages()
private calcConversationMessages(transformSummary?): ChatMessage[]
Calculates the messages that describe the conversation so far. If there's no memory, all non-system messages are used. If there's a memory, uses all messages after the last summary message.
Parameters
• transformSummary?: boolean
Returns
ChatMessage[]
Source
packages/llamaindex/src/ChatHistory.ts:165
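The selection rule described above could be sketched roughly as follows; this is illustrative pseudocode based on the description, not the actual implementation, and the minimal ChatMessage type is assumed:

```ts
type ChatMessage = { role: string; content: string };

// Illustrative only: pick the messages that "describe the conversation so far".
function conversationSoFar(
  messages: ChatMessage[],
  lastSummaryIndex: number | null,
): ChatMessage[] {
  if (lastSummaryIndex === null) {
    // no summary yet: every non-system message counts
    return messages.filter((m) => m.role !== "system");
  }
  // a summary exists: only the messages after it are still "the conversation"
  return messages.slice(lastSummaryIndex + 1);
}
```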
calcCurrentRequestMessages()
private calcCurrentRequestMessages(transientMessages?): ChatMessage[]
Parameters
• transientMessages?: ChatMessage[]
Returns
ChatMessage[]
Source
packages/llamaindex/src/ChatHistory.ts:183
getLastSummary()
getLastSummary(): null | ChatMessage
Returns
null | ChatMessage
Source
packages/llamaindex/src/ChatHistory.ts:145
getLastSummaryIndex()
private getLastSummaryIndex(): null | number
Returns
null | number
Source
packages/llamaindex/src/ChatHistory.ts:134
newMessages()
newMessages(): ChatMessage[]
Returns the new messages since the last call to this function (or since calling the constructor).
Returns
ChatMessage[]
Overrides
Source
packages/llamaindex/src/ChatHistory.ts:227
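A hedged illustration of the "new since last call" behaviour described above, reusing the chatHistory instance from the earlier sketches:

```ts
chatHistory.addMessage({ role: "user", content: "First question" });
const firstBatch = chatHistory.newMessages(); // includes the user message just added

chatHistory.addMessage({ role: "assistant", content: "First answer" });
const secondBatch = chatHistory.newMessages(); // only the assistant message this time
```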
requestMessages()
requestMessages(transientMessages?): Promise<ChatMessage[]>
Returns the messages that should be used as input to the LLM.
Parameters
• transientMessages?: ChatMessage[]
Returns
Promise<ChatMessage[]>
Overrides
Source
packages/llamaindex/src/ChatHistory.ts:193
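A typical request loop might look like the sketch below; the llm.chat call and the shape of its response are assumptions based on the LLM interface, not something this page documents:

```ts
chatHistory.addMessage({ role: "user", content: "And what about indexing?" });

// requestMessages applies the summarization logic and returns the LLM-ready input;
// transient messages could be passed in without being stored in the history.
const messages = await chatHistory.requestMessages();

const response = await chatHistory.llm.chat({ messages }); // assumed LLM.chat signature
chatHistory.addMessage(response.message); // assumed response shape
```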
reset()
reset(): void
Resets the chat history so that it's empty.
Returns
void
Overrides
Source
packages/llamaindex/src/ChatHistory.ts:223
summarize()
private summarize(): Promise<ChatMessage>
Returns
Promise<ChatMessage>