Using chat history to power a conversation
But what will really make this hum is chat history: the ability to keep talking to the chain after it responds, just like ChatGPT.
In our chain, we handle this by passing a `chat_history` parameter that looks like this:
```ts
interface RelevanceHistoryObject {
  role: 'user' | 'ai';
  message: string;
}
```
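For example, after one completed question-and-answer round trip, the array we send back might look like this (the messages here are purely illustrative):

```ts
// Illustrative only: one completed question/answer pair.
const exampleHistory: RelevanceHistoryObject[] = [
  { role: 'user', message: 'What is your refund policy?' },
  { role: 'ai', message: 'We offer a full refund within 30 days of purchase.' },
];
```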
As mentioned earlier, the chain takes care of this for us: it passes the history to an LLM prompt that generates a better standalone question for our answer prompt.
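We never see that prompt from the client side, but conceptually it works something like the sketch below. This is only an illustration of the idea, not Relevance AI's actual prompt, and `condensePrompt` is a hypothetical name:

```ts
// Hypothetical sketch of a question-condensing prompt.
// The real prompt lives inside the chain and may differ.
const condensePrompt = (history: string, followUp: string) => `
Given the following conversation and a follow-up question,
rephrase the follow-up question into a standalone question.

Chat history:
${history}

Follow-up question: ${followUp}

Standalone question:`;
```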
Let's update our `askQuestion` logic to this:
```ts
/** This is what will render in our v-for loop */
const chatHistory = ref<RelevanceHistoryObject[]>([]);

async function askQuestion() {
  try {
    error.value = null;
    isLoadingMessage.value = true;

    const payload: ChainApiPayload = {
      params: {
        [CHAIN_QUESTION_PARAM_KEY]: question.value,
        // do not pass an empty history array
        [CHAIN_HISTORY_PARAM_KEY]: chatHistory.value.length
          ? chatHistory.value
          : undefined,
      },
      project: CHAIN_PROJECT,
    };

    const response: ChainApiResponse = await $fetch(CHAIN_URL, {
      method: 'POST',
      body: payload,
    });
    const responseAnswer = response.output[CHAIN_ANSWER_OUTPUT_KEY];

    const MAX_HISTORY_LENGTH = 20;
    // -1 because we are about to add 2 new items to the history
    if (chatHistory.value.length > MAX_HISTORY_LENGTH - 1) {
      // remove the oldest question/answer pair
      chatHistory.value.splice(0, 2);
    }

    chatHistory.value.push({
      role: 'user',
      message: question.value,
    });
    chatHistory.value.push({
      role: 'ai',
      message: responseAnswer,
    });

    // clear the input
    question.value = '';
  } catch (e) {
    console.error(e);
    error.value = 'Unfortunately, there was an error asking your question.';
  } finally {
    isLoadingMessage.value = false;
  }
}
```
As you can see, we push both messages onto the chat history array after each question, and that array is passed back into the chain on the next request. To avoid exceeding the token limit on the LLM prompt, we cap the history at 20 items (10 question/answer pairs).
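If you prefer to keep `askQuestion` lean, the cap-and-trim logic can be factored into a small pure helper. This is a hypothetical refactor of the inline `splice` above, not part of the chain API:

```ts
// Hypothetical helper: append a question/answer pair, then trim the
// oldest pairs until the history is back under the cap.
function pushToHistory(
  history: RelevanceHistoryObject[],
  userMessage: string,
  aiMessage: string,
  maxLength = 20,
): void {
  history.push({ role: 'user', message: userMessage });
  history.push({ role: 'ai', message: aiMessage });
  while (history.length > maxLength) {
    history.splice(0, 2); // drop the oldest question/answer pair
  }
}
```

With this in place, the two `push` calls and the `splice` inside `askQuestion` collapse into a single `pushToHistory(chatHistory.value, question.value, responseAnswer)` call.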
Now, in our template, we can `v-for` loop over the chat history to show the messages:
```html
<div class="w-full flex flex-col space-y-6">
  <span
    v-for="(chatItem, idx) in chatHistory"
    :key="idx"
    class="p-3 font-semibold text-sm rounded-lg flex flex-col"
    :class="
      chatItem.role === 'user'
        ? 'self-end ml-10 bg-gray-200 text-gray-800'
        : 'self-start mr-10 bg-indigo-600 text-white'
    "
  >
    {{ chatItem.message }}
  </span>
</div>
```
Each message is aligned to the end or start of the flex container and colored gray or indigo based on whether the `role` is `user` or `ai`.
We have chat!