Adding some frontend sugar

We already have a ChatGPT-style interface for our PDF documents. Now, it's time to add some love to the frontend!

The column-reverse trick for our scroll container

First, we want to make the UI resemble a chat interface more closely. This means the input field should sit at the bottom, and whenever a new message is added, the container should automatically scroll down to show the latest message.

Many would reach for JavaScript here, and do something like a scrollIntoView call after each message DOM node is created. However, I have been around the block with chat UIs!

The best way to do this is with pure CSS, and it’s actually super simple.

All you need to do is add a reverse flex to your scroll container:

display: flex; flex-direction: column-reverse;

This reverses the container’s default scroll behaviour, anchoring the scroll position to the bottom.

Place your actual content in a non-reversed column container nested directly inside the reversed scroll container, so it’s displayed in the normal order.

<div class="overflow-y-auto flex flex-col-reverse">
	<div class="flex flex-col">
		<!-- your content -->  
	</div>
</div>

In our case, this looks like:

<div
  class="p-6 mt-14 h-full flex flex-col-reverse overflow-y-auto"
>
    <PartsChain />
</div>

And PartsChain contains our list of messages. Easy! No JavaScript needed.

You can see here how there is an overflow of messages, but the scroll is anchored to the bottom:

Creating an optimistic UI

An awkward part of the current chat’s UX is that when you press enter to send a message, nothing happens immediately other than the loading state appearing.

This is because messages are added to the chat history (and therefore displayed) only after we await the chain API. We do this because we need to feed the chat history into the API request *before* updating it.

To solve this, we will need to maintain two arrays:

  • one for v-for-ing over, where we can optimistically push the user’s message before hitting the chain API
  • one for sending to the chain, to which messages are pushed only after the chain has run

In a similar fashion, we have also cloned the user input so that we can clear it from the UI before sending the API request.

This now looks like:

/** Chat history to send to API */
const chatHistory = ref<RelevanceHistoryObject[]>([]);

/** Rendered in our v-for loop; allows us to create an optimistic UI */
const uiChatHistory = ref<RelevanceHistoryObject[]>([]);

async function askQuestion() {
    try {
        error.value = null;
        isLoadingMessage.value = true;

        // optimistically clear the question from the UI
        const clonedUserInput = question.value;
        question.value = '';

        // optimistically push to the UI history
        uiChatHistory.value.push({
            role: 'user',
            message: clonedUserInput
        });

        const payload: ChainApiPayload = {
            params: {
                [CHAIN_QUESTION_PARAM_KEY]: clonedUserInput,
                // do not pass an empty history array
                [CHAIN_HISTORY_PARAM_KEY]: chatHistory.value?.length ? chatHistory.value : undefined,
            },
            project: CHAIN_PROJECT,
        };

        const response: ChainApiResponse = await $fetch(CHAIN_URL, {
            method: 'POST',
            body: payload,
        });

        const responseAnswer = response.output[CHAIN_ANSWER_OUTPUT_KEY];

        const MAX_HISTORY_LENGTH = 20;

        // cap the history before we push 2 new items below
        if (chatHistory.value.length >= MAX_HISTORY_LENGTH) {
            // remove the oldest question/answer pair
            chatHistory.value.splice(0, 2);
        }
         
        chatHistory.value.push({
            role: 'user',
            message: clonedUserInput
        });

        chatHistory.value.push({
            role: 'ai',
            message: responseAnswer.answer,
        });

        uiChatHistory.value.push({
            role: 'ai',
            message: responseAnswer.answer,
            // store references for display in UI
            references: responseAnswer.references
        });
      
    } catch (e) {
        console.error(e);
        error.value = 'Unfortunately, there was an error asking your question.';
    } finally {
        isLoadingMessage.value = false;
    }
}
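The history-capping step in askQuestion can also be factored into a small pure helper, which makes the "room for two more items" reasoning easy to test in isolation. This is just a sketch; the trimHistory name and the standalone HistoryObject interface are mine, not part of the original code:

```typescript
interface HistoryObject {
    role: 'user' | 'ai';
    message: string;
}

/**
 * Drop the oldest question/answer pairs until there is room
 * to push two more items without exceeding maxLength.
 */
function trimHistory(history: HistoryObject[], maxLength = 20): HistoryObject[] {
    const trimmed = [...history];
    while (trimmed.length + 2 > maxLength) {
        // remove the oldest user/ai pair
        trimmed.splice(0, 2);
    }
    return trimmed;
}
```

Because messages are always pushed in user/AI pairs, trimming two at a time keeps the history aligned on complete exchanges.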

A pretty loading indicator

Finally, we’ve set up a “chat bubbles” loading indicator, using some Tailwind and a bit of inline CSS. You could abstract these into classes, or use Tailwind’s JIT class generation instead of inline styles; but hey, this is easier to understand.

<!-- "Typing" loading indicator -->
<div
    v-if="isLoadingMessage"
    class="p-1.5 rounded bg-gray-200 text-gray-400 flex items-center space-x-2 self-start"
>
    <div
        class="bg-current p-1 rounded-full animate-bounce"
        style="animation-delay: 0.1s"
    ></div>
    <div
        class="bg-current p-1 rounded-full animate-bounce"
        style="animation-delay: 0.2s"
    ></div>
    <div
        class="bg-current p-1 rounded-full animate-bounce"
        style="animation-delay: 0.3s"
    ></div>
</div>
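If you do want to abstract the inline styles into classes, one possible shape (the typing-dot class name is made up for illustration) is a single class whose delay is staggered via nth-child:

```css
/* Hypothetical utility class replacing the inline animation-delay styles */
.typing-dot:nth-child(1) { animation-delay: 0.1s; }
.typing-dot:nth-child(2) { animation-delay: 0.2s; }
.typing-dot:nth-child(3) { animation-delay: 0.3s; }
```

Alternatively, Tailwind’s JIT engine accepts arbitrary properties directly in the class list, e.g. a class like [animation-delay:0.1s] on each dot.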

We stagger the animations so they bounce sequentially, like a mini Mexican wave.

With all of this, we now have a beautiful chat interface!