Setting up a dummy Q&A chain

Before we dive into our main Q&A chain, let’s set up a simple one to see how it works and start to build out some frontend logic.

I’m going to jump into my Relevance AI notebook and create a demo chain:

It comprises two simple pieces: on the left we set up an input param for the “question”, and on the right we set up a single LLM chain step.

Note the `{{ }}` variable syntax: it means the question input param will be injected into the prompt at that spot.
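For example, a prompt template for the LLM step might look something like this (the exact wording is up to you; `question` is the input param name we created above):

```
Answer the following question as helpfully and concisely as you can.

Question: {{question}}
```

When the chain runs, the value of the question param is substituted into the template before the prompt is sent to the LLM.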

Let’s set up an output for this chain called `answer`, which returns the answer variable from our LLM step. This means our deployed chain API will return the LLM answer at `output.answer`.
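To make that concrete, here’s a rough sketch of the response our frontend can expect back from the deployed chain. The answer text and timing below are made up for illustration; the field names match the `ChainApiResponse` interface we’ll define shortly, but check your Deploy page for the exact shape.

```typescript
// Rough sketch of a chain run response (illustrative values only).
const exampleResponse = {
	status: 'complete' as const,
	errors: [] as Record<'body', string>[],
	output: { answer: 'Vue is a progressive JavaScript framework.' },
	executionTime: 1200,
};

// The deployed API surfaces the LLM result at output.answer:
const answer = exampleResponse.output.answer;
```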

To run this, you'll need to grab your OpenAI API key and add it to the API Keys section in the sidebar. Additionally, add your Redis connection string while you're there, as we'll need it later on. If you don’t, make sure to come back to this step and add it before running the final chain we create!

Head over to the “Deploy” tab in the top navigation, and click the big blue Deploy button. Now you have a deployed API for this chain!

Let's use this chain in our frontend to send questions and receive answers from OpenAI!

We’ll update the script tag in Chain.vue to handle this new API logic. Eventually we can refactor it out into a composable, but for now let’s test it inline.

const error = ref<string | null>(null);
const isLoadingMessage = ref<boolean>(false);

/** Retrieve from the "Deploy" page of your chain notebook */
const CHAIN_URL = ''; // paste your deployed chain's endpoint URL here

/** What key in your chain's output should be considered the answer */
const ANSWER_OUTPUT_KEY = 'answer';

/** What param key does your chain expect as the user's question */
const QUESTION_PARAM_KEY = 'question';

interface ChainApiPayload {
	project: string;
	params: Record<string, any>;
	version?: string;
}

interface ChainApiResponse {
	status: 'complete' | 'failed';
	errors: Record<'body', string>[];
	output: Record<string, any>;
	executionTime: number;
}

const question = ref<string>('');
const answer = ref<string | null>(null);

And now we’ll update our askQuestion function to hit our chain API with the details provided in the deploy page:

async function askQuestion() {
	try {
		error.value = null;
		isLoadingMessage.value = true;
		const payload: ChainApiPayload = {
			params: { [QUESTION_PARAM_KEY]: question.value },
			project: '8d4274a70a43-4f8e-b88a-23c927b58c1d',
		};
		const response: ChainApiResponse = await $fetch(CHAIN_URL, {
			method: 'POST',
			body: payload,
		});
		const responseAnswer = response.output[ANSWER_OUTPUT_KEY];
		answer.value = responseAnswer;
	} catch (e) {
		error.value = 'Unfortunately, there was an error asking your question.';
	} finally {
		isLoadingMessage.value = false;
	}
}
Note that we are passing question.value into the payload under the QUESTION_PARAM_KEY key, which matches the question input param we set up in the chain.
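The mapping from our component state to the API payload can be sketched as a small pure helper. This is hypothetical (our askQuestion function builds the object inline), but it shows the shape the chain expects:

```typescript
interface ChainApiPayload {
	project: string;
	params: Record<string, any>;
	version?: string;
}

// Hypothetical helper mirroring the inline payload construction.
// The key inside params must match the input param name defined in
// the chain notebook ('question' in our demo chain).
function buildPayload(question: string, project: string): ChainApiPayload {
	return {
		project,
		params: { question },
	};
}
```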

With all of this in place, our basic little Q&A UI now actually does what it advertises!