How to use AI follow-up questions

Learn what AI follow-up questions are, and how to use them in your research studies.


What are AI follow-up questions?

AI-generated follow-up questions let you automatically ask relevant follow-up questions after a block in your study, helping you clarify participant responses and dig deeper.

AI follow-up questions can be added to any Rate, Choose, Explore or Image Test block.

What will the follow-up questions focus on?

By default, the AI will ask follow-up questions that help you better understand the participant's response, prompting them to provide more detail in their answer.

Guide the AI follow-up questions with a prompt

If you'd like to steer the AI-generated follow-up questions in a particular direction, you can do so by providing an instruction to the AI in the What would you like your follow-ups to focus on? field.

Some examples of instructions you can use are:

  • Goal: Dig deeper if a specific thing (e.g. "shopping at Walmart") is mentioned.
    Example prompt: "If they mention that they visited Walmart, ask for specific details about what they bought"

  • Goal: Understand why the participant selected a specific option in a Choose or Rate block.
    Example prompt: "Reasons why the participant selected that option"

  • Goal: Dig deeper into an initial response when recalling a past experience.
    Example prompt: "Try to understand what the participant was trying to achieve"

  • Goal: Gather the initial impressions and reactions of participants after showing them an image.
    Example prompt: "Focus on their initial impressions or reactions"

  • Goal: Identify what information could be missing from an image you're showing the participant.
    Example prompt: "How might we make the design clearer"

Tips for writing good prompts to guide the AI follow-ups

Wondering's AI-generated follow-ups are powered by large language models, which means certain prompt formats are especially effective at guiding the AI to ask the follow-up questions you want covered. Here are some tips for writing prompts that we find help you control how the AI follow-ups are asked.

We've found the prompt formats in this article work well for guiding the AI on what questions to ask, but feel free to explore prompts beyond those recommended in this guide.

Be clear, concise and direct

When writing a prompt to guide AI follow-up questions on Wondering, aim for clear, concise instructions. Think of Wondering as a research assistant you've asked to pose questions on your behalf, who has no context on what you want beyond what you explicitly tell them in your study blocks and this prompt. The less they have to guess about what you want, the more likely they are to ask follow-up questions in a format you're happy with.

Be specific, descriptive and as detailed as possible about your desired content, length, style, etc.

If you have specific requirements in mind for how the AI should ask follow-up questions, include those details in your prompt. For example, a prompt like "Ask one short, friendly follow-up about what they found confusing" is more likely to produce the style you want than a vague instruction.

Ask yourself if a colleague would understand how to ask follow-up questions based on your prompt

When writing your prompts, ask yourself whether a colleague could follow the instructions in your prompt and (with enough time) ask good follow-up questions that you'd be happy with. If your colleague would likely be confused, the Wondering AI probably will be too.

Avoid imprecise descriptions

The more precise your descriptions are, the better Wondering's AI can generate follow-up questions that match them.

Don't be afraid to test and iterate

The best way to check whether a prompt gives you follow-up questions you're happy with is to try it out in the preview in the study builder. You can then iterate on your prompt if there's something you'd like to change.

Previewing your AI-generated follow-up questions

To make sure that follow-up questions get asked in the way you want, you can preview how your study will appear to your participants in the Questions preview on the right-hand side of the study builder, or by using the Preview button in the top-right corner.

How many follow-up questions will the AI ask?

When setting up a block, you can pick the maximum number of follow-up questions using the Max number of AI follow-ups setting.

The AI will ask at most that many follow-ups; if the participant has already given a comprehensive answer, the study will move on automatically.

When should I add AI follow-up questions to my study block?

Follow-ups can be added to all blocks in the study builder. For more information on each block type, check out this guide.

Add AI follow-ups when you want to gather more information about your participant's response than you think you would get from just the initial question in a block. Some typical use cases are:

  • Explore blocks - if you're asking participants a question about their experiences, they'll sometimes only provide a high-level answer at first. AI-generated follow-up questions allow you to ask clarifying questions that give you more information.

  • Rate or Choose blocks - when asking a rating or multiple-choice question, it's often helpful to understand why your participant answered the way they did. Adding an AI-generated follow-up question here allows you to automatically dig deeper and find out the reasons behind their answer.

  • Show blocks - when asking participants to review a design, it's often helpful to start with an open-ended question to gather their unguided feedback. Then you can use AI-generated follow-up questions to dig deeper.
