Read time: 5 mins
Online qualitative research gives us incredible access to real human stories. People open their lives, their routines, sometimes their grief or identity or pride. And now AI has entered that space too. Respondents are not hiding it. Many use it. Quietly. Strategically. Sometimes because the task feels like an emotional marathon they did not realize they signed up for.
Key Takeaways:
- AI is not the problem—it’s a signal. When respondents use AI to manage emotional fatigue, it reveals where research design may be demanding too much.
- Emotional depth requires intentional care. Without support, qualitative research can unintentionally push participants into distress rather than insight.
- Fieldwork needs a voice early. Involving those closest to respondents before launch helps build transparency, recovery space, and safer emotional engagement.
AI is not the problem. It is a signal.
Qual can go deep. Our responsibility is to make sure people don’t break in the process.
“The challenge is that AI can hide when someone is struggling. It can smooth the edges. Fill silence. Make a tired or hurting person sound beautifully reflective.”
Jay Thordarson, VP, Research Services, The Logit Group
People turn to it when the question goes deeper than they have energy for that day. When vulnerability starts to feel like a requirement. When they want to sound composed instead of overwhelmed. When design begins to reward performance instead of honesty. And yes, sometimes simply because writing long emotional reflections after working, parenting or caregiving is a lot.
The challenge is that AI can hide when someone is struggling. It can smooth the edges. Fill silence. Make a tired or hurting person sound beautifully reflective. Someone can be unraveling privately and still appear articulate because a tool is doing the emotional lifting. That is not insight. That is survival.
And here is where fieldwork feels it first. Not because we see the tasks. In most cases, we do not. We know something is off only when a participant messages us and says, “I am trying, but this is a lot,” or “I did not expect to feel like this,” or “I think I need to withdraw.”
In quant, we have the instrument and can flag issues before launch. In qual, we are supporting participants without the emotional roadmap. We hear strain only when someone reaches a breaking point. At that moment, fieldwork becomes emotional first aid instead of proactive care.
There is also a reality about representation we rarely name. Emotional labor is not evenly distributed. Some respondents walk in already carrying the weight of systems, expectations and private battles. Patients in survival mode. Caregivers running on fumes. People who have had to explain and defend their experiences far too often in real life. When research asks them to revisit those moments without warning or support, we are not collecting insight. We are asking the already-exhausted to dig deeper than they should have to.
And as AI becomes part of everyday respondent behavior, it will get harder to tell who is sharing real emotion and who is protecting themselves from emotional overload. It will become easier to sound vulnerable than to be vulnerable. That burden will fall on fieldwork unless we build transparency and collaboration up front. The teams closest to respondents are already doing emotional triage. Imagine what happens when they also get visibility, support and a seat at the table before the first prompt goes live.
So here is the ask. Let us build this together. Loop in the people closest to respondents before launch, not after distress emails arrive. Share the guide. Invite feedback. Design emotional depth with intention and recovery space. Remember that while AI can analyze feeling, it cannot feel the cost of expressing it. If the future of qual blends humanity and technology, then our responsibility is to keep the humanity intact.
Confession time:
Have you ever seen a respondent go emotionally deeper than the study needed? Or watched a task accidentally turn into therapy homework? What did you do? What do you do now?
No judgment. We have all been there. The goal is not blame. The goal is a safer journey for the people who trust us with pieces of their lives for a few hours at a time.
About The Author
Jay Thordarson
VP, Research Services
Jay is an accomplished market research professional with extensive experience in global qualitative and face-to-face research.