You've done the interviews. You have transcripts, session recordings, and sticky notes from a workshop that somehow turned into a forty-tab spreadsheet. Now you need to make sense of it all, preferably before next week's product review.
At this point, most UX researchers reach for one of three tools: Dovetail, Condens, or Skimle. They occupy similar shelf space in the research ops toolkit, but they are not interchangeable. Each was built with a different primary use case in mind, and choosing the wrong one will cost you time rather than save it.
This post is a comparison of all three. We will look at what each tool does well, where it falls short, and which situations call for which tool. If you already know the landscape and want a direct answer, skip ahead to the "Which tool is right for you?" section.
What these tools actually do
All three tools help qualitative researchers move from raw data (transcripts, notes, recordings) to structured insights. But the philosophies behind them differ considerably.
Dovetail started as a repository and tagging platform. Its strength is collaborative analysis: teams can tag quotes together, build shared repositories of research findings, and connect insights to product decisions. It has evolved significantly and now includes AI features, but its DNA is organisational. It is built for research operations at scale, where consistency of process matters as much as depth of any individual study.
Condens positions itself as a lightweight, fast tool for UX researchers who primarily work with video recordings and session notes. Its clip-highlighting workflow is genuinely excellent, and it is one of the easiest tools to get running on day one. The interface is clean and uncluttered. If your research practice revolves around usability testing and you need to pull compelling video highlights quickly, Condens earns its place.
Skimle takes a different approach. Rather than starting from tagging and collaboration, it starts from the analytical problem: how do you extract rigorous, well-structured insights from large volumes of qualitative data? It uses AI to surface themes across documents, supports structured object models for connecting findings across studies, and is designed for researchers who need to go deep rather than fast. You can read more about what Skimle is and how it works.
Where Dovetail leads
Dovetail's collaboration features are genuinely best-in-class. If you have a research team of five or more people tagging and reviewing data together, it handles that workflow more smoothly than either competitor. The ability to build a shared, searchable repository of past findings is also a real advantage for research ops professionals who need to surface previous work before commissioning new studies.
The tagging interface is intuitive and has been refined over many years. Researchers who are used to affinity mapping and sticky-note-style coding will find it familiar. Dovetail has also invested in integrations with tools like Figma, Jira, and Slack, which matters in larger product organisations where research needs to flow into other workflows.
Dovetail's AI features are improving, but they remain oriented towards summarisation and auto-tagging rather than deep thematic discovery. If your research question is "what did people say about feature X?", Dovetail handles it well. If your question is "what underlying patterns explain why users behave this way across different contexts?", the tool does not naturally take you there.
For teams doing continuous discovery, running many small studies and building institutional knowledge over time, Dovetail is a strong choice. Its repository approach rewards sustained investment.
Where Condens leads
Condens is the most accessible entry point for researchers who are new to dedicated analysis tools. The onboarding is fast, the interface is clean, and the video-clipping workflow is excellent. If you are a solo researcher or working in a small team with limited budget and a focus on usability testing, Condens delivers genuine value without a steep learning curve.
The highlight reel feature is a particular strength. Pulling together a short compilation of key moments from user sessions and sharing it with stakeholders would take hours in a video editor; Condens makes it straightforward. For teams where showing is more persuasive than telling, that matters.
Condens is less suited to large-scale document analysis or studies where you are working with text-heavy data (interview transcripts, survey responses, expert interviews). It is optimised for the visual and the immediate rather than for rigorous synthesis across many data sources.
Where Skimle leads
Skimle's advantage is depth of analytical rigour, particularly for interview analysis and other text-heavy qualitative work.
The core difference is structural. Most UX research tools treat analysis as a process of tagging and grouping quotes. Skimle treats it as a process of building a structured understanding of a domain. Its object model approach means that findings are not just tagged snippets but organised entities with relationships between them, which makes cross-study synthesis considerably more powerful.
The AI in Skimle is designed specifically for qualitative analysis rather than general summarisation. It surfaces themes with transparency: every theme and claim links back to the source material, so you can see exactly which quotes or passages support it. This matters for research credibility. If a stakeholder asks "how confident are you in this finding?", you can show your working. The transparency approach in Skimle's AI is deliberately different from black-box summarisation tools.
For researchers working with large datasets (say, forty or more interviews), the structured approach pays off significantly. Analysing interview transcripts at scale manually is time-consuming even with good tagging tools. Skimle handles the volume while preserving the analytical depth. You can also use metadata variables to segment themes by participant characteristics, which is something neither Dovetail nor Condens supports well natively.
For UX teams doing strategic research (generative studies, large-scale needs assessments, foundational research that informs roadmap decisions), Skimle offers more analytical firepower than either competitor.
A direct feature comparison
To make this concrete, here is how the three tools compare on the dimensions that matter most to UX researchers.
Depth of thematic analysis
Skimle is designed for this and does it most thoroughly. Dovetail supports it via tagging but requires more manual effort to get to well-structured themes. Condens is less suited to complex thematic work.
Collaboration and team workflows
Dovetail leads here. It was built for teams and the shared repository model is genuinely useful for larger research operations. Condens supports basic collaboration. Skimle supports multi-user work in the same document but is more oriented to the individual analyst's depth than the team's breadth.
Video and session recording
Condens leads here, with Dovetail a strong second. Skimle is primarily a text-based analysis tool and is not the right choice if video highlights are a core output.
Ease of getting started
Condens is the quickest to get running. Dovetail is well-documented and onboards smoothly. Skimle has more conceptual depth and can be used across different workflows, but that also makes it slightly harder to learn at first.
AI quality and transparency
Skimle's AI is purpose-built for qualitative research and emphasises traceability. Dovetail and Condens both use AI for summarisation and auto-tagging, which is useful but less rigorous. For researchers who need to defend their analysis, Skimle's approach to AI transparency is a meaningful differentiator.
Integration with broader research workflows
Dovetail integrates with the most tools. Skimle supports flexible import and export workflows, including REFI-QDA. Condens has more limited integration options.
Scale
For studies with many documents or interviews, Skimle handles scale better. Dovetail is designed for many small studies over time. Condens works best for contained projects.
What these tools are not designed for
A few common misconceptions worth addressing.
None of these tools replace the researcher's judgment. They all assist with organisation, coding, and synthesis, but the analytical insight still comes from the person doing the research. Tools like ChatGPT might seem like a cheaper alternative for analysis, but they lack the structure and traceability that rigorous research requires.
Dovetail is not primarily a deep analysis tool. It is a research operations platform that happens to include analysis features. If your main need is rigorous thematic analysis rather than repository management, it may not be the right fit even if it is the most popular option on your team.
Skimle is not a UX testing tool. It does not support session recording, prototype testing, or the visual-first workflows that Condens and Dovetail accommodate. If video highlights are a primary deliverable, look elsewhere.
Condens is not suited to large-scale qualitative research. It is a focused tool that does a specific set of things well. Trying to use it for a 60-interview study with complex cross-cutting themes will be frustrating.
Which tool is right for you?
Here are the scenarios where each tool makes the most sense.
Choose Dovetail if:
- You are running a research team of five or more people who need to collaborate on tagging and analysis
- You are building a research repository that stakeholders across the organisation will access
- You need strong integrations with Jira, Figma, Slack, or similar tools
- Your practice is built around continuous discovery with many smaller studies
Choose Condens if:
- Your research is primarily usability testing with session recordings
- You need to produce video highlights for stakeholder presentations
- You are a solo researcher or small team with a modest budget
- Ease of use and quick setup are higher priorities than analytical depth
Choose Skimle if:
- You are conducting generative or strategic research with large volumes of interview data
- You need rigorous thematic analysis with full traceability from themes back to source material
- You are working with market research, expert interviews, or other text-heavy studies at scale
- AI transparency and research credibility are important (for academic, consulting, or policy contexts)
- You want to segment your analysis by participant metadata or combine AI-assisted analysis with manual coding
It is also worth noting that these tools are not mutually exclusive. Some research teams use Dovetail as their repository and Skimle for deeper analytical work on specific projects. Others use Condens for usability testing and Skimle for strategic research. The right combination depends on the range of work your team does.
On AI quality and research credibility
One area where the tools diverge meaningfully is how they handle AI-generated analysis.
All three tools use AI. The question is what the AI is doing and whether you can trust the output.
Summarisation AI (the kind Dovetail and Condens primarily use) is useful for reducing the time spent reading. But summarisation is not analysis. It can compress what people said without telling you what it means, why it matters, or what patterns run across your full dataset.
Skimle's approach is different because it is structured around the analytical problem from the ground up. The qualitative data analysis tools comparison published earlier goes into more detail on the distinctions, but the short version is that AI purpose-built for thematic discovery does fundamentally different work from AI built only for text summarisation.
For UX researchers, the practical implication is this: if you are going to use AI in your analysis, understand what it is actually doing. If the tool cannot show you exactly which data supports each finding, you should be cautious about presenting that finding with confidence. Research credibility depends on auditability. The democratisation of qualitative insights is only genuinely useful if the insights are sound.
A note on switching costs
Whichever tool you choose, switching later has costs. Research repositories accumulate value over time, and moving data between platforms is rarely seamless. It is worth being deliberate about the initial choice rather than defaulting to whatever your team currently uses or whatever is most familiar from a previous role.
The questions that matter most for the long-term choice are: What kind of research do we primarily do? Is it usability-focused or generative? Do we need deep synthesis or fast turnaround? Is AI transparency important to our stakeholders? What does our team look like in two years?
Getting clear on those questions makes the tool choice considerably easier.
About the author
Olli Salo is a former Partner at McKinsey & Company, where he spent 18 years helping clients understand the markets and themselves, develop winning strategies, and improve their operating models. He has conducted over 1,000 client interviews and published more than 10 articles on McKinsey.com and beyond. LinkedIn profile
Frequently asked questions
How do I decide between a tagging-based tool and a theme-discovery tool for UX research?
The distinction comes down to what you need at the end of analysis. Tagging tools (like Dovetail) are excellent when you want to organise and retrieve what participants said about specific topics. Theme-discovery tools (like Skimle) are more suited to understanding why patterns exist and how themes relate to each other across a dataset. If your research questions are mostly "what did users say about X?", a tagging tool may be sufficient. If they are "what underlying needs or mental models explain user behaviour?", a more structured analytical approach will serve you better.
How do I analyse a large set of UX research interviews without losing rigour?
The key is having a consistent analytical framework before you start, not just tagging quotes as you read them. Define your analytical objects (the concepts you are looking for) in advance, use them consistently across all interviews, and treat each pass through the data as an opportunity to refine your framework rather than just add more tags. AI tools can speed this up considerably, but the framework still needs to come from the researcher. Our guide on how to analyse interview transcripts walks through this in detail.
How do I make AI-assisted analysis credible to sceptical stakeholders?
Traceability is the answer. Any AI-generated finding should link directly back to the source quotes or passages that support it. If you cannot show stakeholders the evidence trail from claim to data, they are right to be sceptical. Choose tools that make this trail visible and keep it intact through the reporting process. The two-way transparency approach that Skimle is built around addresses this directly: every insight is auditable.
How do I choose the right sample size for qualitative UX research?
Sample size in qualitative research is driven by the concept of theoretical saturation: you have enough data when new interviews stop introducing new themes. In practice, this is between 12 and 30 interviews for most focused UX research questions, though generative studies can warrant more. The specifics depend on the diversity of your participant pool and the complexity of your research question. Our qualitative research sample size guide covers this in more detail, including how to think about saturation in practice.
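To make the saturation idea concrete, here is a toy simulation (not tied to any of the tools discussed, and the pool size and draw rate are illustrative assumptions): each interview surfaces a handful of themes drawn from a fixed pool, and the number of genuinely new themes per interview tapers off as the pool is exhausted.

```python
import random

def simulate_saturation(theme_pool=40, themes_per_interview=6,
                        interviews=30, seed=1):
    """Toy model of theoretical saturation: each interview surfaces
    a few themes from a fixed pool; track how many are new each time."""
    rng = random.Random(seed)
    seen = set()
    new_per_interview = []
    for _ in range(interviews):
        drawn = rng.sample(range(theme_pool), themes_per_interview)
        new = [t for t in drawn if t not in seen]
        seen.update(new)
        new_per_interview.append(len(new))
    return new_per_interview

counts = simulate_saturation()
# Early interviews yield mostly new themes; later ones yield few or none.
print(counts)
```

In this simplified model the first interview is all new themes and later interviews contribute less and less, which is the shape practitioners look for when judging saturation. Real studies are messier (themes are not equally likely, and participant diversity matters), so treat the 12-to-30 range as a planning heuristic, not a rule.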
How do I combine AI-assisted analysis with my existing manual coding workflow?
Most researchers find a hybrid approach works well. Use AI to do an initial pass across all your data, identifying candidate themes and surfacing relevant passages. Then review and refine that structure manually, applying your own judgement to what the AI has surfaced. For researchers who want to move outputs into NVivo or MAXQDA, Skimle supports REFI-QDA export, which means you are not locked into a single tool. The goal is to use AI where it saves time (processing volume) while keeping human judgement where it matters most (interpreting meaning).
