Voice of customer research: how to build a VoC programme that actually influences decisions

Voice of customer research captures customers' expectations and experiences to drive product, marketing, and retention decisions. Learn how to design, collect, and analyse VoC data at scale.


Voice of customer (VoC) research is a structured approach to capturing what customers expect, experience, and value — and feeding those insights into decisions about products, services, pricing, and messaging. To do it well, you need a mix of data sources, a consistent method for analysing qualitative feedback, and a clear path from insight to action. The commercial case is well documented: organisations that reach out to customers directly and act on what they hear consistently report lower churn, and according to Forrester, 84% of businesses that improved customer experience based on VoC data reported increased revenue.

Despite this, only 12% of CX professionals rate their VoC programme's maturity as high or very high, according to Forrester's research. Most programmes are collecting data — surveys, NPS, support tickets — without the analytical infrastructure to turn it into actionable insight. The volume is there; the synthesis is not.

This guide covers the four main VoC data sources, how to design a VoC interview for depth, how to analyse feedback at scale, and how to build a programme that product and commercial teams will actually use.

What are the 4 main types of VoC data?

Effective VoC programmes draw from multiple sources because each captures a different type of customer signal. No single source is sufficient on its own.

| Data source | What it captures | Strengths | Limitations |
| --- | --- | --- | --- |
| Customer interviews | Motivations, workarounds, unmet needs, emotional context | Rich, contextual, exploratory | Time-intensive; requires skill to avoid leading |
| Structured surveys (NPS, CSAT, CES) | Satisfaction scores, specific event ratings | Fast and scalable; trackable over time | Closed-ended; doesn't explain the "why" |
| Support tickets and reviews | Pain points, feature requests, failures | Unprompted; large volume; real language | Negatively biased; limited positive signal |
| AI-powered interviews (e.g. Skimle Ask) | Contextual feedback at scale, without scheduling overhead | Combines depth with breadth; 24/7; consistent questioning | Less flexible than human-led; not suited for highly sensitive topics |

The most useful VoC programmes use qualitative sources (interviews, AI interviews, open-text feedback) for depth and understanding, and quantitative sources (surveys, tickets, reviews) for volume, frequency, and trend-tracking. Neither alone gives you the full picture.

Why most VoC programmes fail to influence decisions

The problem is almost never data collection. Most organisations are swimming in customer feedback. The problem is synthesis — the ability to turn hundreds of individual comments into clear, confident insights that product managers and executives will act on.

Four patterns explain most VoC programme failures:

1. Over-reliance on NPS. Net Promoter Score is a useful health metric, but "our NPS is 42" tells you nothing about why customers are unhappy or what would make them more likely to stay. NPS should be a trigger for investigation, not the investigation itself.

2. Qualitative data sits in silos. Call recordings are in Gong, support tickets in Zendesk, interview notes in a shared drive, survey open-text in a spreadsheet. Nobody synthesises across these systematically because it is too time-consuming. This is the problem Skimle is built to solve: import the full set of transcripts, call notes, and open-text responses into one project, and automatic thematic analysis surfaces the patterns across all of them — with every insight traceable back to the source quote.

3. Research is episodic, not continuous. A quarterly customer research sprint produces insights that are already 3 months old by the time they reach the product team. Fast-moving businesses need a continuous signal, not periodic deep dives.

4. Insights don't travel. Research findings sit in a slide deck that gets presented once and then filed. Product teams, marketing, customer success — none of them have access to the underlying evidence. See our guide on building a research repository that people actually use for how to solve this.

How to design a VoC interview for depth

Surveys tell you what customers chose; interviews tell you why. A good VoC interview is not a satisfaction survey with open questions — it is a structured exploration of the customer's situation, goals, and experience.

The most effective VoC interview structure:

1. Context (5-10 minutes): Start with the customer's role, their day-to-day workflows, and their goals. "Walk me through how you typically handle [the problem your product solves]." This surfaces the context in which your product sits, and often reveals competing priorities and constraints you would not see from product usage data.

2. Current experience (10-15 minutes): "Tell me about a recent time you used [product or process] to do X. What prompted you to do it that way? What happened? What got in the way?" Specific recent events produce concrete, useful data. Generic questions ("what do you think about X?") produce vague, unreliable answers.

3. Unmet needs and workarounds (5-10 minutes): "What's the part of this that you wish worked differently? What do you do when it doesn't work the way you need?" Workarounds are the most revealing signal in VoC research — they show where customers have given up waiting for you to solve the problem and built their own solution.

4. Priorities and trade-offs (5 minutes): "If you could change one thing, what would it be? What would that be worth to you?" These questions connect the insight to business value.

For a deeper look at interview technique, see our guides on how to conduct effective business interviews and how to write a perfect interview guide.

How to collect VoC at scale

Individual interviews give you depth; scale gives you confidence. A mature VoC programme combines periodic deep-dive interviews with continuous lighter-weight signals.

Periodic interviews (quarterly or per-cohort): 15-25 interviews with customers across segments, tenure, and use patterns. These produce the deep understanding of motivations, workarounds, and unmet needs. Analyse themes across the full corpus, not just the most memorable quotes.

Continuous AI-powered interviews: Skimle Ask allows you to run structured AI-led conversations with customers triggered by product events (onboarding completion, upgrade, churn risk, support ticket resolution). Each conversation follows a consistent guide, and responses feed directly into analysis. This creates a continuous stream of qualitative data that surfaces emerging issues before they show up in NPS drops. See gathering rich data with AI interviews for more on how this works.

Passive feedback mining: App store reviews, G2 and Capterra reviews, support tickets, and community forum posts are all sources of unprompted customer language. Analysing these at scale requires a different approach — see how to analyse open-text responses at scale for the practical workflow.
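To make the idea concrete, here is a deliberately simple Python sketch of frequency-counting themes in unprompted review text. The reviews and keyword map below are invented for illustration; in real passive-feedback analysis the theme taxonomy should come from coding the data itself (or from a tool), not from a hand-written keyword list:

```python
from collections import Counter

# Hypothetical review snippets standing in for app-store or G2 exports.
reviews = [
    "Love the product but exporting reports is painfully slow.",
    "Support was great. Export to PDF still broken though.",
    "Pricing feels high for what you get.",
    "Exports keep timing out - had to copy data by hand.",
]

# Toy keyword map; a real taxonomy would come from thematic analysis.
theme_keywords = {
    "exporting": ["export"],
    "pricing": ["pricing", "price", "expensive"],
    "support": ["support"],
}

def count_theme_mentions(texts, keywords):
    """Count how many texts mention each theme at least once."""
    counts = Counter()
    for text in texts:
        lowered = text.lower()
        for theme, words in keywords.items():
            if any(w in lowered for w in words):
                counts[theme] += 1
    return counts

counts = count_theme_mentions(reviews, theme_keywords)
print(counts.most_common())
```

Keyword matching like this only catches literal phrasing, so treat it as a first-pass volume estimate before proper thematic coding.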

For a detailed comparison of whether AI or human interviews are better for your VoC goals, see AI interviewing vs human interviewing.

How to analyse VoC data

The analysis framework that works best for VoC research is organised around what customers are trying to accomplish, not what they said about your product. For teams with more than a handful of interviews, doing this manually — reading through transcripts, building a theme taxonomy, cross-referencing quotes — quickly becomes the bottleneck that stalls the whole programme. Loading your interview transcripts and open-text responses into Skimle and running thematic analysis gives you a coded, searchable structure across the full corpus in hours rather than days, with the underlying quotes visible for every theme so you can verify and refine the findings.

A useful structure for coding VoC interviews:

Jobs — What is the customer trying to achieve? Functional jobs ("process invoices faster"), emotional jobs ("feel confident I'm not missing anything"), social jobs ("look effective to my team"). The job defines the context of value.

Outcomes — How does the customer know when the job is done well? What does success look and feel like? What does failure look and feel like?

Constraints — What gets in the way? Technical, organisational, skill-related, time-related.

Language — How do customers describe the problem in their own words? This is the raw material for positioning and messaging. The specific phrases customers use to describe their frustration or their success are more valuable than anything a copywriter invents.

Across a dataset of 20-30 interviews, patterns in the above dimensions become the substance of your VoC insight. Skimle's metadata analysis lets you compare these patterns across customer segments — by company size, tenure, industry, or role — so you can see whether the main job for enterprise customers is different from what SMB customers care about.
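A hypothetical sketch of what that segment comparison looks like as data: each coded interview is a record carrying a segment label (metadata) and the set of themes coded in it, and the comparison itself is a simple cross-tabulation. The segments and themes below are invented for illustration:

```python
from collections import defaultdict

# Invented coded interviews: segment metadata plus coded themes.
coded_interviews = [
    {"segment": "enterprise", "themes": {"compliance reporting", "slow exports"}},
    {"segment": "enterprise", "themes": {"compliance reporting", "sso setup"}},
    {"segment": "smb", "themes": {"pricing", "slow exports"}},
    {"segment": "smb", "themes": {"onboarding", "pricing"}},
]

def theme_by_segment(interviews):
    """Cross-tabulate: interviews per segment mentioning each theme."""
    table = defaultdict(lambda: defaultdict(int))
    for record in interviews:
        for theme in record["themes"]:
            table[theme][record["segment"]] += 1
    return {theme: dict(segments) for theme, segments in table.items()}

table = theme_by_segment(coded_interviews)
```

Even in this toy version, the shape of the insight is visible: "compliance reporting" is an enterprise theme, "pricing" an SMB theme, and "slow exports" cuts across both.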

How to turn VoC into output that gets used

VoC research only adds value if it changes decisions. The two most useful outputs:

Customer insight brief (2-4 pages): A written synthesis of the main jobs, outcomes, and constraints across your research cohort. Concrete, specific, with direct quotes. Designed to be read in 10 minutes by a product manager or commercial leader. Not a slide deck — a document.

Job story repository: A structured collection of anonymised quotes and coded insights, organised by job and segment. This should be searchable and accessible to the full product and marketing team, not locked in a research drive. When a PM is writing a spec, they should be able to pull the 5 most relevant customer quotes in two minutes.

For academic and professional researchers doing consumer insights work, the qualitative consumer insights guide covers the full research workflow in more detail.

Frequently asked questions

How many customer interviews do you need for a VoC programme?

For initial discovery of the main customer jobs and pain points, 15-20 interviews across your main segments will surface the core patterns. For ongoing VoC, 8-10 interviews per cohort (segment × time period) is a reasonable cadence. The goal is to notice when something new appears, not to achieve statistical significance.
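"Noticing when something new appears" can be made operational by tracking how many previously unseen themes each successive interview contributes; when that number stays at zero, you are approaching saturation. A minimal Python sketch with invented theme sets:

```python
def new_themes_per_interview(theme_sets):
    """Count how many previously unseen themes each interview adds."""
    seen, new_counts = set(), []
    for themes in theme_sets:
        new_counts.append(len(themes - seen))
        seen |= themes
    return new_counts

# Invented theme sets from four successive interviews.
interviews = [
    {"pricing", "onboarding"},
    {"pricing", "exports"},
    {"exports", "onboarding"},
    {"pricing"},
]
print(new_themes_per_interview(interviews))  # [2, 1, 0, 0]
```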

Is VoC research the same as user research?

They overlap but are not identical. User research (UX research) focuses on how people use a product — usability, task completion, mental models. VoC research is broader: it captures why customers chose you, what jobs they are trying to do, what they value, and what would make them leave. UX research informs design decisions; VoC research informs product strategy and commercial decisions.

What is the difference between VoC and NPS?

NPS measures one thing: the likelihood that a customer will recommend you. It is a lagging indicator of satisfaction. VoC research explains the drivers behind that score — why promoters are promoters, why detractors are detractors, and what would move customers from one category to the other. NPS without VoC qualitative data tells you there is a problem without telling you what it is.
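For reference, the NPS arithmetic itself is standard: the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6), with passives (7-8) ignored. That it compresses everything to one number is exactly why it needs qualitative VoC data behind it. A sketch with illustrative scores:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Ten illustrative 0-10 ratings: 4 promoters, 3 passives, 3 detractors.
scores = [10, 9, 9, 8, 7, 7, 6, 5, 3, 10]
print(nps(scores))  # 10
```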

Can VoC research replace market research?

They serve different purposes. VoC research focuses on existing customers — their experience, their jobs, their workarounds. Market research typically has a broader scope: understanding market size, non-customers, competitive positioning, and segment definition. For most B2B SaaS companies, VoC is more immediately actionable than market research. For companies entering new markets, traditional market research is also necessary.

How do you handle customer confidentiality in VoC research?

For B2B interviews, agree on what can and cannot be attributed before you start. Most customers are comfortable with anonymised aggregation ("several enterprise customers mentioned that...") but uncomfortable with direct attribution. Keep a clear record of consent. For AI-interview programmes like Skimle Ask, tell participants in the product flow, before they begin, how their responses will be used.


Ready to collect and analyse customer feedback at scale with full traceability? Try Skimle for free — process interview transcripts, analyse Skimle Ask conversations, and slice findings by customer segment with metadata.

About the authors

Henri Schildt is a Professor of Strategy at Aalto University School of Business and co-founder of Skimle. He has published over a dozen peer-reviewed articles using qualitative methods, including work in Academy of Management Journal, Organization Science, and Strategic Management Journal. His research focuses on organisational strategy, innovation, and qualitative methodology. Google Scholar profile

Olli Salo is a former Partner at McKinsey & Company, where he spent 18 years helping clients understand their markets and themselves, develop winning strategies, and improve their operating models. He has conducted over 1,000 client interviews and published over 10 articles on McKinsey.com and beyond. LinkedIn profile

