
NPS survey: a complete guide

The question, the math, the benchmarks, and what to do with the answer.

9 min read · Updated April 29, 2026

Net Promoter Score is one question, a small piece of math, and a lot of opinions about what to do with the answer. Used well, it tracks something real about your customers' loyalty over time. Used badly, it becomes a number that drifts loose from the business and ends up in a slide nobody trusts.

The question — and why the wording matters

The standard NPS question is: "How likely are you to recommend [product or company] to a friend or colleague?" answered on a 0–10 scale where 0 is "not at all likely" and 10 is "extremely likely." That phrasing is load-bearing. Substituting "would you recommend" or "would you buy again" produces a different metric — fine to track, but no longer comparable to NPS benchmarks.

Use it as a relationship survey (sent on a cadence to all customers) or a transactional survey (sent after a specific event). Most teams run both, with different follow-up questions, and never average them together.

The math

Bucket the 0–10 responses:

  • Promoters — scores of 9 and 10.
  • Passives — scores of 7 and 8. They count for nothing in the calculation.
  • Detractors — scores of 0 through 6.

NPS is the percentage of promoters minus the percentage of detractors. Passives drop out. The result is a number from -100 to +100. Anything positive means more promoters than detractors; anything above 30 is generally considered strong; anything above 50 is excellent in most categories. Software, retail, and B2B services all sit at different baselines, so compare yourself to your industry rather than the absolute number.
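The bucketing and subtraction above fit in a few lines. A minimal sketch in Python (function name and example data are illustrative):

```python
def nps(scores):
    """Compute Net Promoter Score from a list of 0-10 responses."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)   # 9-10
    detractors = sum(1 for s in scores if s <= 6)  # 0-6
    # Passives (7-8) drop out of the numerator but stay in the denominator.
    return 100 * (promoters - detractors) / len(scores)

# 5 promoters, 3 passives, 2 detractors out of 10 responses:
# 50% promoters - 20% detractors = NPS of 30
print(nps([10, 9, 9, 10, 9, 8, 7, 7, 4, 6]))  # 30.0
```

Note that passives still appear in the denominator: ten promoters plus ninety passives is an NPS of 10, not 100.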

The follow-up question is where the value lives

The score alone is a thermometer. The follow-up question is the diagnosis. Always pair NPS with one open-ended "why?" question. Branching the follow-up by score range produces sharper data:

  1. To detractors: "What's the main reason for your score?" — this surfaces the actual problem.
  2. To passives: "What would have made it a 9 or 10?" — this surfaces the missing piece.
  3. To promoters: "What did you like most?" — this surfaces the message your marketing should be using.
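In a survey tool that supports logic branching, the routing above is a simple three-way split on the score. A hedged sketch (the question strings mirror the branches above; the function name is illustrative):

```python
def follow_up(score):
    """Pick the branched follow-up question for a 0-10 NPS response."""
    if score >= 9:                       # promoter
        return "What did you like most?"
    if score >= 7:                       # passive
        return "What would have made it a 9 or 10?"
    return "What's the main reason for your score?"  # detractor
```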

Tag and theme the open-text answers monthly. The themes are more useful than the score itself, especially when the score moves and you need to explain why. For a comparison of NPS against the other two big customer-feedback metrics, see CSAT vs NPS vs CES.

Cadence and sampling

Send relationship NPS no more than twice a year per customer; quarterly is too often and trains your audience to stop responding. Transactional NPS — after a support ticket, an onboarding milestone, or a renewal — fires when the event happens, capped at one survey per customer per month so you don't burn the list.

Sample size matters. With fewer than a hundred responses per period, the score swings on the noise of a few extra detractors. If your customer base is small, lengthen the window before you compare quarter-over-quarter, or report a moving average. The customer feedback survey templates include NPS variants you can adapt.
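One way to implement the moving average is to pool the raw responses from the last few periods and score the pool, rather than averaging the period scores themselves. A minimal sketch, assuming each period's responses are available as a list of 0–10 scores (function and parameter names are illustrative):

```python
from collections import deque

def rolling_nps(periods, window=2):
    """NPS per period, computed over the raw responses from the
    last `window` periods pooled together, so a small period's
    handful of detractors doesn't swing the reported number."""
    recent = deque(maxlen=window)
    out = []
    for scores in periods:
        recent.append(scores)
        pooled = [s for period in recent for s in period]
        promoters = sum(1 for s in pooled if s >= 9)
        detractors = sum(1 for s in pooled if s <= 6)
        out.append(100 * (promoters - detractors) / len(pooled))
    return out
```

Pooling responses weights each period by its response count, which is usually what you want when volumes are uneven.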

Acting on the answers

NPS is worthless unless someone follows up. The minimum closed-loop process:

  • Detractor outreach — within forty-eight hours, a real person responds to detractor open text. Even a "thanks, we hear you, here is what we are doing" email turns some scores around the next quarter.
  • Theme review — monthly meeting with product and support to look at the top three detractor themes and decide what is being fixed, deprioritized, or accepted.
  • Promoter activation — promoters are your highest-yield source of referrals, reviews, and case studies. Route them into a referral or advocacy program.
  • Score reporting — report the score with the themes attached. A score with no narrative is a number waiting to be ignored.

The trap to avoid: turning NPS into a compensation metric without protecting the data collection. The moment a team's bonus depends on the score, the survey starts getting nudged — sent right after positive interactions, asked of curated audiences, or framed in ways that lift the number without lifting the underlying loyalty. Tie compensation to verbatim themes and customer behavior instead. Post-purchase survey best practices covers a related transactional pattern.

NPS in one paragraph: ask the standard 0–10 question, calculate promoters minus detractors, always pair with a "why?", route detractor responses to a human within forty-eight hours, theme the verbatims monthly, and treat the score as a trend rather than a target.

Frequently asked

What is a good NPS score?
The honest answer is "compared to what?" Software and SaaS averages tend to sit higher than retail or telecom averages, and the same product in different geographies returns different baselines. Look for the published industry benchmark from a reputable research firm and aim to beat your sector. As a rough rule, anything positive is okay, above 30 is strong, and above 50 is excellent in most categories.
How is NPS different from CSAT?
CSAT measures satisfaction with a specific interaction or product on a short scale, usually right after the event. NPS measures loyalty and likelihood to recommend, capturing the relationship rather than the moment. They answer different questions, and most mature feedback programs run both alongside Customer Effort Score.
Should I include "passives" in my analysis?
They drop out of the score calculation by design, but they are not dropped from your analysis. Passives are the most movable group — they liked you enough not to complain and not enough to recommend. Their open-text answers usually point at the smallest changes that would move the score the most.
How often should I send NPS?
Twice a year for relationship NPS is plenty for most businesses. Quarterly is acceptable if your churn is fast or your product changes frequently. Transactional NPS triggers off events and is capped at one survey per customer per month so you do not exhaust the list.
Can I show NPS publicly?
You can, but only if you are willing to keep showing it after a bad quarter. Companies that publish NPS on their site usually do it because the score is comfortably above their sector benchmark. If yours is, it is a credible signal; if it is not, it is a hostage to the next bad release.