I want to tell you about a number that runs most SaaS companies.

It shows up in board decks. It gets brought up in every QBR. CS leaders get bonused on it. Product teams obsess over it. Entire departments exist to move it two points to the right.

And the peer-reviewed research says it is barely better than guessing randomly.

I’m talking about NPS.


Where it came from

In 2003 Fred Reichheld published “The One Number You Need to Grow” in the Harvard Business Review. He claimed that the “how likely are you to recommend us” question was the single most powerful predictor of company growth. That one question could replace every customer satisfaction survey you’ve ever run.

It was a compelling pitch. Simple. Clean. One number.
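For anyone who has never actually computed it: the score is the share of promoters (9s and 10s) minus the share of detractors (0 through 6), with the passives in between ignored. A minimal sketch in Python:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    on the 0-10 'how likely are you to recommend us' scale."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

print(nps([10, 9, 8, 7, 6, 3]))  # 2 promoters, 2 passives, 2 detractors → 0.0
```

Note how much the single number throws away: two delighted customers and two unhappy ones cancel out to a flat zero.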

One problem: HBR is a practitioner magazine, not a peer-reviewed journal. The dataset was never made public. The methodology was never fully disclosed. Nobody could replicate the finding because nobody had access to the original work.

When actual researchers tested the claim, things fell apart.


What the research actually says

In 2007 Keiningham and colleagues published a study in the Journal of Marketing testing Reichheld’s central claim. They used data from the American Customer Satisfaction Index and the Norwegian Customer Satisfaction Barometer.

The result: NPS was not a superior predictor of revenue growth. Other satisfaction metrics performed equally well or better. In several industries, the relationship between NPS and growth was not statistically different from zero.

Let that sink in. Not statistically different from zero.

Morgan and Rego at Indiana University went further. They analyzed NPS against multiple business outcomes across industries. NPS ranked among the worst predictors of future business performance. Worse than customer satisfaction scores. Worse than Top-2-Box measures. Worse than repurchase likelihood.


Even Rob Markey at Bain, who helped create NPS alongside Reichheld, has acknowledged that the score alone, without an operational system around it, is essentially meaningless. That is an incredible admission from one of the people who built the thing.


The happy customer who is about to leave

Here is what really gets me.

CustomerGauge published data from their B2B benchmarks showing that 20 to 40 percent of customers who churned had given a Promoter score of 9 or 10 in their most recent NPS survey.

They told you they loved you. Then they left.

This happens for reasons that are obvious once you think about them. The person filling out the survey has a personal relationship with the account manager. They don’t want to create conflict. They fear negative feedback might affect their service. The champion who fills out the survey is not representative of the ten other people at the company who actually use the product every day and are quietly frustrated.

NPS captures one person’s stated intention on one specific day. It does not capture how the broader account actually feels over time.


The 70% you never hear from

Response rates for B2B NPS surveys run between 10 and 30 percent. That means the score you are reporting to your board reflects the opinion of a minority. And not a random minority. A biased one.

The people who respond to surveys are systematically different from the people who don’t. Non-respondents tend to be less engaged. Less invested. More likely to be quietly evaluating alternatives. The exact population most at risk of churning is the population least likely to fill out your survey.

You are building your retention strategy on feedback from the customers who were already going to stay.

10 to 30 percent respond. The other 70 to 90 percent stay silent: less engaged, more likely to churn, the exact people you need to hear from.
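The distortion is easy to demonstrate with a toy simulation. Every number below is invented for illustration, but the mechanism is the point: if happier customers answer surveys more often, the surveyed score lands far above the true one.

```python
def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Toy account base (all figures assumed, for illustration):
# (score a customer would give, how many such customers, their response rate).
# Happier customers answer surveys more often.
segments = [(10, 600, 0.30), (7, 200, 0.20), (3, 200, 0.05)]

everyone = [score for score, n, _ in segments for _ in range(n)]
respondents = [score for score, n, rate in segments
               for _ in range(round(n * rate))]

print(f"true NPS: {nps(everyone):.0f}")         # all 1,000 customers → 40
print(f"surveyed NPS: {nps(respondents):.0f}")  # the 23% who answered → 74
```

A 40 becomes a 74, and nobody lied on the survey. The bias lives entirely in who chose to answer.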

What actually predicts churn

The signals that matter are the ones customers give you without thinking about it.

Product usage patterns. Login frequency. Feature adoption depth. These have predictive accuracy between 0.75 and 0.85 in published research. NPS sits between 0.55 and 0.65. Barely above chance.

But there is a signal layer that most companies are completely ignoring. Communication patterns.

How long it takes a customer to reply to your emails. Whether their messages are getting shorter. Whether the tone is shifting from warm to formal. Whether they stopped asking questions about features and started asking about contract terms.

Those patterns show up weeks or months before any survey would catch them. They show up across the entire account, not just from the one person who fills out a survey. And they happen in real time, not once a quarter.
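As a sketch of what tracking one of these signals might look like in code, here is reply latency on an account's email thread with a crude slowing-down check. The thread data is hypothetical, and a real system would look at many threads and many signals at once.

```python
from datetime import datetime
from statistics import mean

# Hypothetical thread history for one account:
# (when we sent a message, when their reply arrived).
replies = [
    (datetime(2024, 1, 8), datetime(2024, 1, 8)),
    (datetime(2024, 2, 5), datetime(2024, 2, 6)),
    (datetime(2024, 3, 4), datetime(2024, 3, 7)),
    (datetime(2024, 4, 1), datetime(2024, 4, 8)),
]

# Days between our message and their reply, oldest first.
latencies = [(reply - sent).days for sent, reply in replies]

# Crude trend check: is the recent half slower than the earlier half?
half = len(latencies) // 2
slowing = mean(latencies[half:]) > mean(latencies[:half])
print(f"reply latencies (days): {latencies}, slowing: {slowing}")
```

Same-day replies in January, a week's silence by April. No survey required, and the signal arrives months before renewal.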

Companies tracking communication sentiment have reported 15 to 30 percent reductions in churn. Not because they found a better survey. Because they stopped relying on surveys altogether and started reading the signals customers were already sending every single day.


NPS is not useless. It is just not enough.

I am not saying throw it away entirely. As one directional signal among many, it has some value. The problem is that most companies treat it as the primary indicator of customer health. Some treat it as the only one.

That is like checking the weather forecast once a month and calling it your climate strategy.

The real picture of how your customers feel lives in the communication that flows between you every day. The emails. The tone shifts. The engagement patterns. The things people say when they are not performing for a survey.

That data already exists. It is sitting in your inbox right now. The question is whether you are going to keep relying on one number from 2003 that was never peer-reviewed or start reading the signals that are actually there.


Sources: Keiningham et al., Journal of Marketing (2007). Morgan and Rego, Marketing Science (2006). East, Hammond, and Lomax, International Journal of Research in Marketing (2008). CustomerGauge B2B NPS Benchmarks (2018). Stahlkopf, Harvard Business Review (2019).