There’s an old anecdote that, at the height of the Vietnam War in 1969, an order was given that all the quantitative data the US military possessed should be fed into a supercomputer in the basement of the Pentagon. The computer was given everything – number of men, tanks, airplanes, fuel consumption, body count, average ammo expenditure – and was expected to calculate when the US would win the war.
The computer responded that, given all available data, the USA won the war in 1965.
The culture of numbers is a seductive one. Numbers look meaningful, sound meaningful, appear to be meaningful, especially when you recast them into charts and graphs. Modern companies run on such data – the largest, such as Google, make their fortunes on it. One of the great early proponents of this culture of numbers was Robert McNamara (1916–2009). McNamara was a Harvard MBA who spent WWII studying the efficiency of US bombers from a data perspective. Through his analysis of bombing logistics and his adjustments to the supply chain, the Army Air Forces began saving billions of dollars. After the war, McNamara took his skills to a spiraling Ford Motor Company and turned it around, before taking the job of Secretary of Defense.
McNamara was a pioneer in the field of systems analysis, and to preach the culture of numbers was to preach the culture of quantitative data. He brought it first to the Army Air Forces, then to Ford, then to the Department of Defense, and then to the US government at large. McNamara instituted unprecedented levels of data gathering: during the Vietnam War, the US command in Vietnam (MACV) generated tens of millions of pages of reports, charts, and graphs – quantitative information on everything one could conceive of. McNamara’s goal was a total understanding of the war through the numbers it generated – if every number could be understood, the war could be measured, understood, managed, and predicted.
What McNamara’s data did not show – what it could not show – were the opinions of the Vietnamese themselves. Their convictions could not be reduced to simple numbers – not that anyone at MACV considered asking them. Under the McNamara doctrine, opinions counted for nothing against hard data. MACV had the numbers, and the metrics, and the supreme understanding that quantitative information could bring – and so they thought victory was assured.
Although McNamara left his posting in 1968, the data-driven leaders he had trained went on to guide the USA to the worst military defeat in its history.
The SaaS world today is prone to the same myopia that once afflicted McNamara and MACV: a strong belief in the power of the quantitative, in a world that can be reduced to numbers. At Eigenworks, we sometimes meet resistance to our qualitative approach. “This seems really great,” potential clients say, “but we like data. Could you do something with more data?” (By which they mean quantitative data.) We could, of course, provide numbers – more numbers – but at some point, it stops being a meaningful service.
We’ve discussed before how unreliable churn metrics can be: quantitative information can only take you so far. It carries no inherent truth or value – it can be misinterpreted, misused, mistaken for something it’s not. Green Churn, for example, is all about how quantitative data can fail you; it’s not just about measuring the wrong metrics, but about the metrics that are not easily measured. We use the anecdote about how Box measures customer success: not by easily-counted metrics like the number of users or the number of downloaded files, but by whether Box has helped clients’ IT departments reduce on-site file servers and improve the clarity of their file management.
You can’t just plug this sort of information into a spreadsheet – it takes repeated conversations with a buyer. It takes the effort of staying engaged and inquisitive about your buyer and their needs. Working with quantitative data is easier than working with qualitative data. When a manager breathes down sales’ necks demanding proof of progress, it’s a lot simpler to present a bunch of numbers than a bunch of words: ‘Here’s a graph that shows sales going up; here’s a chart of how we’ve met our targets.’ Numbers seem more solid and objective than subjective input, but that doesn’t necessarily make them more accurate. The numbers that show sales increasing do not show problems in onboarding, or a product that isn’t aligned with customers’ main goals. You might have many new customers this quarter – but when even more churn out next quarter, all those numbers will have failed to keep you reliably informed.
Qualitative data is hard: it relies on judgment, trust, and critical thought. The number 10 is always the number 10, whereas an observation like ‘the customer did not seem as enthusiastic as we hoped’ has no such certainty. It requires trust in your interviewer or sales rep, trust in your customer, trust in your analyst, and trust in what you already know about your business – including hunches that begin with ‘I feel’ and ‘it seemed.’ Qualitative data also makes excuses harder to come by: ‘the numbers made sense’ is more acceptable than ‘I thought I was right.’ But harder to interpret does not mean less important. We’ve learned that putting effort into learning what your customers have to say pays off just as much as the hard numbers.
McNamara’s myopia was to believe that reducing the world to numbers made it manageable. Many SaaS management teams think the same way: reduce abstractions to quantifiable numbers, and the abstractions can be managed. This is misguided. Such companies may be headed for their own Vietnam – the moment when it becomes abundantly clear that paying attention only to the numbers was a mistake of fatal proportions. Don’t be like McNamara and MACV – start listening to the qualitative signals in your world, to the input your customers share.
Otherwise you’re going to be defeated.