Paging Secretary McNamara – Please Retrieve Your Fallacy At The Lost And Found.

Recently, I was reviewing data on keyword searches conducted on Google regarding qualitative research.  Here’s what I learned – a lot of people don’t seem to understand what qualitative research is, how it differs from quantitative, and how it can add value.  This is consistent with my own recent experience.  Many long-standing clients who are experienced research professionals increasingly find themselves defending the necessity of qualitative to their colleagues in marketing, top management, finance and corporate purchasing.  They’re constantly forced to respond to remarks like, “What can this stuff tell me that the numbers can’t?”
Whenever I hear about this, I think about the McNamara Fallacy – named for Robert McNamara, Secretary of Defense for John F. Kennedy and Lyndon Johnson, and architect of Johnson’s escalation of the Vietnam War.  This phenomenon was described by pollster Daniel Yankelovich – a man who made his living quantifying things – in the early 1970s.  It describes a progression of thinking that starts reasonably and ends in near-total dysfunction.
Step 1: Measure what can be easily measured.  Nothing wrong here – we should be quantifying what we can.
Step 2: Conduct an analysis that is either based entirely on what can be measured, or that assigns estimated or arbitrary values to those things that can’t be.  Nothing inherently wrong here either, as long as you remember the limitations of such an analysis.  But it’s also risky, as it can easily lead to…
Step 3: Decide that what you can’t easily measure is unimportant.  Yankelovich says, “This is blindness,” and it will take you right to…
Step 4: Conclude that what you can’t easily quantify doesn’t exist.  Yankelovich calls this “suicide.”
This fallacy is named for McNamara because of his approach in Vietnam, in which he decided that objective, quantitative measures – most notably body counts – would be a proxy for success.  McNamara’s focus on these metrics told him the US was winning the war.  But by 1968, it was clear that the numbers were telling a misleading story.
The lessons here:
  • Numbers provide essential information. However, by themselves, they tell us only about what can be quantified – the fact that something can’t be quantified doesn’t mean it’s non-existent or unimportant.
  • The numbers themselves must be questioned. Had Bob McNamara taken a closer look at the body count figures he was receiving – using tools that dig beneath the numbers to help them tell a more accurate and enriching story – he might have interpreted them very differently.  It’s important to remember that every data point represents something: a person, an event, a memory, a perception, and so on.  If we are to truly understand the numbers in aggregate, and make good decisions as a result of that understanding, it is imperative that we spend time looking at the things the numbers represent.  This is where qualitative tools – such as conversation, observation and creative exercises – can add so much value.
  • It’s valuable to remind ourselves periodically why we use metrics in the first place. We do this to simplify – to make something understandable that might otherwise be too complex to grasp.  However, we must be very careful in our selection of metrics, as every measure contains assumptions about causality.  McNamara and his staff assumed – obviously incorrectly – that there was a causal link between success and killing more soldiers than the enemy.
Today, it seems we may have forgotten the cautionary tale of Robert McNamara.  With so much data available to us – and often of such high quality – we can forget that the power of data lies in the stories it tells and the humanity it describes.  And it’s qualitative tools that help us find those stories and that humanity.  Often, we are presented with qualitative and quantitative as an either/or decision.  But this is a false choice.  The two must work together to uncover the truth.  And so we – as those responsible for interpreting data to inform decisions – must always remember to dig deeply into that data and the assumptions that underlie it, by creating research approaches that meld numbers and stories.