Paging Secretary McNamara – Please Retrieve Your Fallacy At The Lost And Found.

Recently, I was reviewing data on keyword searches conducted on Google regarding qualitative research.  Here’s what I learned – a lot of people don’t seem to understand what qualitative research is, how it differs from quantitative research, and how it can add value.  This is consistent with my own recent experience.  Many long-standing clients who are experienced research professionals increasingly find themselves defending the necessity of qualitative to their colleagues in marketing, top management, finance and corporate purchasing.  They’re constantly forced to respond to remarks like, “What can this stuff tell me that the numbers can’t?”
Whenever I hear about this, I think about the McNamara Fallacy – named for Robert McNamara, Secretary of Defense for John F. Kennedy and Lyndon Johnson, and architect of Johnson’s escalation of the Vietnam War.  This phenomenon was described by pollster Daniel Yankelovich – a man who made his living quantifying things – in the early 1970s.  It describes a progression of thinking that starts reasonably and ends up in near-total dysfunction.
Step 1: Measure what can be easily measured.  Nothing wrong here – we should be quantifying what we can.
Step 2: Conduct an analysis that is either based entirely on what can be measured, or that assigns estimated or arbitrary values to those things that can’t be.  Nothing inherently wrong here either, as long as you remember the limitations of such an analysis.  But it’s also risky, as it can easily lead to…
Step 3: Decide that what you can’t easily measure is unimportant.  Yankelovich says, “This is blindness,” and it will take you right to…
Step 4: Conclude that what you can’t easily quantify doesn’t exist.  Yankelovich calls this “suicide”.
This fallacy is named for McNamara because of his approach in Vietnam, in which he decided that objective, quantitative measures – most notably body counts – would serve as proxies for success.  McNamara’s focus on these metrics told him the US was winning the war.  But by 1968, it was clear that the numbers were telling a misleading story.
The lessons here:
  • Numbers provide essential information. However, by themselves, they only tell us what can be quantified – and the fact that something can’t be quantified doesn’t mean it doesn’t exist, or that it isn’t important.
  • The numbers themselves must be questioned. Had Robert McNamara taken a closer look at the body-count figures he was receiving – using tools that dig beneath the numbers to help them tell a more accurate and enriching story – he might have interpreted them very differently.  It’s important to remember that every data point represents something: a person, an event, a memory, a perception, and so on.  If we are to truly understand the numbers in aggregate, and make good decisions as a result of that understanding, it is imperative that we spend time looking at the things those numbers represent.  This is where qualitative tools – such as conversation, observation and creative exercises – can add so much value.
  • It’s valuable to remind ourselves periodically why we use metrics in the first place. We do this to simplify – to make something understandable that might otherwise be too complex to grasp.  However, we must be very careful in our selection of metrics, as every measure contains assumptions about causality.  McNamara and his staff assumed – obviously incorrectly – that there was a causal link between success and killing more soldiers than the enemy.
Today, it seems we may have forgotten the cautionary tale of Robert McNamara.  With so much data available to us – and often of such high quality – we can forget that the power of data lies in the stories it tells and the humanity it describes.  And it’s qualitative tools that help us find those stories and that humanity.  Qualitative and quantitative are often presented to us as an either/or decision.  But this is a false choice.  The two must work together to uncover the truth.  And so we – those responsible for interpreting data to inform decisions – must always dig deeply into that data and the assumptions that underlie it, by creating research approaches that meld numbers and stories.

The Electrodes Are Coming!

I recently attended NeuroU 2019, and it was a fascinating two days during which I immersed myself in the world of neuromarketing.  One key thing became abundantly clear: biometric data is about to become a thing, and marketers had better get ready for it.  Here are two key takeaways I think you’ll find interesting.

Biometrics Can Be a Valuable Addition to the Data We Already Collect

When we take the types of data typically provided by marketing research – survey responses, syndicated data and qualitative learnings – and combine them with sources that measure physiological responses to research stimuli, we can add valuable insight to our findings.  These physiological metrics can document respondent attention and engagement.

For example, suppose we show qualitative research participants a visual stimulus such as a print ad, a webpage, a package mockup or a retail shelf set.  In addition to discussing the stimulus, we could augment the findings with eye tracking, which would tell us what people actually looked at, when, and for how long.  “Hold on,” you say, “eye tracking has been around for decades; what’s new and different about that?”  Now, we can also add in measures like heart rate, pupil dilation and galvanic skin response, so we can determine which elements correlate with a physiological response.  This tells us which elements were actually engaging, and which ones merely elicited attention but no real interest.
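To make the idea concrete, here is a minimal sketch in Python (using pandas) of the kind of alignment this implies: matching each eye-tracking fixation on an area of interest with the nearest galvanic skin response reading, then flagging which elements co-occur with an arousal response.  The data, column names and the simple baseline threshold are all hypothetical and purely illustrative – real biometric suites supply their own formats and analytics.

```python
import pandas as pd

# Hypothetical eye-tracking fixations: which area of interest (AOI) was
# viewed, when (seconds into exposure), and for how long.
fixations = pd.DataFrame({
    "t": [0.5, 1.2, 2.0, 3.4, 4.1],
    "aoi": ["headline", "hero_image", "logo", "price", "call_to_action"],
    "duration": [0.6, 0.7, 0.4, 0.5, 0.9],
})

# Hypothetical galvanic skin response samples (microsiemens).
gsr = pd.DataFrame({
    "t": [0.0, 1.0, 2.0, 3.0, 4.0, 5.0],
    "gsr": [2.1, 2.2, 2.9, 2.3, 3.1, 2.4],
})

# Flag GSR samples that rise well above this participant's baseline
# (an illustrative threshold, not a validated arousal criterion).
baseline = gsr["gsr"].median()
gsr["aroused"] = gsr["gsr"] > baseline * 1.2

# Align each fixation with the nearest-in-time GSR sample, so we can see
# which AOIs drew attention plus a physiological response, versus
# attention alone.
merged = pd.merge_asof(
    fixations.sort_values("t"),
    gsr.sort_values("t"),
    on="t",
    direction="nearest",
)
print(merged[["aoi", "duration", "gsr", "aroused"]])
```

The output separates elements that merely drew a glance from those that also produced a measurable response – exactly the distinction described above, just reduced to a toy example.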

Perhaps the stimuli are more dynamic – a TV commercial, say, or shoppers exploring a retail environment.  We can now measure EEG response continuously during the exposure period, or gather facial coding data.  Both can provide significant insight into the nature of an individual’s emotional responses to a stimulus.  When we combine this information with traditional quantitative measures (such as recall and persuasion) and insights gathered during qualitative discussion, we can substantially increase our understanding of how consumers respond to messages and environments.

The Hardware and Software Are Pretty Much Ready for Mainstream Usage

While biometric data was always interesting in theory, significant logistical challenges made it impractical for typical applications.  The software was not user-friendly, the hardware was clunky and temperamental, and the costs were usually prohibitive.  Over the past few years, suppliers have devoted significant resources to addressing these challenges, and they now offer turnkey hardware and software suites that can provide reliable data at an extremely reasonable cost.

The upshot – it might be time to start dipping your toe into this end of the pool.  Used appropriately, biometric data has the potential to be a major problem-solver for researchers.