Paging Secretary McNamara— Please Retrieve Your Fallacy At The Lost And Found.

Recently, I was reviewing data on keyword searches conducted on Google regarding qualitative research. Here's what I learned: a lot of people don't seem to understand what qualitative research is, how it differs from quantitative, and how it can add value. This is consistent with my own recent experience. Many long-standing clients who are experienced research professionals increasingly find themselves defending the necessity of qualitative to their colleagues in marketing, top management, finance and corporate purchasing. They're constantly forced to respond to remarks like, "What can this stuff tell me that the numbers can't?"
Whenever I hear about this, I think about the McNamara Fallacy – named for Robert McNamara, Secretary of Defense under John F. Kennedy and Lyndon Johnson, and architect of Johnson's escalation of the Vietnam War. The phenomenon was described in the early 1970s by pollster Daniel Yankelovich – a man who made his living quantifying things. It describes a progression of thinking that starts reasonably and ends in near-total dysfunction.
Step 1: Measure what can be easily measured. Nothing wrong here – we should be quantifying what we can.
Step 2: Conduct an analysis that is either based entirely on what can be measured, or that assigns estimated or arbitrary values to the things that can't be. Nothing inherently wrong here either, as long as you remember the limitations of such an analysis. But it's also risky, as it can easily lead to…
Step 3: Decide that what you can't easily measure is unimportant. Yankelovich says, "This is blindness," and it will take you right to…
Step 4: Conclude that what you can't easily quantify doesn't exist. Yankelovich calls this "suicide."
This fallacy is named for McNamara because of his approach in Vietnam, in which he decided that objective, quantitative measures – most notably body counts – would be a proxy for success.  McNamara’s focus on these metrics told him the US was winning the war.  But by 1968, it was clear that the numbers were telling a misleading story.
The lessons here:
  • Numbers provide essential information. However, by themselves, they tell us only what can be quantified – and the fact that something can't be quantified doesn't mean it doesn't exist, or that it isn't important.
  • The numbers themselves must be questioned. Had Bob McNamara taken a closer look at the body count figures he was receiving – using tools that dig beneath the numbers and help them tell a more accurate and enriching story – he might have interpreted them very differently. It's important to remember that every data point represents something: a person, an event, a memory, a perception, and so on. If we are to truly understand the numbers in aggregate, and make good decisions as a result of that understanding, it is imperative that we spend time looking at the things those numbers represent. This is where qualitative tools – such as conversation, observation and creative exercises – can add so much value.
  • It's valuable to remind ourselves periodically why we use metrics in the first place. We do this to simplify—to make something understandable that might otherwise be too complex to grasp. However, we must be very careful in our selection of metrics, as every measure contains assumptions about causality. McNamara and his staff assumed – obviously incorrectly – that there was a causal link between success and killing more soldiers than the enemy.
Today, it seems we may have forgotten the cautionary tale of Robert McNamara. With so much data available to us – and it's often of such high quality – we can forget that the power of data lies in the stories it tells and the humanity it describes. And it's qualitative tools that help us find those stories and that humanity. Often, we are presented with qualitative and quantitative as an either/or decision. But this is a false choice. The two must work together to uncover the truth. And so we – as those responsible for interpreting data to inform decisions – must always remember to dig deeply into that data and the assumptions that underlie it, by creating research approaches that meld numbers and stories.

The Electrodes Are Coming!

I recently attended NeuroU 2019, and it was a fascinating two days during which I immersed myself in the world of neuromarketing. One key thing became abundantly clear: biometric data is about to become a thing, and marketers had better get ready for it. Here are two key takeaways I think you'll find interesting.

Biometrics Can Provide a Valuable Complement to the Data We Already Collect

When we take the types of data typically provided by marketing research—survey responses, syndicated data and qualitative learnings—and combine them with sources that measure physiological response to research stimuli, we can add valuable insight to our findings. These physiological metrics can document respondent attention and engagement.

For example, we might show qualitative research participants a visual stimulus such as a print ad, a webpage, a package mockup or a retail shelf set. In addition to discussing the stimulus, we could augment the findings with eye tracking, which would tell us what people actually looked at, when, and for how long. "Hold on," you say, "eye tracking has been around for decades; what's new and different about that?" Now we can also add in measures like heart rate, pupil dilation and galvanic skin response, so we can determine which elements correlate with a physiological response. This tells us which elements were actually engaging, and which ones merely elicited attention but no real interest.
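To make that pairing concrete, here is a minimal sketch in Python of one way the analysis might look, assuming we already have a list of eye-tracking fixations tagged by stimulus element and a list of skin-conductance (GSR) peaks. The element names, time window and hit threshold are purely hypothetical, not any vendor's actual output or metric.

```python
# Illustrative sketch only: hypothetical data structures, element names and
# thresholds -- not any particular eye-tracking or GSR vendor's toolkit.
from dataclasses import dataclass

@dataclass
class Fixation:
    element: str        # area of interest, e.g. "headline" or "logo"
    start_s: float      # fixation onset, in seconds from stimulus onset
    duration_s: float   # how long the gaze stayed on that element

@dataclass
class GsrPeak:
    time_s: float       # moment of a skin-conductance (GSR) response peak

def engaging_elements(fixations, gsr_peaks, window_s=3.0, min_hits=2):
    """Flag elements that drew attention AND arousal: a fixation followed by
    a GSR peak within `window_s` seconds, at least `min_hits` times."""
    hits = {}
    for fix in fixations:
        for peak in gsr_peaks:
            if fix.start_s <= peak.time_s <= fix.start_s + fix.duration_s + window_s:
                hits[fix.element] = hits.get(fix.element, 0) + 1
                break  # count each fixation at most once
    return sorted(e for e, n in hits.items() if n >= min_hits)

# Toy usage: the headline drew looks *and* arousal; the legal copy drew looks only.
fixations = [
    Fixation("headline", 0.5, 1.2), Fixation("headline", 6.0, 0.9),
    Fixation("legal_copy", 3.0, 0.4),
]
gsr_peaks = [GsrPeak(2.0), GsrPeak(7.5)]
print(engaging_elements(fixations, gsr_peaks))  # -> ['headline']
```

The design choice here is simply to define "engagement" as attention (a fixation) followed closely by arousal (a GSR peak), which mirrors the distinction drawn above between elements that merely attract the eye and those that elicit a real response.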

Perhaps the stimuli are more dynamic – a TV commercial, say, or shoppers exploring a retail environment. We can now measure EEG response continuously during the exposure period, or gather facial coding data. Both can provide significant insight into the nature of an individual's emotional responses to a stimulus. When we combine this information with traditional quantitative measures (such as recall and persuasion) and insights gathered during qualitative discussion, we can substantially increase our understanding of how consumers are responding to messages and environments.

The Hardware and Software Are Pretty Much Ready for Mainstream Usage

While biometric data was always interesting in theory, significant logistical challenges made it impractical for typical applications. The software was not user-friendly, the hardware was clunky and temperamental, and costs were usually prohibitive. Over the past few years, suppliers have devoted significant resources to addressing these challenges, and now offer turnkey hardware and software suites that can provide reliable data at an extremely reasonable cost.

The upshot—it might be time to start dipping your toe into this end of the pool.  Used appropriately, biometric data has the potential to be a major problem-solver for researchers.

On the Value of Humility.

My grandfather once asked me, "Who do you want to be—the guy who has all the answers, or the guy who has all the questions?" When I said I wanted to be the guy with the answers, he laughed and said, "In that case, you're never going to learn a damned thing." At the time, I doubt I understood his point. And this wasn't the last time somebody would try to make this point to me.
Before getting into qualitative, I spent several years in CPG brand management, working in several marketing groups over that period. Culturally, they all had one thing in common: admitting you didn't know something about your brand or consumer was very risky. This was seen as evidence that you weren't fully immersed in your business. Most of the consumer research in which I was involved was highly confirmatory—we were simply looking to verify what we thought we knew to be true. At one point, a moderator with whom we were working was admonished not to ask questions that were exploratory in nature, and to just stick to the discussion guide he'd been given. He asked, "If you guys know so much about your consumer and what she wants, why are you doing research at all?" The sarcasm in his tone was unmistakable.
Shortly after getting into qualitative, I was conversing with another moderator, the late Jan Beehner-Chandler. She made one remark I've never forgotten … "You can't tell some clients and researchers anything. They think they already know everything, so they won't listen, and so most of the things they could learn from research go right by them."
Eventually I got the message. Thinking you have all the answers is antithetical to insightful research, because the most important element in uncovering new information is humility. Without humility – the overt acknowledgement of one's own shortcomings and ignorance – there can be no curiosity, no ability to question and research and learn. So, whenever I embark upon a new study, I always begin from a place of humility. I list out:
  • What I believe to be true beyond reasonable doubt
  • What might be true but I don’t know for sure
  • What I know that I don’t know
  • Some thoughts on what I don’t know that I don’t know.
I use this to inform my initial conversations with my client, and to develop research objectives, discussion guides and research stimuli.  And all of this comes from consciously acknowledging that there’s stuff I don’t know.

So You Ask: What’s a Hybrid?

Here's a question I get asked a lot … "Do I prefer doing online or face-to-face qualitative research?" The answer I always give … "Yes, I like them both." And here's a closely related question that I also often get … "What's the right approach for a specific research study, online or in-person?" And I'll frequently give the same answer … "Yes, let's do both." In fact, often the best approach is to use both. In the past, our qualitative research toolbox was pretty limited. We had focus groups of various sizes, in-depth interviews (IDIs), and maybe some telephone interviews. Now we have a dizzying array of tools available to us, and often the best way to get the most bang out of a research buck is to combine them.
The guiding principle here is that online (OL) and face-to-face (F2F) research tools have very different strengths and weaknesses.  So, by putting them together, we can create an approach that yields far more insight than any of its individual components can alone.
  • F2F research approaches offer a high level of engagement. I don't care what anybody says: no online approach can offer the same level of deep, personal connection that in-person research can. Having everybody in the same physical location also allows a high degree of flexibility. Because of all this, F2F is where new and unexpected insights are most likely to come to light.
  • OL is highly time- and travel-efficient. It can also be a big problem solver for low-incidence recruits, as it allows you to recruit from a national sample. What's more, it's an efficient way to get a lot of the preliminaries out of the way (introductions, basic attitudes and practices, etc.), and it's also a good way to test a large number of ingoing ideas or hypotheses and quickly discard the weaker ones.
You can approach 'hybrid' in two ways. Online can serve as a precursor to F2F research; by creating relationships with research participants online, you can set yourself up for maximum-quality face-to-face interactions. Because you've already established rapport with participants, you're now positioned to have an especially candid, productive discussion. The reverse can also be true: you can start a study with an in-person phase during which you develop some initial hypotheses, and then test those hypotheses in a variety of ways very efficiently online.
So, with all this said, the question we should always be asking ourselves is … "How do we best combine OL and F2F tools to provide the richest, most insightful research?"

On Making It Look Easy and Having Your 'Also'

Here's a fundamental dilemma qualitative researchers face: we want clients to perceive what we do as being of great value, and, at the same time, we need to make it look easy. This is particularly important when conducting face-to-face research. To establish rapport, it's essential to create a nice, relaxed, schmoozy vibe. We certainly don't want to look as if we're nervous, or working hard. Nothing kills the mood more quickly than that.
Unfortunately, while experienced, well-informed observers may understand that moderating is much harder than it looks, many do not.  And that’s not something that’s within our power to change.  So we’re confronted with a perceived commoditization of moderating skills.  This is unfortunate, because moderating skill is definitely not a commodity (it takes many years of training and experience to become proficient), but there you have it.
How do researchers rise above this? Having fabulous moderating skills simply isn't enough. The best ones have an also – an additional area of expertise. And where do those also's come from? They come from formal education or a pre-qualitative career. They come from hobbies and avocations, and from personal experiences and challenges. We can say to current and potential clients something like, "I'm a great moderator, and I'm also a licensed clinical psychologist." Or, "I'm a fantastic interviewer, and I've also written a book on co-creation techniques." Or, "I also worked for many years as an ad copywriter." Personally, I have a number of also's, including that I worked for many years in brand management, and that I'm a recognized expert on the application of analytical models to research design and interpretation. In short, great qualitative researchers are deeply prepared for their work, and know things most people don't.
A great Qually is a cultural and strategic interpreter who can tease out new information and tell clients exactly what it means.  And that comes from more than just moderating skill – it comes from those also’s.  So, if you’re a qualitative researcher, embrace your passions—they’ll make you better at your job.  And if you’re a user of qualitative, make a point of asking about those also’s —they’re an important part of the package.