Sure You Can. But Should You?

I’ve had a lot of discussions recently with colleagues about ethics, some specifically regarding market research, some more broadly focused. We’re living in unusual times, and ethics have become a particularly fraught issue. Public health imperatives are going to require aggressive testing and contact tracing protocols among the population for the foreseeable future, and this will inevitably raise privacy concerns, particularly when you consider the technology-based solutions being proposed. We’ve also come face-to-face with such issues as the allocation of resources and the economic value of a human life. Market research is facing ethical challenges as well. Many clients are justifiably interested in understanding how the current public health crisis is influencing consumer attitudes and perceptions, as well as openness to new product and service ideas. However, this can raise difficult questions about how we identify, select, question and compensate the individuals participating in our research.
Most of the research I’ve done in the past three months has been healthcare-related, and I’ve encountered situations in which some difficult ethical issues arose. Some of these involved how we recruited research participants, some related to the use of those participants’ personal information, and some were about the interpretation of research findings. In none of these situations was there a single, clear, correct answer. As often happens, we were working in those all-too-common grey areas.
I’m not the only one who’s preoccupied with ethics these days. The European Union and the State of California, to name just a couple, are also interested. Both have passed legislation intended to bring more ethical practices to the treatment of personal information, and more such regulation is sure to come.
As the philosopher Will Durant once pointed out, “we are what we repeatedly do.” His point was that excellence is habitual, but he could have just as easily been talking about ethics—the key to being ethical lies in habits. But what, on a practical level, does that mean? Many years ago, I learned from a very wise man, Rabbi Charles Kroloff, that ethics can be less bedeviling if you get in the routine of asking yourself a series of questions when contemplating a dubious course of action. Here they are.
How does this thing you’re thinking of doing square with accepted codes of conduct?
Is it legal? Is it allowed by your company’s policies? Is it consistent with your industry’s ethical standards? The market research world has no shortage of codes of conduct, and they’re worth reviewing periodically (I’ve included links to several at the end of this post). Personally, I’m a big fan of the golden rule (“do unto others as you would have them do unto you”). Do you risk violating that? You might decide to proceed despite your intended action being against some rule or other, but it’s a good idea to devote a few minutes’ thought to whether this is wise.
How would you feel if your actions became widely known?
Would you be comfortable with that? Would you want your clients, or colleagues, or friends to know? How about your spouse, or kids, or siblings? What if your mother found out? If the idea of people whose opinions you value knowing what you’ve done makes you uneasy, that’s a pretty big red flag—proceed with caution. I’ve never accepted a finder’s fee for referring another supplier to one of my clients, because I wouldn’t be comfortable with my client knowing about it.
What if everybody did this? 
Would the world be a better place, or would it be diminished? Dropping your used gloves on the asphalt isn’t a big deal if you’re the only one who does it, but if everyone coming out of the store follows suit, things will get messy in a hurry. Misleading research participants about the nature of the study in which they’re participating might not do too much damage if you only do it occasionally, but if we all start doing it all the time, the whole research world could blow up. If it would cause problems for everybody in your industry to emulate your contemplated action, it’s probably not OK for you to do it either.
What are the potential consequences of your actions? 
This is probably the most important question of all: what could happen if you do this thing? Do you risk prison? A fine? A lawsuit? Losing your job? Being run out of town on a rail? Dirty looks from people you don’t care about? Another great philosopher, Maimonides, once wrote that “a wise man is one who knows the consequences of his actions.” I think we often find ourselves in sticky situations because we didn’t take some time to think carefully about the possible endpoints where our actions might lead. Avoiding the conscious contemplation of outcomes is a very human thing to do—it’s uncomfortable, and often requires you to rethink your plans. That’s why this is a particularly valuable habit to develop. I’ve always suspected that the Enron disaster might have been averted if somebody had just said, ‘hey everybody, there’s no way this ends well.’
So, there you have it: four simple questions for worrisome situations.  If you make them a habit, you’ll make ethical behavior habitual as well.
Links to various research industry codes of conduct:
https://www.qrca.org/page/ethics_practices
https://www.esomar.org/what-we-do/code-guidelines
https://www.insightsassociation.org/issues-policies/insights-association-code-standards-and-ethics-market-research-and-data-analytics-0
https://www.intellus.org/Standards-Guidelines/Code-of-Conduct
https://www.ama.org/codes-of-conduct/

Every Guy Has a Plan, Until He Gets Punched in the Mouth.

One of the 20th century’s great philosophers, Mike Tyson, said that. And while I definitely wouldn’t argue with Mike, I’m a big advocate of planning data analysis. If you know me, you’ve probably figured out that I don’t think analysis gets enough attention when it comes to qualitative research. We spend a lot of time planning how we will recruit and conduct research, but then take a seat-of-the-pants attitude towards analyzing our data. And this is a problem, because not planning your analysis can profoundly compromise the value of your research.
Qualitative analysis can be divided into five stages:
  • Planning
  • Debriefing
  • Consolidation
  • Unstructured musing
  • Structured analysis
I’ll provide my thinking on some of these analysis stages in future newsletters. I’ve already written a piece on consolidation, but in this one I’m going to drill down on Planning.
An analysis plan should be a part of any comprehensive research design, qualitative or quantitative. In fact, the more effort you put into planning your analysis while you’re designing your research, the better your analysis will be. Planning has become even more important in recent years, as timelines get shorter, and as researchers are increasingly expected to analyze data from a variety of sources. With all that in mind, here are eight key best practices to follow when planning your analysis:
Consider research objectives and resultant decisions
Every study is (or, at least, should be) driven by business issues. Typically, clients have one or more decisions they will make on the basis of the research findings. Your design and analysis plan should directly address each of these questions and decisions. If it doesn’t show clearly how the data you gather will answer those questions and guide those decisions, something’s missing.
Set action standards
Action standards tell you what you need to see in your data in order to trigger a specific decision. They often take the form of thresholds for metrics like brand awareness, purchase intent, preference, and so forth. It’s crucial that action standards be established BEFORE the research—it’s not OK to eyeball the data after the fact and settle upon action standards then.
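To make that concrete, here is a minimal sketch, in Python, of what it means to write action standards down before fieldwork rather than eyeballing the data afterward. The metric names, thresholds, and decisions are invented for illustration; real standards would come from the client’s business issues.

```python
# A hypothetical sketch of pre-registered action standards; the metric names,
# thresholds, and resulting decisions below are invented for illustration.

ACTION_STANDARDS = {
    "purchase_intent_top2box": (0.40, "advance the concept to quantitative testing"),
    "aided_brand_awareness": (0.60, "maintain the current media plan"),
}

def evaluate(observed: dict) -> None:
    """Compare observed metrics against thresholds that were set before fieldwork."""
    for metric, (threshold, decision) in ACTION_STANDARDS.items():
        value = observed.get(metric)
        if value is None:
            print(f"{metric}: no data collected")
            continue
        if value >= threshold:
            print(f"{metric}: {value:.0%} meets the {threshold:.0%} standard -> {decision}")
        else:
            print(f"{metric}: {value:.0%} falls short of {threshold:.0%} -> revisit before deciding")

evaluate({"purchase_intent_top2box": 0.46, "aided_brand_awareness": 0.52})
```

The point isn’t the code; it’s that the thresholds, and the decisions they trigger, exist in writing before anyone sees the data.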
Provide structure to your data
Qualitative research is different from quantitative in that the data is, by its nature, less structured. Nevertheless, there are ways to provide some structure, and the more organization you can give to your data, the easier your analysis is going to be. Some ways to provide structure for in-person and webcam research include:
  • Written reaction exercises
  • Respondent worksheets and polling questions to capture reactions to stimuli
  • Respondent markups of research stimuli
  • Waiting room questionnaires
  • Note-taking worksheets or easel pads for observers and moderators
For online bulletin boards, structure is even more important, as they yield so much data (I conducted a board late last year that generated a 1,200-page transcript and nearly 700 images and videos). Some ways to organize that data include the following (a simple tagging sketch appears after the list):
  • Tagging participants prior to the research on the basis of demographics, attitudes and behaviors (such as ‘three or more kids in home,’ ‘concerned with environmental sustainability,’ or ‘online category shopper.’)
  • Tagging responses as they come in, such as ‘concept positive,’ ‘brand negative,’ ‘concerned about cost,’ etc.
  • Applying A.I.-enabled textual analytics, which can be invaluable for surfacing trends from the data that you otherwise might never pick up
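Here is a minimal, hypothetical sketch of what tagged bulletin-board data might look like once it’s structured this way. The participant attributes, tag names, and the Response structure are all invented for illustration, not a prescribed scheme.

```python
# A hypothetical sketch of tagging bulletin-board responses so they can be
# filtered during analysis; attributes and tag names are invented examples.
from dataclasses import dataclass, field

@dataclass
class Response:
    participant_id: str
    text: str
    participant_tags: set = field(default_factory=set)  # assigned at recruitment
    response_tags: set = field(default_factory=set)      # applied as posts come in

responses = [
    Response("P01", "I'd try it, but the price worries me.",
             {"three_or_more_kids_in_home"}, {"concept_positive", "concerned_about_cost"}),
    Response("P02", "The brand doesn't feel trustworthy to me.",
             {"online_category_shopper"}, {"brand_negative"}),
]

def with_tags(items, *tags):
    """Return responses carrying all of the requested tags, at either level."""
    return [r for r in items if set(tags) <= (r.participant_tags | r.response_tags)]

for r in with_tags(responses, "concept_positive"):
    print(r.participant_id, "-", r.text)
```

Even a scheme this simple means that, come analysis time, you can pull every concept-positive comment from parents of three or more in seconds instead of rereading 1,200 pages.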
Furthermore, how you divide up the sample for your research isn’t just important for group dynamics. Splitting up your groups or boards by demographics or behaviors can make your data better organized, as you’ll have more coherent and focused conversations.
Identify possible tools
It’s important to apply analytical models and frameworks to your data. Thinking in advance about what tools might be relevant could influence how you design and conduct your research. Exactly what tools you’ll end up using might change once you’ve gathered your data, but considering tools up front will give you the opportunity to plan conversations and exercises that will lead to fruitful analysis.
Focus your stimuli
The mark of a good research stimulus, be it a product concept, an advertising storyboard, a packaging prototype, or whatever, is that it will generate readable responses. This means focus. Concepts should focus on a single benefit, storyboards should focus on one clear selling proposition, and so forth. Focused stimuli will yield focused, easily analyzable responses.
Create a deliverable outline
I’m a big believer in drafting a final report outline while designing the study. This is a good way to set expectations as to what the deliverable will cover, and also provides a roadmap for your analysis plan. This outline represents the blanks you’re going to have to fill in, and knowing that in advance is a good way to make sure your research is focused on actionable findings, insights and recommendations.
Think in advance about the role of multiple data sources
The research studies I’m involved with increasingly generate data from a variety of disparate sources.  These can include:
  • Conversation
  • Creative and projective techniques
  • Narratology
  • Textual analysis
  • Biometrics
  • Quantitative survey data
  • Syndicated data
  • Big data analytics
If you’re going to successfully weave a story together from these sources, you must plan for that. This means you’ll need to think about all of the research objectives and resultant decisions, and plan out which of them will be addressed by which data source.
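One way to force that thinking is to write the mapping down as part of the design. Here’s a minimal, hypothetical sketch; the objectives and source names are invented, and a real plan would use the client’s actual business questions.

```python
# A hypothetical sketch of mapping research objectives to the data sources
# expected to address them; objectives and sources are invented examples.
objective_to_sources = {
    "How does the concept fit current category attitudes?": ["conversation", "projective techniques"],
    "Which claim is most motivating?": ["respondent worksheets", "quantitative survey data"],
    "Does stated interest match actual behavior?": ["syndicated data", "big data analytics"],
}

# A quick check that no objective has been left without a planned source.
for objective, sources in objective_to_sources.items():
    assert sources, f"No data source planned for: {objective}"
    print(f"{objective}\n  -> {', '.join(sources)}")
```

If an objective ends up with an empty list, you’ve found the hole in your design before fieldwork, not after.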
Prepare to be flexible
Dwight Eisenhower once wrote, “I have always found that plans are useless, but planning is indispensable.” Ike’s point was that, while plans nearly always change in the face of new information, the thinking that went into them is still useful. All that creation of structure and identification of tools, etc. still counts, even if things don’t pan out exactly as expected.
As they say in the military, if you fail to plan, you plan to fail. Carefully planning your analysis before you conduct your research will increase the chances of uncovering that brand-redefining insight, and reduce the possibility of a failed study.