Webcam On!
I’ve had some discussions recently with clients and researchers about in-person qualitative research versus qualitative conducted via webcams. One point I hear made is that online conversations don’t yield the same level of engagement, intimacy and spontaneity as face-to-face ones. While this is a fair point, it’s often overstated—the difference isn’t as great as some might think. And one factor that gets overlooked is that webcam research allows you to do things that are difficult or impossible to do in person. There’s a pretty long list of tasks that work well online, and no reason to list them all here. But here are three of my favorite things that are particularly easy with webcams.
Geography
Sometimes, when designing in-person qualitative, I’ll try to discourage clients from conducting the research in too many locations. Not only is this time-consuming and expensive, but, if your qualitative will be followed by quantitative, you can get your geographic dispersion easily and cost-effectively in your quant phase. However, sometimes clients truly need a large number of markets represented in the qualitative data—this is particularly common in shopper insights studies. In situations like this, the webcam can really be your friend. You can recruit from a national (or even global) sample, and can include participants from a large number of markets in your study time- and cost-efficiently. And remember—you don’t have to do all your qualitative online. I’ve frequently conducted studies in which we began with one or two in-person markets, and finished the study online with a broader geographic sample.
Aggregated Data
When collecting reactions to stimuli (such as package designs, advertisements or new product concepts), it’s often helpful to be able to create a visual representation of all of the participants’ reactions. Online research platforms make this particularly easy to do, as many contain markup tools that allow participants to record their responses directly on the stimulus, with the back-end software aggregating all responses into a single visual presentation – such as a heat map – in real time. Yes, you can do this sort of thing with in-person research as well, but it’s often a slow, painstaking process.
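For those curious about the mechanics behind that kind of aggregation, here is a minimal sketch of the general idea, not any particular platform’s back end: it assumes the markup tool exports each participant’s clicks as simple (x, y) coordinates on the stimulus image, and the coordinates shown are made up for illustration.

```python
# Minimal sketch: aggregating hypothetical participant markup coordinates
# into a heat map. Assumes each markup is an (x, y) point on the stimulus
# image, here treated as 800 x 600 pixels.
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical export: one (x, y) pair per participant click on the stimulus
markups = [
    (120, 340), (125, 338), (410, 90),   # participant 1
    (118, 345), (405, 95), (600, 480),   # participant 2
    (122, 342), (412, 88),               # participant 3
]

xs, ys = zip(*markups)

# Bin all clicks into a 2D histogram covering the stimulus dimensions
heat, xedges, yedges = np.histogram2d(xs, ys, bins=40, range=[[0, 800], [0, 600]])

# Render the aggregated responses as a heat map that could overlay the stimulus
plt.imshow(heat.T, origin="lower", extent=[0, 800, 0, 600], cmap="hot")
plt.colorbar(label="Number of participant markups")
plt.title("Aggregated participant markups (heat map)")
plt.show()
```

The point is simply that once every participant’s markups live in one coordinate space, rolling them up into a single heat map is straightforward; the platforms do this for you, continuously, as responses arrive.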
Hybrid Qual/Quant Research
Most online qualitative platforms have some type of polling question function built into them. This allows you to send a quick, closed-ended question to your participants and capture everybody’s quantified response in real time. As with aggregation, while you can do this in person, it’s a lot easier and faster online. And here’s an even more exciting qual/quant wrinkle: there are online tools that will allow you to construct an agile (one- or two-day), fully integrated approach that starts with a large, geographically dispersed quantitative sample and ends with on-the-spot qualitative groups or interviews selected right out of your quant sample. So, at the end of one or two days, you’ll have statistically reliable quantitative data leavened with qualitative insights executed at blinding speed: an integrated study providing higher-order insights that can replace time- and money-intensive multiphase research. This approach lends itself to all manner of strategic and tactical tasks: positioning, brand communication, concept development and testing, even segmentation.
These are just some of the unique capabilities webcams provide. I haven’t mentioned a number of others, such as the new level of feasibility and realism they bring to ethnographic research, rapid textual analysis, or the ability to move participants easily among group discussions, individual interviews and bulletin boards. But I hope this was enough to pique your curiosity. One final thought: qualitative researchers can often be extroverts – sometimes extreme extroverts – who tend to forget that some research participants can actually be a bit uncomfortable with face-to-face interactions. That little bit of space you allow those people by conducting an interview via webcam can be just enough to make them more relaxed and forthcoming. So, in this time of social distancing, embrace the webcam. It can bring unique qualities to your research.
Sure You Can. But Should You?
I’ve had a lot of discussions recently with colleagues about ethics, some specifically regarding market research, some more broadly focused. We’re living in unusual times, and ethics have become a particularly fraught issue. Public health imperatives are going to require aggressive testing and contact tracing protocols among the population for the foreseeable future, and this will inevitably raise privacy concerns, particularly when you consider the technology-based solutions being proposed. We’ve also come face-to-face with such issues as the allocation of resources and the economic value of a human life. Market research is facing ethical challenges as well. Many clients are justifiably interested in understanding how the current public health crisis is influencing consumer attitudes and perceptions, as well as openness to new product and service ideas. However, this can raise difficult questions about how we identify, select, question and compensate the individuals participating in our research.
Most of the research I’ve done in the past three months has been healthcare related, and I’ve encountered situations in which some difficult ethical issues arose. Some of these involved how we recruited research participants, some related to the use of those participants’ personal information, and some were about the interpretation of research findings. In none of these situations was there a single, clear, correct answer. As often happens, we were working in those all-too-common grey areas.
I’m not the only one who’s preoccupied with ethics these days. The European Union and the State of California, to name just a couple, are also interested. Both have passed legislation intended to bring more ethical practices to the treatment of personal information, and more such regulation is sure to come.
As the philosopher Will Durant once pointed out, “we are what we repeatedly do.” His point was that excellence is habitual, but he could have just as easily been talking about ethics—the key to being ethical lies in habits. But what, on a practical level, does that mean? Many years ago, I learned from a very wise man, Rabbi Charles Kroloff, that ethics can be less bedeviling if you get in the routine of asking yourself a series of questions when contemplating a dubious course of action. Here they are.
How does this thing you’re thinking of doing square with accepted codes of conduct?
Is it legal? Is it allowed by your company’s policies? Is it consistent with your industry’s ethical standards? The market research world has no shortage of codes of conduct, and they’re worth reviewing periodically (I’ve included links to several at the end of this post). Personally, I’m a big fan of the golden rule – “do unto others as you would have them do unto you” – do you risk violating that? You might decide to proceed despite your intended action being against some rule or other, but it’s a good idea to devote a few minutes’ thought to whether this is wise.
How would you feel if your actions became widely known?
Would you be comfortable with that? Would you want your clients, or colleagues, or friends to know? How about your spouse, or kids, or siblings? What if your mother found out? If the idea of people whose opinions you value knowing what you’ve done makes you uneasy, that’s a pretty big red flag—proceed with caution. I’ve never accepted a finder’s fee for referring another supplier to one of my clients, because I wouldn’t be comfortable with my client knowing about it.
What if everybody did this?
Would the world be a better place, or would it be diminished? Dropping your used gloves on the asphalt isn’t a big deal if you’re the only one who does it, but if everyone coming out of the store follows suit, things will get messy in a hurry. Misleading research participants about the nature of the study in which they’re participating might not do too much damage if you only do it occasionally, but if we all start doing it all the time, the whole research world could blow up. If your contemplated action being emulated by everybody in your industry would cause problems, it’s probably not OK for you to do it either.
What are the potential consequences of your actions?
This is probably the most important question of all: what could happen if you do this thing? Do you risk prison? A fine? A lawsuit? Losing your job? Being run out of town on a rail? Dirty looks from people you don’t care about? Another great philosopher, Maimonides, once wrote that “a wise man is one who knows the consequences of his actions.” I think we often find ourselves in sticky situations because we didn’t take the time to think carefully about where our actions might lead. Avoiding the conscious contemplation of outcomes is a very human thing to do—it’s uncomfortable, and often requires you to rethink your plans. That’s why this is a particularly valuable habit to develop. I’ve always suspected that the Enron disaster might have been averted if somebody had just said, ‘hey everybody, there’s no way this ends well.’
So, there you have it: four simple questions for worrisome situations. If you make them a habit, you’ll make ethical behavior habitual as well.
Links to various research industry codes of conduct:
https://www.qrca.org/page/ethics_practices
https://www.esomar.org/what-we-do/code-guidelines
https://www.insightsassociation.org/issues-policies/insights-association-code-standards-and-ethics-market-research-and-data-analytics-0
https://www.intellus.org/Standards-Guidelines/Code-of-Conduct
https://www.ama.org/codes-of-conduct/
Every Guy Has a Plan, Until He Gets Punched in the Mouth.
One of the 20th century’s great philosophers, Mike Tyson, said that. And while I definitely wouldn’t argue with Mike, I’m a big advocate of planning data analysis. If you know me, you’ve probably figured out that I don’t think analysis gets enough attention when it comes to qualitative research. We spend a lot of time planning how we will recruit and conduct research, but then take a seat-of-the-pants attitude towards analyzing our data. And this is a problem, because not planning your analysis can profoundly compromise the value of your research.
Qualitative analysis can be divided into five stages:
- Planning
- Debriefing
- Consolidation
- Unstructured musing
- Structured analysis
I’ll provide my thinking on some of these analysis stages in future newsletters. I’ve already written a piece on consolidation, but in this one I’m going to drill down on Planning.
An analysis plan should be a part of any comprehensive research design, qualitative or quantitative. In fact, the more effort you put into planning your analysis while you’re designing your research, the better your analysis will be. Planning has become even more important in recent years, as timelines get shorter, and as researchers are increasingly expected to analyze data from a variety of sources. With all that in mind, here are eight key best practices to follow when planning your analysis:
Consider research objectives and resultant decisions
Every study is (or, at least, should be) driven by business issues. Typically, clients have one or more decisions they will make on the basis of the research findings. Your design and analysis plan should directly address each of those issues and decisions. If it doesn’t show clearly how the data you gather will speak to those issues and guide those decisions, something’s missing.
Set action standards
Action standards tell you what you need to see in your data in order to trigger a specific decision. They often take the form of thresholds for metrics like brand awareness, purchase intent, preference, and so forth. It’s crucial that action standards be established BEFORE the research—it’s not OK to eyeball the data after the fact and settle upon action standards then.
Provide structure to your data
Qualitative research is different from quantitative in that the data is, by its nature, less structured. Nevertheless, there are ways to provide some structure, and the more organization you can give to your data, the easier your analysis is going to be. Some ways to provide structure for in-person and webcam research include:
- Written reaction exercises
- Respondent worksheets and polling questions to capture reactions to stimuli
- Respondent markups of research stimuli
- Waiting room questionnaires
- Note-taking worksheets or easel pads for observers and moderators
For online bulletin boards, structure is even more important, as they yield so much data (I conducted a board late last year that generated a 1,200-page transcript and nearly 700 images and videos). Some ways to organize that data include:
- Tagging participants prior to the research on the basis of demographics, attitudes and behaviors (such as ‘three or more kids in home,’ ‘concerned with environmental sustainability,’ or ‘online category shopper’)
- Tagging responses as they come in, such as ‘concept positive,’ ‘brand negative,’ ‘concerned about cost,’ etc.
- Applying A.I.-enabled textual analytics, which can be invaluable for surfacing trends in the data that you might otherwise never pick up (see the sketch after this list)
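To make the tagging idea concrete, here is a minimal sketch of what keyword-based response tagging and a simple tag-frequency summary might look like once a board’s responses are exported; the tag names, keyword lists and data layout are hypothetical, and dedicated platforms and A.I. text-analytics tools are far more sophisticated than this.

```python
# Minimal sketch: keyword-based tagging of exported bulletin-board responses,
# followed by a tag-frequency summary. Tags, keywords and data are hypothetical.
from collections import Counter

TAG_KEYWORDS = {
    "concept positive": ["love", "like", "would buy", "great idea"],
    "brand negative": ["don't trust", "disappointed", "cheap"],
    "concerned about cost": ["expensive", "price", "can't afford"],
}

responses = [
    {"participant": "P01", "text": "I love the idea, but it seems expensive."},
    {"participant": "P02", "text": "Great idea. I would buy this for my kids."},
    {"participant": "P03", "text": "I'm disappointed in the brand lately."},
]

def tag_response(text):
    """Return every tag whose keywords appear in the response text."""
    lowered = text.lower()
    return [tag for tag, words in TAG_KEYWORDS.items()
            if any(word in lowered for word in words)]

# Attach tags to each response as it is processed, and tally tag frequencies
tag_counts = Counter()
for r in responses:
    r["tags"] = tag_response(r["text"])
    tag_counts.update(r["tags"])

for tag, count in tag_counts.most_common():
    print(f"{tag}: {count} responses")
```

Even a crude pass like this shows why tagging as you go pays off: by the time the board closes, every response already carries the labels you will want to filter and count during analysis.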
Furthermore, how you divide up the sample for your research isn’t just important for group dynamics. Splitting up your groups or boards by demographics or behaviors can make your data better organized, as you’ll have more coherent and focused conversations.
Identify possible tools
It’s important to apply analytical models and frameworks to your data. Thinking in advance about what tools might be relevant could influence how you design and conduct your research. Exactly what tools you’ll end up using might change once you’ve gathered your data, but considering tools up front will give you the opportunity to plan conversations and exercises that will lead to fruitful analysis.
Focus your stimuli
The mark of a good research stimulus, be it a product concept, an advertising storyboard, a packaging prototype, whatever, is that it will generate readable responses. This means focus. Concepts should focus on a single benefit, storyboards should focus on one clear selling proposition, and so forth. Focused stimuli will yield focused, easily analyzable responses.
Create a deliverable outline
I’m a big believer in drafting a final report outline while designing the study. This is a good way to set expectations as to what the deliverable will cover, and also provides a roadmap for your analysis plan. This outline represents the blanks you’re going to have to fill in, and knowing that in advance is a good way to make sure your research is focused on actionable findings, insights and recommendations.
Think in advance about the role of multiple data sources
The research studies I’m involved with increasingly generate data from a variety of disparate sources. These can include:
- Conversation
- Creative and projective techniques
- Narratology
- Textual analysis
- Biometrics
- Quantitative survey data
- Syndicated data
- Big data analytics