Customer surveys are a powerful tool. When conducted properly, they give you a better understanding of what drives your prospects’ and customers’ behavior, and you can use that information to improve virtually every aspect of your business. But surveys are difficult to pull off. So we asked Kevin Jenné, a research director for ChannelHarvest Research and Aartrijk, for advice. Before his current gigs, Jenné led research and insights teams at Liberty Mutual/Safeco Insurance and Washington Mutual Bank. He also holds the Insights Professional Certification (master level) from the Insights Association. Let’s dive in.
Connecticut Innovations: Great to talk with you, Kevin. Let’s clear something up: Market research is not a marketing discipline, correct?
Kevin Jenné: Correct. It’s more of a social science discipline. It has a lot more in common with behavioral psychology or sociology, and that’s the kind of academic background we find in a lot of research professionals. Whether we’re talking about qualitative or quantitative research, practitioners approach the work from a scientific perspective, very often developing hypotheses, generating data to support or refute those hypotheses, and drawing conclusions (and, we hope, insights) that enable businesses to make better decisions. When someone says to me, “Oh, you’re in marketing,” I like to say, “No, we’re the boring number crunchers. The marketing people are the cool kids, but they do invite us to their parties.” They invite us because when we’re good at what we do, we provide factual, supportable findings, conclusions and recommendations that help our stakeholders succeed. And we do have many stakeholders beyond marketing. These include product development, operations, distribution planning, customer service, strategy leaders, human resources—really, anyone who needs to understand how people think and make choices.
CI: You’ve said that a great survey starts with the end in mind. Can you explain?
KJ: Very often we’re asked, “Can you help us run a survey?” That’s really step three. Step one is to understand what decision you need to make. Without knowing what you want to do with what you learn, it’s very easy to run a survey, look at the results and ask yourself, “Okay, now what do we do?” So it’s super important to know what you’re trying to accomplish with any kind of research—whether that’s choosing among products to develop, figuring out what your most valuable customers want, deciding how to price your offering, picking the best ad copy, or whatever. Step two is determining what information you need to make that decision. Often your customers or prospects have that information, and that’s where research comes in. Step three is figuring out how to get the information you need.
CI: How do you go about market research? Are there different types?
KJ: Broadly, research is either qualitative or quantitative. On the qualitative side, it might be best to have open-ended conversations with a variety of people—generally at least 10, and more likely 20. Set aside your beliefs and biases and really listen to what people have to say. Don’t try to analyze along the way—keep good notes, keep listening and keep an open mind. When you’re done listening, go back over what you heard and look for the patterns—see what several people said, not just the one-off comments that caught your attention.
So that’s qualitative. Quantitative research is what we’re talking about with surveys of sufficiently large, randomly selected samples of the population you want to understand. What’s sufficient? The more you get, the more confident you can be in the responses, but generally you probably want at least 200. Survey research is a serious scientific discipline, and entire books have been written on it, but probably the most important thing I can tell you is, try to set aside your biases as best you can and seek the truth with your questions. Ask questions that your respondents can reasonably answer. For example, don’t ask how much they’d be willing to pay for your new cool thing, because, while they will try to give you a guess, they really have no idea. Ask how likely they’d be to pay $X for it. Then, they can answer with some grounding in reality.
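To make “sufficient” a little more concrete, here’s a rough illustration (ours, not Jenné’s) using the standard margin-of-error formula for a sample proportion. It shows why a couple hundred responses is a common floor, and why piling on more respondents buys confidence only slowly.

```python
# Rough illustration (not from the interview): 95% margin of error for a
# sample proportion, using the worst case p = 0.5.
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of the 95% confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 200, 400, 1000):
    print(f"n = {n:4d}: about ±{margin_of_error(n):.1%}")
# n =  100: about ±9.8%
# n =  200: about ±6.9%
# n =  400: about ±4.9%
# n = 1000: about ±3.1%
```

Note that quadrupling the sample from 200 to 800 only roughly halves the margin of error, which is why “the more you get, the more confident you can be” comes with diminishing returns.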
CI: Too many companies assume the loudest voices in the room speak for everyone or that a one-off conversation means you have a problem. How can you avoid taking this feedback as gospel?
KJ: Excellent question. As a researcher, I like asking questions, so that’s what I do in those conversations: “That’s an interesting perspective that you heard this person say X. I wonder how common that is. Do most people feel that way, or only a few?” An honest response would be, “I don’t really know.” That can be the cue to go find out. Of course, honest responses are less common than we’d like. We often get: “Oh, I hear this all the time.” A good response might be, “What if we take action on that, and it’s not really the majority view? What might we be leaving out, or missing out on? What other opportunities might there be? How common are they? How expensive is it to solve that problem you just raised, and are there other things we can pursue that are less expensive, or less risky?” Try to get some different possibilities on the table, and then see if good research can help you choose among those possibilities.
CI: Excellent advice. How about demographic data—how much is too much to ask for?
KJ: Generally speaking, ask only for the demographics you actually plan to slice your data by, and no more. The more questions you ask, the longer your survey will be, which means people will be less likely to stick with it. Also, you never want to start with really personal questions; people are very sensitive to privacy concerns today. Questions about income, ethnic background and (nowadays) gender and orientation are best asked, if they’re truly needed, after your main questions, once you’ve established in respondents’ minds that you really are conducting meaningful research and care about their answers. For sensitive questions, always include a “prefer not to answer” choice. If you don’t, they won’t answer anyway—by exiting your survey.
CI: Do you get better results if people don’t know who’s behind the survey?
KJ: This is really a double-edged sword. It’s generally known that people give better-quality answers when they don’t know who’s asking; respondents can bias their answers, even unconsciously, based on their experience with the sponsor. The other edge, though, is that people are more likely to choose to participate when they do know who’s asking. Anonymous surveys generally get perhaps 80 percent fewer responses than those where the sponsor is identified, because people have no good reason to give their opinions. Usually, I think it’s best to identify yourself, and in the invitation, include language like “We really are looking for your honest, candid feedback. Your responses are confidential (or anonymous) so we can hear what you really think.”
CI: Survey fatigue is real. What are your best tips for getting people to cross the finish line?
KJ: There are some pretty straightforward tips for this. First, keep it brief. We used to say that 15 minutes was as long as you could go, but it’s now more like 5–8 minutes. Be ruthless with your content. Ask what you need to know to make the decisions in front of you, and avoid the temptation to ask other stuff “as long as we’ve got them.”
Second, avoid long grids of questions. It’s easy with most survey software to put in a grid of questions using the same scale (very important, somewhat important, etc.). The temptation there is to keep adding different things to the list, but long-grid questions are daunting, and they make people quit. They also don’t display well on mobile devices, which is where most consumer surveys are completed. Mobile devices also don’t show anything with more than four or five scale points well, so skip the 10-point scales. They have worse predictive value than four- or five-point scales anyway, and they look terrible.
Give the survey a sensible organization so that things flow well. You might have sections that deal with your product, your service and an “About You” section. Breaking things up in an intuitive way gives people a sense of accomplishment and motivates them to finish the remaining sections.
Short questions are better than long ones. You may be thinking about every aspect of your product or problem, but your survey respondents are not nearly as interested as you are. Put yourself in their shoes and ask questions as simply as possible. Test them on someone who isn’t as engaged as you are (like a relative) and see whether they understand the questions the way you meant them.
CI: What is scale bias, and how do you overcome it?
KJ: In brief, scale bias is the way the design of a rating scale (its length, labels and direction, and how different people interpret it) can systematically shift the answers you get. Beyond that, there is just no way I can address it fully here; it’s a big subject and one I struggle with today. I addressed a bit of it, regarding NPS, in this post in the paragraphs “Apples to apricots?” and “Bias is practically guaranteed.”
CI: Okay, it’s pretty clear that surveys are hard to pull off. When should we call in an expert?
KJ: This is largely about the consequences of getting it wrong. If the results are really flawed because of things you don’t know about research, what’s the outcome? If it’s “aw shucks, we can’t use these findings, guess we’ll try again,” then no real harm. If it puts your business or your next round of financing or your own job at risk, it might be time to call in a professional.
CI: What should you look for in an expert? And what should you expect to pay?
KJ: Research professionals generally tend to have expertise in one or a few industries; it’s hard to be conversant in customer needs and choice factors in accounting software, SUVs and kitty litter. If a research agency’s website says, “we can do it all,” it’s probably a huge company, and you’ll never matter more to them than what you can pay. Consider a smaller agency with category knowledge appropriate for your business.
Recognize that a good researcher will want to really understand what you’re trying to accomplish, so they’ll ask you a lot of questions, not all of which you’re expected to know going in. If they ask three questions and promise you a bid, they probably don’t know enough about your business to give you truly useful information. You’re looking for curiosity and competence in a research agency—this is a consultant, not a mechanic.
If you’re serious about a given research agency and want to be sure, ask them to give you a de-branded example of their work. This will help you know what they can do for you, and what you’re paying for.
Research projects vary widely, of course, but to give you an idea: For a focus group, you’re probably looking at about $8,000–10,000 per pair of groups (and you always want groups conducted in pairs of similar makeup). For one-on-one interviews (we call them in-depth interviews, or IDIs), for consumers they’re going to be $200–$300 each, ranging up to $500 for business customers. As mentioned before, probably at least 10 and more likely 20 are needed. For surveys, there are many factors affecting the costs, but you’re probably looking at $15,000–$25,000 for a consumer survey. For any of these, those prices will include figuring out the line of questioning, developing the actual moderator’s guide or survey questionnaire, getting the people to participate (which often involves incentives), analyzing the data and giving you a solid interpretation of the results and what they mean for your business.