The Marketing Research Shared Interest Group (SIG) of the Cincinnati American Marketing Association meets monthly to discuss industry issues, emerging trends, techniques and methodologies. During the February meeting, Brian Lamar from EMI Research Services led a lively discussion spanning several industry topics. One common thread ran through all the key points: clients.
- How will marketing research fully shift to mobile-first design? Overall, the industry has been slow to adopt mobile-first, device-agnostic survey designs. In most cases, a survey programmed mobile first will look good on all devices. Client education and engagement are key to this shift. That includes sharing data such as the rate of panel joins via mobile devices and the dropout rate among mobile survey takers, as well as requiring that clients take their own survey on a mobile device so they see what respondents are seeing.
- Do new technologies eliminate the need for research expertise? New technology, including automation and DIY toolkits, requires less human interaction; however, it also pushes responsibility directly onto the client. Many areas of marketing research are as much an art as a science. Clients need to think carefully about when expertise is required and when the risk of doing it themselves is too high.
- Do longer surveys capture a representative sample? Today, some marketing research companies claim they can get good data from extremely long surveys, but are they really getting a representative sample? Are the respondents who agree to long surveys fundamentally different from those only willing to take shorter ones? Are respondents over-incented on longer projects, creating additional bias?
- What is the impact of political polling on our industry? You can't go a day without hearing something about the upcoming presidential election. With its conflicting results, political polling can negatively impact the public's perception of marketing research and influence future participation. Most people never see marketing research results, but polling is highly visible, so it often becomes the face of all survey research.
- How can we improve quality? Clients assume the quality is there, but they need to ask questions and explore what quality checks are in place. They should do this on every project, not just when they get results they didn't expect. A multi-strike approach to quality checks is important: don't throw a respondent out for a single strike. Look for a pattern, so perfectly good respondents aren't accidentally thrown out. Clients also need to recognize the role survey design plays in quality. Frequently, the issue isn't the respondents but the survey itself creating bad behavior. How would you act after answering five long grid questions in a row?
- Is faster better? There is always pressure to finish research faster, and it is always possible to throw a large amount of sample at a project and complete it in a short time frame. But is that the right thing to do? This is an area where new research is needed. Past research-on-research has shown differences between the types of respondents who answer early in the field period and those who answer later, differences that could lead to different business decisions.
- How can we identify bad research? Most marketing researchers can quickly think of examples of bad research: poorly worded questions, double-barreled questions and unscientific news polls. Some of this is due to more non-traditional researchers designing studies, but it can also stem from the pressure to get in and out of field quickly. One person talked about the lost art of the telephone briefing, where all the interviewers get together, go through the entire survey, voice concerns and raise problems. Imagine if we did this with online surveys! We could greatly improve quality.