Insights vs. Metrics: Finding Meaning in Online Qualitative Research

The use of projective techniques in qualitative marketing research has become an accepted, even expected, practice in the industry.  Focus group discussions and in-depth interviews (whether face-to-face or online) are particularly suitable for activities that go beyond the question-response format.  There are any number of reasons for using projective techniques, but they essentially boil down to something like the statement from AQR: “What these techniques have in common is that they enable participants to say more about the research subject than they can say spontaneously, accessing thoughts, feelings or meanings which are not immediately available.”  Or something along the lines of tearing down walls, as from Applied Marketing Research: “Projective techniques are important in breaking through the wall of rationalizations consumers use on a daily basis to justify the purchase or likes/dislikes of products or brands.”

Projective techniques come in a variety of flavors.  In addition to those listed on the AQR site – collage, personification, bubble drawing, role playing, etc. – there are also guided imagery, picture sorts, sentence completion, tarot cards, and more.  The types of projective techniques used by researchers have grown over the years (and continue to grow), primarily because many researchers believe (although I am not one of them) that there is no limit to what is acceptable as a projective technique, and because online resources such as Pinterest have broadened the projective possibilities.

Researchers have promoted and defended their use of projective techniques based on the ability to tap into the less-public portion of people’s minds and thereby gain a ‘truer’ picture…

Defining “Marketing Research” by Scientific Principles

Terry Grapentine and Roy Teas advocate, in the spring 2012 issue of Marketing Research magazine, for a revision of the American Marketing Association’s definition of “marketing research.”  They argue that the current definition is not sufficiently grounded in scientific principles and is missing the all-important reference to theory, which they consider a key component of “knowledge creation,” which in turn “is crucial in developing marketing strategy.”  Grapentine and Teas call on textbook authors as well as the AMA to integrate the idea of theory and theory development into their discussions (and definitions) of marketing research and thereby promote a theoretical perspective – along with more scientific thinking – among marketing researchers.

What I find particularly interesting in the Grapentine-Teas plea for a theory-based approach to marketing research is the defense they mount in support of their argument – specifically, the idea that “theory expands knowledge sources.”  What I like about this is that, by embracing the research tools and methods of related disciplines such as sociology, marketing research design can bring an elevated, “holistic understanding” to our studies.  And this is good because a more encompassing approach to designing marketing research addresses the fundamental objective of our work: to understand how people think and what motivates behavior.

Grapentine and Teas also discuss the potential “barriers” to their redefinition proposal, highlighting the anticipated objection that a theoretical approach to marketing research is too academic and/or too expensive for the speed-over-quality mentality of many marketing researchers.  This may indeed spell doom for their effort, but their call for a more scientific basis for our marketing research designs, even with the acknowledgement that compromise between true scientific rigor and the reality of research in the corporate world is inevitable, is very welcome.

In June 2011 I wrote a blog post in which, not unlike Grapentine and Teas, I argued for “Taking Research Design to Higher Ground” and wondered “why researchers continue with the long-standing habit of avoiding honest experimentation and debates regarding their research methods.”  I concluded:

“The marketing research industry is jammed with talented researchers who understand great research.  Yet industry researchers have historically found themselves trapped on a never-ending wheel chasing the next research assignment, sometimes at the expense of good design.”

Like Grapentine and Teas, I encourage marketing researchers to step outside their “comfort zone” and think first and foremost about the strength of their designs.  Even if practical considerations impede a scientific path, marketing researchers owe it to themselves and the end-users they serve to question every design in terms of its ability to return reliable, valid results.

The Impact of Visual Components on Online Survey Response

Although researchers are always looking for better ways to design their studies in order to maximize cooperation and completion while minimizing item nonresponse and other sources of error, now seems to be a particularly good time to experiment with online survey design.  Just in the last week, articles on the Web from Lightspeed Research, Greg Heist at Gongos Research, and others have discussed the growing problem of declining response rates to online surveys and called for shorter, simpler, and more engaging survey designs.  Heist takes the idea of engagement a step further and contends that online research designs should strive to deliver “fun” to the respondent – “We’ve made the entire experience [of survey completion] about as much fun as a trip to the DMV.”

With survey length and grid questions contributing most to incompletes (see Lightspeed), it is reasonable to look for new ways to gain respondents’ attention and keep them attentively thoughtful throughout questionnaire completion.  Online platforms provide any number of optional features for incorporating “eye candy” into online surveys, including face scales and graphic images, as well as a slew of functions within a rich text editor.  All of which is great, as long as this visual stimulation doesn’t degrade the quality of the research and, at the very least, researchers have some understanding of how visual cues embedded in online survey designs impact response behavior.

A number of researchers have explored this issue, including Mick Couper, Frederick Conrad, and Roger Tourangeau…