Friday, November 14, 2008

What a Lamppost Uses a Drunk For.

A friend of mine told me that, though he received his law degree from Vandy, he had to do extra study at Harvard before he became a professor. I asked him why he had to go to Harvard and he explained that his bosses wanted to make sure that in a courtroom he could correctly identify who was the plaintiff and who was the defendant. So now you have your highbrow humor for the day. The appropriate reaction is a slight snicker.

My own education had a gauntlet of sorts, though I imagine nowhere near as perilous as an LL.M. from Harvard Law. For me, it was the two research classes required of everyone in the University of Georgia's very research-oriented psychology program.

I took the first class, Research Design, in the summer, when I had quite possibly the sweetest living arrangement in town. I was a couple of blocks from the building where I would be taking my one class for the summer, paid nothing for utilities, and the rent...$50. No, that's not per month. That was the cost for the whole summer. And when you're saving that much money in rent, you have excess funds to spend on other college activities, like taking your best girl to the picture show and singing festive songs at the local orphanage. Yeah, right. We partied like the members of Motley Crue only wished they could.

The professor was one of those laid-back PhD candidates who basically promised you'd get an A anyway, but asked that you please show up so his review board wouldn't get suspicious. Despite our encouraged lackadaisical attitude, however, designing research was interesting. Between rising at the crack of noon and raucous summer bashes, we learned how to create studies, control variables and make comparisons. I wished my entire education had gone just like that.

But then came the second class, Research Analysis. Our teacher was nice enough, although a bit strange. He was a PhD candidate who didn't really seem to like or be interested in people, an odd characteristic for a psychologist. The class was at dawn, and by that time I had moved into a house miles away, meaning I had to fight the other twenty thousand students for what were apparently the 30 allocated parking spaces. And the class was horrible. With the analysis of research, all the energy and enjoyment of designing research drained out into the nearby Oconee River. Instead of using research to investigate and solve problems, we now seemed to be finding ways to use research to create new problems, which solved very little. Except we could now point at the amazing complexity of the process we used to say nothing.

As I yearned for the summer days when we pontificated about confounds and null hypotheses, I questioned why we were taught the process in two parts. It would seem more logical to teach the design and analysis of each research method together, along with a variety of methods, rather than break the process apart. Yes, to do that would have been logical. But to put windows in the psychology building in some regular pattern would also have been logical. Instead, the university asked professors if they wanted a window; if they said yes, they got one. If they said no, no window, and the resulting building looked like it was designed by 8-year-olds who had just floated a keg of soda. Logic has nothing to do with it.

And so it often goes in the wild world of marketing research. The design of studies and the interpretation of results too often seem disjointed. And if that is not enough to question validity, many studies start out with no real defined subject, only to follow up the act with an ethereal interpretation designed to support a pre-existing notion. As the late, great David Ogilvy said, marketers often use research like a drunk uses a lamppost: for support rather than illumination.

I routinely see studies designed to measure broad, barely defined concepts using a tiny pool of participants and saddled with obvious confounds, such as an Internet survey meant to study people who don't use the Internet.

Now for the record, I have worked with some of the most fantastic researchers in the marketing business. Yet even with great research, I have seen managers cherry-pick the results and ignore a huge swath of important data. I have sat in the room and watched my friend Jim Nelems tell a company's leaders to their faces what the issues were, only to watch said leaders either try to discount the findings or ignore them altogether.

I am no research expert. But I am practical and I don't commission research unless I have questions I want answered. And when I do get data, I don't reengineer the questions to fit the results. In the studies I have been involved with, I have observed some amazing technique and skill and, while I could detail all the things I have liked, I think this subject is best addressed by the strongest lessons I learned. These few tidbits stand out to me either because they were not what I was expecting or because they fundamentally shaped the role I see for marketing research.

Better road map than treasure map.
One particularly terrible mistake marketers make is assuming research will predict the future. Sure, we test products and flavors and advertising, and that gives us direction on where to put money, time and effort. But what research cannot do is control all the variables. In the lab, a consumer can tell you they prefer a flavor of cough drop. In the real world, you don't know how far it is from their house to the drugstore that carries your cough drop. This is not to say the research is incorrect. You asked the consumers if they liked the flavor and they said, "Yes." There is a considerable difference, however, between liking the flavor and buying the product.

In my experience, research does a far better job of defining the problem than of offering the solution. This notion should affect where marketers place the role of research. In at least a few efforts, I have observed marketers develop a product and then attempt to use research to affirm certain notions. Instead, well-conducted marketing research should have informed the creation of the product, putting it more in line with consumer preference. While some companies do this, the vast majority do not.


Better fairy godmother than magic genie in a lamp.
The genie grants you three wishes and goes back in the lamp until the next wisher happens along. Similarly, plenty of marketers rub the lamp, ask their researchers a few questions, and then stuff them right back in. Research, however, can advise and inform on all phases of a product's life cycle. Research can uncover the attributes consumers desire most. It can test positioning strategies and communications. Why put the genie back in the lamp when it can grant all these wishes as well:

Benchmarking and testing of strategies in development.
Strategy testing all along the consumer marketing spectrum.
Verification that strategy derived from research remains aligned and has not skewed.
The better idea is to treat your research as a fairy godmother. She acts more like a counselor, helping to inform decisions and granting the occasional wish when needed. Just don't ask for the world and then ignore her advice. She might turn you into a pumpkin.


Doesn't always have to be in a room with one-way glass and lots of M&Ms.
The best researchers can pull data out of anywhere. They can read the b-roll from your corporate videos. They can make inferences from the distribution of your brand magazine. They can diagnose issues from past efforts. I have seen focus groups take place at a customer's house. I have sat in a restaurant and observed customers for an upcoming in-store promotion.

The belief that research must always look like research is a self-imposed restriction. While the controlled setting has its value, one should not eschew the whole other world out there.

In today's marketing world, boundaries are falling. Sales and marketing departments have not only signaled a cease-fire, they are beginning to work together. Finance and marketing are beginning to teach each other a common language. And so, the world of research should drop the curtain and begin to better integrate into the whole workings of an organization. Research is a primary window into the life of customers. For the customer-led firm, research is not just an expense or luxury. In fact, a lack of research can be a liability.

They still teach Research Design and Research Analysis separately at the University of Georgia, despite plenty of complaints. I told them that methods should be taught alongside data analysis, and I think I made a decent case for connecting the means of gathering data and the means of understanding it. I put all my ideas on the end-of-course survey.

A survey, I imagine, nobody ever read.
