We’re filling out a survey from our homeowner’s insurance company. I guess they want to make sure that we’ve got ample coverage in case a party gets out of hand and we need to rebuild Rancho Deluxe. One of the questions reads as follows:
Percentage of the interior walls that are plaster
Hmm. Is that the percentage by the number of walls, by plaster's share of total wall square footage, or something else? After all, in a rectangular room where one long wall is plaster, the right answer might be 25% by wall count but, say, 40% by square footage. How accurate does this response need to be?
There’s actually an excellent business point buried in that silliness. It’s not enough to ask the right questions. We also need to ask them the right way so we get expected, actionable data. In the example above, my answer alone isn’t much of a data set, but once it’s aggregated with everyone else’s, the error will be larger than it needs to be, because half the respondents may read the question one way and half the other.
Obviously, a lack of clarity isn’t the only thing that can undermine the outcome and usefulness of your research. Leading questions that all but guarantee a particular response are bad too (“Do you do XYZ every day?”). So are open-ended questions, since there’s no guarantee anyone will focus on the specific area you’re researching. Then there are the surveys with overlapping answer ranges (“How old are you: 18-21, 21-30?” How does a 21-year-old respond?). And the loaded questions (“How long ago did you stop beating your spouse?”).
Asking questions is really important, but asking badly structured questions is a waste of time. Clear?