As a lifelong market researcher, I couldn’t agree more with Robert Weissberg’s exposé of the flaws of political polling (“Shadowmetrics,” February 1996). But Professor Weissberg did not include a list of embarrassing questions with which to challenge spurious data, so here is one:
1. Who is included in the intended target population? Age, sex, ethnic group, voting status, geography, etc.
2. How was the sample drawn? Phone books, membership lists, random numbers, etc.
3. What was the connect rate? The percentage of calls reaching a human.
4. What was the refusal rate? The percentage of humans refusing to answer.
5. What was the completion rate? The percentage of respondents who endured the (often overly long) interview all the way to the end. If the answers to questions 3 through 5 were 33, 20, and 75 percent, for example, the net result is a survey based on the answers of only two out of every ten people on the original list (see the worked arithmetic after this list).
6. What were the callback instructions to the interviewers? A properly designed study will include instructions to call back at least once, preferably twice, to reach the respondents originally targeted. Prompt substitution of “alternates” is a good way to lose control over the sample design.
7. Were any screening questions included, and if so, what were they? Screening questions will help determine whether the person reached actually belongs in the sample, and if appropriate, whether he or she has any knowledge of the subject of the interview. Examples: When did you last vote, if ever? What was your exposure to print and electronic media? What was your experience with a certain program/service/product?
8. How was the interview staff trained and supervised? At a minimum, the field supervisor should review the interview guide with the staff, or the electronic equivalent of that review should take place. Interviewers must be supervised to guard against phantom interviews.
9. During what hours of the day or days of the week was the bulk of the data collected? If most of the completed calls were made during the day, the sample is heavily weighted toward retired people, non-employed mothers with babies, and people on welfare, and it excludes the critically important group of wage earners.
10. What, if anything, was done with comments that fell outside the script?
11. Was the script debugged (field-tested) before the project was started? On how many targets?
12. What were the interviewer’s instructions once he or she was connected to the target phone number? Ask for the head of household, the man, or the woman; verify whether the respondent lives there or is merely answering the phone; etc.
13. Ask for a copy of the interview script. Most polling organizations will refuse, as the script is a made-to-order guide for finding flaws in the research design, the findings, or both.
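To make the attrition arithmetic in item 5 explicit, here is a rough sketch that treats the three rates as successive, independent stages of attrition (a simplifying assumption; the figures are the hypothetical ones given above):

\[
\underbrace{0.33}_{\text{connected}} \times \underbrace{(1 - 0.20)}_{\text{did not refuse}} \times \underbrace{0.75}_{\text{completed}} \approx 0.198,
\]

or roughly two completed interviews for every ten numbers on the original list.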
—Job Lulling Prak
Tucson, AZ