(Originally posted: Friday, December 13th, 2013)
In Mail and Internet Surveys, the authors state that surveys consist of two languages: the language of words and the language of navigational layout. This post addresses the former. In Asking Questions, the authors define surveys as “a conversation with purpose.” In creating a survey, you must pay close attention to the precise wording of questions. Below are some guidelines, mostly adapted from the two books referenced above and my personal experience in working with surveys.
1. Don’t force people to provide information they don’t have.
Survey responses are subject to social desirability bias – people want to look good, even in anonymous surveys (see Asking Questions for more about this). If you ask students to report their parents’ income, include a ‘don’t know’ option so students know it is okay not to know. Otherwise, they could skip the question, make up an answer, or quit the survey entirely. If your questions are very sensitive and you have a reasonable suspicion that people won’t provide reliable information, don’t waste people’s time with a survey.
2. Avoid double-barreled questions.
Here is an example of this common mistake: “The Student Union cafeteria has the best food and coffee on campus.” I really like Dunkin’ Donuts coffee, but I don’t normally eat there, even though I like the food. If I felt the same way about the Union cafeteria, there would be no way for me to answer this question (unless I felt the cafeteria had the best of both).
Here is another example: “Provide a rating of the vendor’s presentation for content, transparency, clarity, and organization.” A presentation can be well-organized but have bad content, or very transparent but poorly organized. These should be four separate questions.
A final example: “How can the admissions application process be made simpler or more valuable?” This question implies that just because something is simple, it has more value, and that in order for something to have value, it must be simple. Neither assumption is remotely true all the time. As with the first two examples, the solution is to split this into separate questions.
3. Avoid abbreviations and obscure words.
I once took a survey about how I use my bank. One of the questions asked how important I felt “ACH services” are. I have absolutely no idea what ACH services are or what the acronym even means. This might have been fine if there were a “don’t know” or similar option, but the survey only allowed five options, on a scale from “extremely important” to “not at all important.”
Don’t take what people know for granted. I would even avoid acronyms that may be assumed to be universally known, like NCAA or CIA. You are going to have a lot of people like Hank, the Out of Touch Dad, taking your surveys, and you don’t want to dismiss their input.
4. Tailor questions to the frequency of behavior.
Consider these questions:
– How many times did you meet with your academic adviser last academic year?
– How many parking tickets did you receive in your four years of college?
– How often do you brush your teeth in any given week?
Think about the different mental processes involved in these questions. For the first question, people have to think back over the entire year, retrieve the information from memory, and count. For many people, this is simply too much work. A solution is to provide several number ranges (0, 1-3, 4-6, etc.). Ranges help because they aid recall and make estimation easier.
As an aside, I once consulted with a program where the person in charge was adamant about listing individual numbers (1, 2, 3, 4, etc.) instead of ranges because he wanted to be precise and specific. I explained that this would not help respondents: it forces them to count, and the pressure to give a precise answer can lead them to skip the question or make up an answer so they don’t appear to be wrong (see the social desirability rule above).
As the authors in Asking Questions explain, brushing one’s teeth is a pretty routine activity for most people. Thus, people might think about the past few days or week. With questions like this, many are tempted to ask about the past month or even year. Don’t do it. Very few people are going to think about the year – most will think about the past day or week, and multiply anyway.
5. Avoid loaded questions.
Consider this loaded question: “Tuition has been raised by 127% in the last 10 years. Do you favor allowing the administration to raise tuition, or should the board listen to student concerns and lower tuition to save students money?” By disagreeing, a respondent is effectively saying they don’t want to save students money. The framing may also be misleading – perhaps tuition was flat for the last five years.
Here is a more subtle hypothetical one: “Do you favor streamlining business processes, specifically by implementing the new XYZ technology solution?” I don’t know many people who are opposed to streamlining business processes, but there may be varied opinions about the specific solution, many of which have nothing to do with technology. Answering no implies one is against streamlining business processes. The bottom line is that you shouldn’t lead respondents to an answer.
6. Think carefully about using open-ended questions.
Open-ended questions have obvious advantages and generally provide rich information, but they also have problems. I am of the opinion that, in an online format at least, they should be used sparingly. Look at these two questions:
– Please describe all the ways that you have used the advising office in the past year.
– Tell us why you chose to attend this university.
One can get a lot of detailed and rich data from these questions. Think about all of the economic, sociological, psychological, financial, and emotional factors that go into selecting a college. However, that is also the second question’s downfall. It places too much burden on the respondent’s memory and requires them to think too hard. Open-ended questions also take a lot of time to answer, which limits the space for asking other questions. One workaround is to give the respondent a list of cues or choices, and follow up with a comment box.
If the goal of your evaluation is to understand why students choose a certain college, then you would probably get better answers through an interview or some kind of structured conversation. If the goal of the evaluation is to get feedback about an entire group of students in order to make broad program changes, then a quick survey is probably better. Open-ended questions are certainly appropriate, but again should be used sparingly.
7. Try to avoid negatives and especially double negatives.
This should be obvious, and here is an example: “There are not many fun things to do on campus that do not involve alcohol.” Imagine how many times a student would have to read this to indicate whether they agree or not. A better question might be: “There are many fun alcohol-free things to do on campus – agree or disagree.”
8. Put the easiest questions first and sensitive questions last.
This eases people into the survey. If they are hit with a sensitive question first (e.g., “Have you ever been arrested?”), they may feel nervous about the rest of the survey. An exception to this rule might be if you plan on screening people for the rest of the survey based on a criterion (e.g., people who have been arrested versus those who have not).
9. Try to list answers vertically, not horizontally.
Here is an example: Please indicate your year in school: ____ First-year ____ Sophomore ____ Junior ____ Senior
The problem should be obvious. Some people are going to place an “x” in the wrong spot.
10. Try to avoid words in questions that have subjective or vague meanings.
Here is an example from above: “How can the admissions application process be made simpler or more valuable?” What constitutes “simple” and what constitutes “valuable” is subjective. Consider this statement:
“The 5K race was simple and valuable.” For me, a 5K race is difficult but very valuable. For a seasoned marathon runner, a 5K race is very simple, and they might find it a waste of time and place little value in it. “Value” is also a broad term in terms of what it constitutes. Was the race valuable because it raised money for a good cause, or valuable to me personally because it was an incentive to exercise?
Here is another example. I know absolutely nothing about woodworking and am not generally a handy person, but several years ago I built a deck just to learn. It took me about twice as long as it would a more experienced person, and it was definitely not simple or fast. I spent the entire summer working on it, and digging the post holes was very hard work. So it was not simple and not easy, but I consider it a very valuable experience. The point is that we should not let our own assumptions about what constitutes value, or even simplicity, drive a survey question. Again, keep your own assumptions out of the survey design process.
Here is a final example:
The application process is:
Frustrating   1   2   3   4   5   6   7   Requires little effort
The assumption made in this question is that if something takes effort, it is naturally frustrating. While some or even many people may feel this way, it may not be true for everyone.
One may think the solution is to make both anchors the same (“highly frustrating” to “not frustrating at all”). This does not solve the problem – frustration is a subjective term. I don’t mind waiting 60 minutes for a table at a restaurant, but anything above 15 minutes is incredibly frustrating for others. Here is a solution: rather than ask about level of frustration, ask about wait time (“how many minutes did you wait?”) and then follow up with a question about frustration.
Here is an example of a vague question: “How often do you attend campus activities? Never, rarely, sometimes, often, all the time.” Twice a week might be rare for one student and often for another. The solution is to ask, “How many campus activities did you attend in the last month?”
11. Develop mutually exclusive responses.
Look at this question and the answers: “During your first year in college, where did you spend most of your time?” Potential answers:
– residence hall room
– computer lab
– friend’s room
– off-campus location
– residence hall study lounge
The problem should be obvious. A friend’s room could also be a residence hall room or an off-campus location. The computer lab could be in the library – which one should the student choose?
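The same principle applies to numeric answer ranges like those suggested in rule 4 (0, 1-3, 4-6, etc.): the ranges must not overlap, or respondents who land on a boundary will not know which option to pick. For anyone building response options programmatically, here is a minimal Python sketch of an overlap check; the function name and example options are my own, not from any survey tool:

```python
def ranges_are_exclusive(options):
    """Return True if no two (low, high) answer ranges overlap.

    Each option is an inclusive (low, high) tuple, e.g. (1, 3) for "1-3".
    """
    ordered = sorted(options)
    # Each range must start strictly after the previous one ends.
    return all(prev_high < low
               for (_, prev_high), (low, _) in zip(ordered, ordered[1:]))

good = [(0, 0), (1, 3), (4, 6), (7, 9)]   # 0, 1-3, 4-6, 7-9
bad = [(0, 1), (1, 3), (3, 6)]            # "1" and "3" each appear twice

print(ranges_are_exclusive(good))  # True
print(ranges_are_exclusive(bad))   # False
```

A check like this is easy to run whenever the ranges change, which matters because overlapping boundaries (0-1, 1-3, …) are exactly the kind of mistake that slips in during edits.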
12. Try to put the “negative” answer choices first.
In general, try to put ‘strongly disagree’ on the far left and make the responses more agreeable toward the right. The rationale is that, due to social desirability, people want to look good in surveys and will be tempted to agree more than disagree. Putting the disagree or more “negative” options on the far left forces respondents to scan across, as opposed to immediately agreeing.