The Usability Blog
A Practical Guide to User Experience Insights

Fundamental Best Practices of an Online Survey

Online surveys can be a valuable and effective part of your research efforts. Often used as a way to gather quantitative data, online surveys provide the means to gather participant demographics, opinions and ideas. These surveys are self-administered and provide an alternative to using a more structured, moderator-based methodology.

Last year, we conducted a Webinette that demonstrated some do’s and don’ts for creating online surveys. This year, we are providing similar guidelines, in more detail, in this month’s newsletter article.

  • Start by clearly understanding the research objectives. Specifically, you need to know how the data will be used and who will make use of the results. It’s also important to understand what action will be taken based on the results of each question. With this in mind, assemble a manageable group of well-qualified stakeholders to identify goals and contribute to survey content.
  • Keep your research objectives in mind when forming the questions. Your objectives should be your road map. If a question does not directly support a learning purpose, it should not be included. And, although your objectives will ideally govern the number of questions, avoid asking too many. Copious questions will cause participant fatigue and imminent bailout!
  • Use the right question type for the data you want to gather. There are several basic types of questions, with varying reasons to use them. Some of the most popular:

          a. Closed-ended question – This type of question has a predetermined set of answers from which the respondent can choose. The benefit of closed-ended questions is that they are easy to categorize and are often used in statistical analysis. The disadvantage is that they are more difficult to write than open-ended questions; they must include the question text and all the logical choices participants could give for each question. Two common types of closed-ended questions are:

               • Radio-button question – Participants are asked to choose only one selection from a list of options.

               • Checkbox question – Participants are asked to choose all selections that apply from a list of options.

          b. Open-ended question – This type of question gives participants the opportunity to answer in their own words.

Keep in mind that while responses to open-ended questions can be very valuable (and often even quotable), they can also yield vague responses that are difficult to interpret and categorize.

          c. Rating-scale question – This type of question is often used in lieu of a flat yes/no, ‘agree/disagree’, or ‘not satisfied/satisfied’ question type. In other words, it enables participants to add nuance to their opinions.

When creating a rating scale, order the rating choices from low to high, left to right. Also, avoid rating-scale questions that people could have a difficult time interpreting and therefore answering appropriately. Ensure the questions can be easily understood by using the appropriate number of points on the scale (level of granularity). Additionally, label the points clearly, especially on longer scales.

If the question does not support a high level of granularity, use a smaller scale. And if you’ve used a specific scale in past research, use the same scale again so you can directly compare results with past data.
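As a sketch, the question types described above can be modeled as small data structures that validate a respondent’s answer. The class and method names here are illustrative assumptions, not part of any particular survey tool:

```python
from dataclasses import dataclass, field

@dataclass
class RadioQuestion:
    text: str
    choices: list

    def is_valid(self, answer):
        # Exactly one selection, drawn from the predefined choices.
        return answer in self.choices

@dataclass
class CheckboxQuestion:
    text: str
    choices: list

    def is_valid(self, answers):
        # Any non-empty subset of the choices ("all that apply").
        return len(answers) > 0 and set(answers) <= set(self.choices)

@dataclass
class OpenEndedQuestion:
    text: str

    def is_valid(self, answer):
        # Free text: any non-empty response counts.
        return bool(answer.strip())

@dataclass
class RatingQuestion:
    text: str
    scale: int = 5                              # number of points (granularity)
    labels: dict = field(default_factory=dict)  # endpoint labels, at minimum

rating = RatingQuestion(
    "How satisfied are you with the selection of neckties?",
    scale=5,
    labels={1: "Not satisfied", 5: "Very satisfied"},
)

def rating_is_valid(q, answer):
    # A rating must land on one of the scale's points.
    return isinstance(answer, int) and 1 <= answer <= q.scale

print(rating_is_valid(rating, 4))  # True
print(rating_is_valid(rating, 7))  # False: outside the 5-point scale
```

Keeping each question’s validation next to its definition makes it easy to enforce the “right question type for the data” rule programmatically.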

  • For radio-button and checkbox questions, streamline the number of answer choices. Avoid offering too many or too few choices. A good rule of thumb is to prepare a list of the 6 to 10 most popular choices with an “Other” and a “None of these” option. (It is a good idea to allow respondents to write in an open-ended response if they choose “Other.”) There are also occasions when it is appropriate to include a “Prefer not to answer” choice, when content may be more personal in nature. The bottom line is NOT to leave your participants hanging on questions because they don’t have the knowledge or experience with the choices offered, or are just unsure how they want to answer.
  • Write simple, concise questions. Don’t get long-winded. Remember, the goal is not to make your participants struggle, so keep wording friendly and conversational. For example, let’s say you own a men’s clothing boutique and you want to know where your visitors shop for neckties. Do not use industry terms, or wording that you wouldn’t use in everyday conversation.

  • But don’t compromise clarity. Here’s an example. If you are building a survey to find out about the effectiveness of website navigation, you may want to learn more about the search feature. If that is the case, you may be inclined to simply ask participants how useful the “search” was.

But here’s where it begs clarity: many will misconstrue the term “search.” Sure, they “searched” for a product; they browsed around and navigated from one area to another in search of the right necktie. But what you really want to know is how useful the keyword search feature was.

  • Avoid two-faced questions. Be sure your questions don’t require more than one answer. For example, if you are asking participants how often they shop for ties and belts, they may not be able to answer, since they probably shop for one item more often than the other.

This is easy enough to correct: just include more than one question if there is more than one possible answer.

  • Avoid answer-choice overlap. Be sure choices don’t conflict with one another. This is a fairly common oversight, occurring more often than you might think. A classic example is a set of numeric ranges that share endpoints, such as “1–5” and “5–10”: a respondent whose answer is 5 fits both choices.
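For numeric answer choices, overlap can even be caught mechanically: two ranges conflict exactly when the intervals intersect. This is an illustrative sketch, not part of any survey platform:

```python
# Hypothetical helper: flag overlapping numeric answer ranges, the slip in
# choices like "1-5 employees" / "5-10 employees" (which bucket gets 5?).

def overlapping_choices(ranges):
    """Return pairs of (low, high) ranges that share at least one value."""
    clashes = []
    for i, (lo1, hi1) in enumerate(ranges):
        for lo2, hi2 in ranges[i + 1:]:
            if lo1 <= hi2 and lo2 <= hi1:  # the two intervals intersect
                clashes.append(((lo1, hi1), (lo2, hi2)))
    return clashes

print(overlapping_choices([(1, 5), (5, 10), (11, 20)]))
# [((1, 5), (5, 10))] -- 5 appears in two choices
print(overlapping_choices([(1, 5), (6, 10), (11, 20)]))
# [] -- mutually exclusive choices
```

Running a check like this on every range-based question is a cheap way to guarantee choices stay mutually exclusive.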
  • Last, but certainly not least, DON’T fail to proof carefully. Spelling and grammatical errors present an unprofessional image, so dedicate ample time and resources to proofing and validating all content. A short checklist of online survey proofing procedures may help:
    1. Verify you’ve included the right questions to fulfill objectives
    2. Check for and eliminate question redundancy
    3. Always run a spell check
    4. Read the questions aloud when proofing
    5. Check skip and branching logic for the appropriate actions
    6. If possible, ask someone who has not been involved in preparing the survey to take the survey
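Some of these checks can be partially automated. As a sketch of step 2 (eliminating question redundancy), Python’s standard-library difflib can flag near-duplicate wording; the threshold and function name here are illustrative, and a human read-through is still essential:

```python
from difflib import SequenceMatcher

def redundant_pairs(questions, threshold=0.9):
    """Return pairs of questions whose texts are suspiciously similar."""
    pairs = []
    for i, q1 in enumerate(questions):
        for q2 in questions[i + 1:]:
            # Ratio of 1.0 means identical text (case-insensitive here).
            ratio = SequenceMatcher(None, q1.lower(), q2.lower()).ratio()
            if ratio >= threshold:
                pairs.append((q1, q2))
    return pairs

survey = [
    "How often do you shop for neckties?",
    "Where do you usually shop for neckties?",
    "How often do you shop for neck ties?",  # near-duplicate of question 1
]
for a, b in redundant_pairs(survey):
    print("Possible redundancy:", a, "/", b)
```

A pass like this catches copy-paste slips before step 6’s fresh-eyes review, which remains the best defense against subtler redundancy.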

Considering these basic best practices when designing and constructing your online survey will facilitate good response rates and help ensure you don’t compromise data integrity.

Hillori Hager, Online User Experience Project Manager, Usability Sciences

 
