Conducting Surveys

Institutional Research and Planning (IRP) has developed this guide to help you plan and create a Web-based survey that will provide you with useful and reliable results. The guide contains tips drawn from our survey experience and will help you design an effective survey, prepare your participant list, write convincing email invitations and make the most of the resources available to you.

Clarify your survey’s objectives

The challenge is to keep the questionnaire short by including only the questions that will actually help you make decisions. The length of your questionnaire can significantly affect your response rate and the proportion of incomplete surveys, and a low response rate or a high rate of incomplete questionnaires increases the risk of biased results. According to some researchers, a questionnaire should take no more than 10 minutes to complete (Crawford, Couper and Lamias, 2001), while others set the limit at 20 minutes (Gunn, 2002).

Draft your questionnaire

First, draft your questionnaire in a Word document. This will give you a chance to reflect on the content and structure before getting into the more technical aspects of creating the survey.

Your questions should be formulated in simple and easy-to-understand terms. If you are unsure whether participants will understand your questions correctly, do not hesitate to try them out on a few people in your survey’s target group. 

It is also important to ensure a high quality of language, in both official languages. Grammatical errors tend to undermine the survey’s credibility and the University’s reputation in general. If necessary, you may contact the University’s Language Services to find out what resources are available and the cost for various services.

Here are a few practical suggestions for drafting your questionnaire:

  • Begin with short, easy‑to‑answer questions. When possible, place the open‑ended and more sensitive questions at the end of the survey (e.g., questions about income), to avoid discouraging respondents. 

  • Avoid using too many open‑ended questions. They can provide a wealth of information, but they demand considerable effort from respondents, produce results that are difficult to quantify and take a long time to analyze.

  • Plan to use skip patterns so that respondents do not see questions that do not relate to them. For example, if you ask them if they are employed, make sure that only people who answer “yes” see the questions related to being employed.  

  • Make sure that the response options for multiple‑choice questions are mutually exclusive; in other words, there should be no overlap between the different options.

  • For greater clarity, divide your questionnaire into different sections, giving each a heading that reflects the themes covered. 

  • Consider other possible data sources in order to reduce the number of questions.

You are in the best position to judge the usefulness and relevance of your questions. However, IRP can review your questionnaire and send you its comments. If you are interested in this service, contact us at rechinst@uottawa.ca or at 613-562-5954.

Define your target population and prepare your participant list

Your target population is the group of individuals on campus for whom your survey is intended. It could be undergraduate students, graduate students, faculty members, support staff, or a combination of these groups. The group selected could also be more restricted depending on your survey goals (e.g., first‑year students only, students currently registered in a particular faculty, etc.).

Given the observed decline in response rates because of survey fatigue, we strongly advise that you use a sample rather than invite all of the target population to participate, especially if the target population is students. This way, you will be helping to reduce solicitation on campus, which is in everyone’s interest. In any case, asking the entire population to respond will not improve your response rate, because the rate is based on the number of people invited to participate (for example, 400 completed questionnaires out of 2,000 invitations and 1,000 out of 5,000 both represent a 20% response rate). However, it may be appropriate to invite all members of a particular group to participate if the group is small (e.g., first‑year students enrolled in a specific faculty).

Your sample needs to be representative of the target population if you want to be able to generalize the results. This means that the characteristics likely to influence the results—such as gender, language, status (full‑time, part‑time)—must reflect those of the target population (e.g., 60% females, 40% males). 
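
To illustrate this step, here is a minimal sketch of how a proportionally stratified sample could be drawn from a participant list using Python and the pandas library. The file name (participants.csv), the column names and the sample size are hypothetical examples, not a prescribed format; your own list and survey tool may handle this differently.

    # Minimal sketch (hypothetical file, columns and sample size): draw a sample
    # whose gender, language and status mix mirrors the target population.
    import pandas as pd

    population = pd.read_csv("participants.csv")   # hypothetical participant list
    SAMPLE_SIZE = 2000                             # illustrative sample size

    # Sampling the same fraction within each stratum keeps the sample's
    # composition (e.g., 60% female, 40% male) close to the population's.
    fraction = SAMPLE_SIZE / len(population)
    sample = (
        population
        .groupby(["gender", "language", "status"], group_keys=False)
        .sample(frac=fraction, random_state=42)    # fixed seed for reproducibility
    )

    sample.to_csv("sample.csv", index=False)
    print(f"Drew {len(sample)} of {len(population)} participants")

Because the same fraction is drawn from every stratum, the sample’s breakdown by gender, language and status stays close to the population’s, which you can verify before preparing your invitations.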

Establish your timeline

The date you choose to launch your Web survey is also likely to significantly affect your response rate. Needless to say, you should avoid the end of an academic session and exam periods. We ask that you consult the Institutional Research and Planning (IRP) Survey Schedule when choosing your dates. To avoid any unpleasant surprises, make sure you allow enough time to prepare your material, obtain the necessary translations and test the Web‑based survey and the mechanism for sending your invitations.

You should also decide how many invitation messages will be sent. Reminders sent at intervals of three to seven days are generally effective in increasing the response rate. The general norm is to send no more than three participation messages in total. It is important to send reminders only to those people who have not yet completed the questionnaire.
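
Most survey tools filter reminder lists automatically; if yours does not, the following minimal sketch shows one way to do it in Python, assuming two hypothetical CSV files (invited.csv and completed.csv) that share an email column.

    # Minimal sketch (hypothetical file and column names): keep only the
    # invitees who have not yet completed the questionnaire.
    import pandas as pd

    invited = pd.read_csv("invited.csv")       # everyone who received an invitation
    completed = pd.read_csv("completed.csv")   # everyone who has already responded

    reminders = invited[~invited["email"].isin(completed["email"])]
    reminders.to_csv("reminder_list.csv", index=False)
    print(f"{len(reminders)} reminders to send out of {len(invited)} invitations")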

Draft your invitation

Your invitation‑to‑participate messages need to be short and convincing. Briefly explain the purpose of your survey, how to participate, what incentives are offered (if any), the deadline for responding and who the contact person is for any questions. As with the questionnaire, check the message’s language quality and allow time for editing and translation. If necessary, contact the University’s Language Services to find out what resources are available and the cost for various services.

The invitation should also contain a URL where participants can access your questionnaire, along with an identification number or password that they must use. If possible, make it easier for participants by using personalized URLs so they can access the questionnaire directly: a unique identification number for each participant is inserted in the URL included in the invitation. This avoids problems arising from the incorrect copying of an identification number or password.
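
Many survey tools generate these personalized links for you. Purely as an illustration, the sketch below shows how unique tokens could be appended to a survey address in preparation for a mail merge; the base URL, file names and column names are assumptions, not actual University of Ottawa addresses.

    # Minimal sketch (hypothetical URL and file names): give each participant
    # a unique token and a personal link for the invitation mail merge.
    import secrets
    import pandas as pd

    BASE_URL = "https://surveys.example.ca/my-survey"    # placeholder address

    participants = pd.read_csv("sample.csv")             # hypothetical participant list
    participants["token"] = [secrets.token_urlsafe(8) for _ in range(len(participants))]
    participants["personal_url"] = BASE_URL + "?id=" + participants["token"]

    # Keep this file confidential: it links each person to their unique token.
    participants.to_csv("invitation_list.csv", index=False)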

Be careful about what you use as the subject line of your invitation. If you use expressions like “win a prize,” “help us” or “take part in a draw,” your message could be blocked by the anti‑spam filters. We suggest that you run some tests before launching the survey. If emails are not sent from an internal University of Ottawa workstation, we also strongly recommend notifying the Service Desk at 613-562-5800 extension 6555 a few days before sending the invitations. Please ensure that the ticket opened for your request is assigned to the “INF-Collaboration” team.

Some survey programs allow you to insert in your invitation message a link that potential participants can click on to remove their name from your participant list (“OptOut” function). This will prevent reminder notices from being sent to people who are not interested in participating, thus avoiding frustration.

Here are a few messaging elements that are likely to encourage participation:

  • Signature: Response rates are generally higher when the people invited to participate are familiar with the person signing the invitations. It could be an individual or service.
  • Length: Indicating how long the questionnaire is likely to take has a positive impact if the questionnaire is relatively short. However, both ethically and strategically, it is not recommended to deliberately underestimate the response time required.
  • Confidentiality statement: Reassure participants by indicating that responses will be kept confidential (this implies that no identifying information will be disclosed with the participant’s responses). However, avoid raising concerns by using an overly detailed confidentiality statement.
  • Draw: It is quite common to offer incentives in the form of a random draw as a means of thanking participants and increasing response rates. A number of studies have examined the effectiveness of this strategy. The findings are generally disappointing (Porter and Whitcomb, 2004; Warriner et al., 1996). Furthermore, the relationship between the value of incentives and the response rate is not linear; in other words, higher values do not necessarily produce higher response rates. Consequently, any prizes offered should be selected on the basis of their potential appeal to the target population and your budget.
  • Benefits: You can also encourage your potential participants to respond by explaining why their involvement in the survey is important, what the benefits will be to them, or how the results will be used. 
  • Personal greetings: There are a number of software programs that allow you to use personalized greeting formulas (e.g., “hello [first name]”). Given the vast volume of personalized spam messages, the effectiveness of this measure in increasing response rates is somewhat uncertain, but you have nothing to lose by using it.

Create your Web-based survey

Select the software or supplier

The FluidSurveys tool is available to University of Ottawa staff members (academic and support staff). For more details, contact the Service Desk at extension 6555.

You may also decide to use different software or an external supplier, and assume the related costs. If you decide to use an external supplier, make sure that you have the supplier sign a confidentiality agreement before forwarding any files containing personal information, such as names and email addresses. IRP can provide you with a confidentiality agreement template; update it by replacing the text in yellow with the information specific to your survey initiative. After updating the agreement, please send it to the Office of the Vice-President, Governance at vr.gouvernance@uottawa.ca to have it signed by the Vice-President and the President. Only the Vice-President, Governance and the President are authorized to sign legal contracts in the name of the University.

Tips on creating your Web survey

Here are some tips on creating your Web‑based survey that will make it easier for participants to respond and thus increase your chances of receiving a greater number of completed questionnaires:

  • Restrict access to your survey to people on your participant list. The goal is to obtain a good response rate from a representative sample, not to collect the largest possible number of responses.

  • It is also recommended that participants not be required to answer one question before moving on to the next one (Umbach, 2004). This could mean that you will have some missing information but forcing participants to answer will not solve the problem because they may be just as likely to close the questionnaire. From an ethical perspective, participation should be voluntary, which means participants have the right not to answer all questions or to leave the survey when they wish. 

  • Use a simple, easy‑to‑read format similar to a paper questionnaire, without too many colours or images.

  • Provide clear, detailed instructions that make navigation easy (e.g., how to move to the next page, how to submit the questionnaire, etc.) and explain how to answer the questions (e.g., “Choose only the most important reason” vs. “Choose all applicable reasons”). 

  • Wherever possible, participants should be able to see the next question or to access navigation buttons without having to scroll down the page.

  • Avoid drop‑down menus and use multiple-choice questions instead; this way all respondents see all the options available without first having to click on an arrow. 

  • To keep respondents motivated, add a progress bar to your questionnaire. However, this could have a negative impact if the questions requiring the greatest response time (open‑ended questions, matrices with many statements, etc.) are placed at the beginning of the survey.

  • Run some tests to see how your questionnaire looks on different operating systems, screens or browsers. Trial runs are also useful to ensure that any skip patterns in the survey are working.

References

Crawford, Scott D., Mick P. Couper and Mark J. Lamias (2001), “Web Surveys: Perceptions of Burden”, Social Science Computer Review, Vol. 19, No. 2.

Gunn, Holly (2002), “Web-Based Surveys: Changing the Survey Process”, First Monday, Vol. 7, No. 12.

Porter, Stephen R. and Michael E. Whitcomb (2004), “Understanding the Effect of Prizes on Response Rates”, New Directions for Institutional Research, Vol. 121.

Umbach, Paul D. (2004), “Web Surveys: Best Practices”, New Directions for Institutional Research, Vol. 121.

Warriner, Keith, John Goyder, Heidi Gjertsen, Paula Hohner and Kathleen McSpurren (1996), “Charities, No; Lotteries, No; Cash, Yes: Main Effects and Interactions in a Canadian Incentives Experiment”, Public Opinion Quarterly, Vol. 60.

 
