Survey Best Practices
Design Filtering is a value-added benefit of using TheWGroup.ca to design your community-based and internal surveys. Design Filtering incorporates a number of key functions, including but not limited to the design best practices outlined below.
Design Best Practices
Let’s focus on Political Sensitivities. This may be the single greatest challenge in municipal survey research. It is a multi-faceted challenge that includes everything from the:
timing of the survey’s release,
duration the survey is open,
type, nature and wording of the questions,
control used for multiple submissions,
analysis completed on the collected data,
personal information collected (if any is collected as part of the process; e.g. a postal code is deemed personally identifiable information), and
manner in which the survey results are presented/released.
1. Timing of the Survey’s Release – This may influence who sees the survey first and how the survey is perceived in the community, especially if the issue is contentious. Releasing the survey first thing in the morning, the day before the local newspaper goes to press, may create extra attention for the survey, but will that attention be positive or negative? Who or what groups will be interviewed, and what position will they take? You may gain attention but garner biased respondents, depending on how the media has interpreted and reported on the survey.
2. Duration the Survey is Open – Is there a possibility that a related story or news item will break while your survey is in the field, causing the results to be skewed one way or another, or called into question?
3. Type, Nature and Wording of the Questions – Careful consideration needs to be given to each question and to how the questions flow together in the survey, as you want to prevent anyone from claiming that the questions are leading or are nudging people to respond in a particular way. (Note on language – this is covered in another section; however, it is important to point out here that the use of language unfamiliar to residents will tend to discourage their participation and alienate them from the process.)
4. Control Used for Multiple Submissions – While we may think that surveys are boring and wonder who would take the same survey twice, it is something we must consider and plan for, especially if there is a controversial topic and the likelihood of a volume of residents for or against the topic being studied. This can be either organized (i.e. cycling groups versus local residents) or merely organic. The best way to protect against this is by using a Community Engagement Research Panel as the control for each response – only one per member per survey. Open community surveys are different: you can restrict submissions to one per IP address; however, this causes problems for people using shared computers (libraries, recreation centres, homes, offices, and even your own staff using iPads to collect responses), and of course there is the issue of tracking an IP address under the Freedom of Information and Protection of Privacy Act legislation in many Canadian jurisdictions. There are some workarounds, though they are specific to each survey.
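The panel-based control described above – one response per member per survey – can be sketched in a few lines. This is an illustrative assumption only; the names (accept_response, the in-memory seen dictionary) are hypothetical and do not refer to any particular survey platform:

```python
def accept_response(seen, member_id, survey_id, answers):
    """Accept a response only if this panel member has not already
    submitted for this survey. Returns True if the response is stored."""
    key = (member_id, survey_id)
    if key in seen:
        return False  # duplicate submission rejected
    seen[key] = answers
    return True

seen = {}  # in a real system this would be persistent storage
print(accept_response(seen, "member-42", "parks-2024", {"q1": "yes"}))  # True
print(accept_response(seen, "member-42", "parks-2024", {"q1": "no"}))   # False (duplicate)
print(accept_response(seen, "member-7", "parks-2024", {"q1": "no"}))    # True (different member)
```

Keying on a stable member ID rather than an IP address sidesteps the shared-computer and IP-tracking concerns noted above, since the panel membership itself is the control.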
5. Analysis Completed on the Collected Data – How is the data cleaned? Are any responses removed or omitted? If so, based on what – partial completes, lack of response to certain questions, etc.? The survey data analysis should be able to withstand any scrutiny placed on it.
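One way to make cleaning decisions defensible is to apply a single documented rule and account for every removed response. The sketch below is a hedged illustration, assuming a completion-rate threshold; the 50% cutoff and field names are assumptions, not a prescribed standard:

```python
COMPLETION_THRESHOLD = 0.5  # assumed rule: keep responses answering >= 50% of questions

def clean(responses, total_questions, threshold=COMPLETION_THRESHOLD):
    """Split responses into (kept, removed) so every omission is
    accounted for and the rule can be reported alongside the results."""
    kept, removed = [], []
    for r in responses:
        answered = sum(1 for v in r.values() if v is not None)
        if answered / total_questions >= threshold:
            kept.append(r)
        else:
            removed.append(r)  # retained for the audit trail, not the analysis
    return kept, removed

responses = [
    {"q1": "yes", "q2": "no", "q3": "maybe"},   # complete
    {"q1": "yes", "q2": None, "q3": None},      # partial complete
]
kept, removed = clean(responses, total_questions=3)
print(len(kept), len(removed))  # 1 1
```

Reporting both counts, and the threshold itself, lets the analysis pass the scrutiny described above.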
6. Personal Information Collected – Often it is an innocent inclusion, asking for the respondent’s address or name (much as has been the practice at community open houses for years – the sign-in sheet asking for first and last name, address, email address, phone number and a signature). All of this practice goes against the Freedom of Information and Protection of Privacy Act unless there is clear notification language for its collection and a reason why it needs to be collected. Whether your municipality routinely completes Privacy Impact Assessments for surveys or not, the impact of asking for a respondent’s name and/or address in any survey is significant and can cause great concern for the respondent, and possibly for you, if called on it. By nature, most surveys are conducted anonymously; it is the common practice most of us expect. So, when we flip to a page where we are asked to add our name, many of us automatically think about what personal opinions we shared and how, when those opinions are tied to our name, they might somehow be used against us. While it may be less impactful, asking for some demographics may also create the same feeling. This can be overcome with a proper introduction to demographic questions and the use of a FIPPA statement (where necessary).
7. Manner in which the Survey Results are Presented/Released – If the survey forms part of a Corporate Report headed for Council ratification, then the timing of the survey results’ release is tricky. Does a municipality release the results when they are prepared, or wait until they are presented in support of the Corporate Report? There are dangers in both choices: release early, and a group opposed to the project may misinterpret or miscommunicate a portion of the survey results and use them against the project’s successful conclusion; or wait until they are presented to Council, and possibly have the same groups argue that the municipality is hiding the survey findings. Keep in mind that not everyone possesses the ability to fairly and accurately interpret survey results.
Post Script: Ultimately, a community’s public trust is strengthened or weakened with every interaction it has with its residents and the community as a whole. Even though reaching out to conduct public consultation is, at its core, a very open, democratic process, it can, if not done with all due care and caution, undo public trust.