Who would take the time to complete a survey online?
How do you know what people say in an online survey is true?
Not everyone has a computer, so how representative can an online survey be?
Barely a decade ago, online surveys were a new tool. Today, more than half of all surveys of the public are done online instead of by telephone. There are tradeoffs either way, and both methods require vigilance in design and execution. We often prefer the online approach for its speed, cost, and flexibility, but it is not without problems. Being able to answer the questions above is important.
Our research industry association—Council of American Survey Research Organizations (CASRO)—focuses on these issues. The CASRO Online Research conference in March included an interesting discussion session with online panel respondents: “real” people sharing their thoughts and perceptions from the other side of the table—or the computer/mobile device.
Here are a few of the ideas that the online respondents shared:
- People participate in online surveys because they actually like giving their opinions
- People dislike long, boring question and answer sets, so it’s best to keep questions interesting and relatively short
- People prefer questionnaires that are easy to navigate online and intuitive to understand
- When answering a multiple-choice question, people find it annoying when the answer set does not include the option they want, so lists should be comprehensive but not exhausting (a delicate balance)
As research practitioners, we want to be comprehensive in our questioning approaches. The panel of survey respondents reminds us to consider the human perspective and avoid creating surveys that some might call tedious rather than thorough.
This is a reminder of a critical requirement in any survey, regardless of data collection method: strike a balance between an interesting study (for the respondent) and a valid set of responses (for us).