Conducting effective online research with today’s consumers requires surveys that respondents can take when they want, where they want, and on their device of choice. This means your online surveys must work well on smartphones. But how do you go about this effectively?
Best practices for conducting online surveys:
- Design mobile first
- Keep the survey short: 12 minutes or less
- Limit the length of list options
- Reduce repetition
- Reconsider the need for grids and scales
- Ask meaningful questions
- Use clear and concise language
- Leverage open-ended questions wisely
- Use responsive survey designs
- Make it fun!
- Critically test the survey
1. Design mobile first
To access a representative audience online, it’s important to make your surveys accessible to people where they are – and that’s often on smartphones. Designing for mobile devices is about more than fitting a question on the screen and ensuring overall compatibility; it’s about making the entire survey experience enjoyable on a mobile device, so you receive the best possible results.
The best practices laid out below account for length, language, complexity and more – all of which can impact the rate at which people complete surveys and the quality of the data you receive back.
2. Keep the survey short: 12 minutes or less
Keeping your survey short can reduce survey dropouts. While smartphone and PC respondents alike might take surveys at home or in their office, a smartphone user’s attention is likely more limited than a PC user’s. They may be interrupted by notifications from other apps, a TV programme starting, or being next in line at the checkout if they are out shopping. Whatever the reason, it is important to get respondents through the survey before they run out of time or patience.
There isn’t an exact time limit, but we recommend keeping all surveys, especially those running on smartphones, under 12 minutes – ideally below 10. As you will see throughout our recommendations, survey length has a large impact on keeping respondents engaged.
3. Limit the length of list options
Scrolling on a small screen can make it challenging for respondents to focus and follow. To avoid this, reduce the number of items shown in any list. This can be done by splitting them into logical groupings or only expanding an option list if the overarching group is selected.
Consider the purpose of your research and only include items that are relevant to your objectives. Perhaps group unlikely, low-incidence choices together into a single item.
Many survey questions include an “other, please specify” option. These can be useful, but only offer the option if you are going to use the data collected from the open responses.
4. Reduce repetition
Having to repeat yourself in a conversation is never fun. And the same goes for surveys. Many surveys contain banks of agreement or approval scales, which can be particularly tiresome for respondents and can lead to speeding, straight-lining or random pattern answering.
This will be compounded if a respondent doesn’t have much experience with the subject (see point 6 below) or if some statements seem to overlap or duplicate each other. For example, asking a respondent if they consider a brand to be high quality, then asking if they feel it is a premium brand, can be seen as overlap and be frustrating for respondents.
Try to place limits on the number of iterations of a repetitive bank of statements and remove anything that is either not important to the research or is duplicated by other statements asked. A limit of 12 statements is a good target to aim for and will reduce noise in the data.
5. Reconsider the need for grids and scales
If you’re using grid and scale question types, think again. You might find that many questions don’t need to be asked via a scale at all. If during analysis you are only reporting on the top-two or bottom-two options, then consider asking the bank of questions as a pair of multi-code questions.
For example, ask a respondent first what they like about the brand, then what they do not like. Another example is replacing a traditional 11-point scale question on the likelihood of recommending a brand with a much simpler yes/no single-choice question: “would you recommend this brand to someone else?”
Removing scales and replacing them with binary choices can greatly reduce the length and repetition of a survey, and can also neutralise differences in how respondents interpret scales. For example, the experience one respondent rates 7/10, another might rate anywhere from 6/10 to 9/10.
6. Ask meaningful questions
Only ask questions that respondents are qualified to answer. For example, detailed questions routed on brand awareness alone may include several that the respondent cannot answer. Being aware of a brand does not mean they have had any exposure to that brand’s customer service or delivery options, and being asked questions on these topics may stall respondents, cause them to drop out, or lead them to answer without relevant experience – all of which will impact your final dataset.
Consider whether a familiarity or usage question could be used to filter the follow-up questions, or whether the question is necessary to include at all.
7. Use clear and concise language
Consider who is taking your survey and how best to speak to them. We recommend simplifying the language and avoiding research jargon as fundamental to eliciting engaged feedback. Keeping question text short and concise is also important, especially on small-screen devices, as shortening the text will reduce scrolling and improve your data.
It’s also important to ensure respondents can navigate through questions with ease. Add instructional text only when it’s really needed to make a task clear.
8. Leverage open-ended questions wisely
Although open-ended questions can be used to provide valuable insights, these questions are more difficult to answer on smartphones due to the small screen size and on-screen keyboards. Ask such questions only as necessary and avoid using them consecutively, as they can lead to frustration and prompt respondents to drop out of the survey.
9. Use responsive survey designs
Use scripting tools that will intelligently adapt the layout of content based on the type of question being presented and the respondent’s screen size and device orientation. Intelligent layouts will not simply shrink a question designed for a laptop to fit on a smaller device – when viewed on a smartphone, intelligent tools will respond to the device and screen orientation to provide respondents with the best possible layout. This creates a better and easier experience for respondents – one where they can be considered in their responses rather than focusing on making sense of a question or locating the “next” button.
10. Make it fun!
Ask questions in a way that connects with respondents and makes them feel at ease to answer your survey openly. Explain the purpose of the research and thank them for their consideration and involvement, especially on serious subject matter. Utilise light-hearted or humorous language, images and even memes when appropriate. These can get respondents thinking and drive engaged, honest responses.
Compelling tasks are also a valuable tool. Mini personality tests or games are not only a fun format to complete, they are also a great way to encourage respondents to give detailed and meaningful responses. You’re collecting data, but respondents also benefit by learning a little about themselves and others.
11. Critically test the survey
Finally, take the survey yourself. It’s a simple recommendation but extremely effective. When proof-reading or testing a survey link, put yourself in the shoes of the respondent. Ask yourself the following: Do all the questions make sense? Have you avoided being repetitive or boring? Are you motivated to finish the survey and provide considered answers?
If the answer to any of the above questions is a “no”, then seriously consider what could be done to make the survey more enjoyable and encourage respondents to give better answers. Rewrite or remove questions or sections that don’t work.
Kantar is committed to working on best practice recommendations for device-agnostic survey designs, and we collaborate with groups across the industry, sharing research-on-research findings and data trends. We’ve pulled these tips together after thorough testing and years of experience conducting online research with clients. Although the focus is on engaging respondents using smartphones, broadly adopting these techniques will improve the effectiveness of your surveys and the quality of your data returns across all devices.
Keen to know more? Speak to our award-winning survey design team to learn how you can enhance your data collection from all devices. For more learning opportunities, try our Survey Design Training Modules.
Like these research tips?
Subscribe to receive a monthly research tip on online survey design, sampling, data integration and more - in addition to updates when new Survey Design Training Modules are released.