What steps do you take to generate quality data?

How do the Kantar Profiles Network and expert teams approach the key areas of survey data quality?
28 November 2019
Chris Stevens

Chief Quality Officer, Profiles Division


Today, users of research data and insights are regularly – and perhaps unwittingly – confronted by a trade-off between speed, quality and cost, while juggling the need to move faster on tighter budgets.

“Quality” covers a broad spectrum. It can be seen as an overarching philosophy guiding how agencies and suppliers approach the piece of work that needs to be done – but for some it is treated as a mere “hygiene” step, and sometimes the necessary steps, knowledge and transparency in the process are lacking. This in turn can affect the reliability of the data. At Kantar we are committed to delivering robust, actionable data that our clients can trust to inform their decisions.

We speak to Chris Stevens, Chief Quality Officer of Kantar’s Profiles Division, to understand how the Kantar Profiles Network and expert teams approach the key areas of quality.

Respondent validation and verification

Q: What steps are taken to check the panelists or respondents before they can enter a survey?

A: Kantar uses our high-quality proprietary research panel Lifepoints alongside trusted partner sources that have been thoroughly vetted and tested over time. The Kantar Profiles Network is built on over 20 years of experience and an Approved Supplier List (ASL), which requires partners to meet key quality requirements before becoming part of this tailored network.

Kantar checks the validity of respondents through a range of methods:

  1. Various processes actively check identity, location and that the respondent is human – e.g. IP validation and Captcha checks.
  2. Recent investment in AI machine learning logic to score and flag Lifepoints panelists before they participate in surveys.
  3. The Honesty Detector (Kantar proprietary), plus in-survey logic and speeding checks, detect poor respondent behaviour across respondent sources and within each study.
  4. Additional custom quality checks are sometimes implemented at a project level.
  5. The Approved Supplier List allows us to select from a range of sources, e.g. proprietary research panels, loyalty programs, affiliate networks.
  6. For verification, where available and optimal, we utilize third-party verification suppliers – such as Imperium’s Verity program in the US and equivalent providers in other countries – to check physical addresses, while the IP address helps us determine that the panelist or respondent resides in the country where the survey is being taken. An advantage of managing a panel is that continuous feedback from studies makes it much easier to determine whether panelists are authentic, which gives more confidence when running higher-risk surveys.
  7. Duplication between suppliers is managed using Imperium’s RelevantID solution.
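The pre-entry checks in the list above can be sketched in a few lines of code. This is a rough illustration only, assuming a simple device-fingerprint deduplication and IP-country match; the names (`Respondent`, `screen_respondents`) and logic are hypothetical and not Kantar’s actual tooling.

```python
# Illustrative pre-entry screening: reject respondents whose IP country
# does not match the survey country, and deduplicate by device fingerprint.
from dataclasses import dataclass


@dataclass
class Respondent:
    panelist_id: str
    device_fingerprint: str
    ip_country: str


def screen_respondents(respondents, survey_country):
    """Return (accepted, rejected) after geo and duplicate checks.

    rejected is a list of (respondent, reason) pairs.
    """
    seen_fingerprints = set()
    accepted, rejected = [], []
    for r in respondents:
        if r.ip_country != survey_country:
            rejected.append((r, "geo_mismatch"))
        elif r.device_fingerprint in seen_fingerprints:
            rejected.append((r, "duplicate_device"))
        else:
            seen_fingerprints.add(r.device_fingerprint)
            accepted.append(r)
    return accepted, rejected


pool = [
    Respondent("p1", "fp-a", "US"),
    Respondent("p2", "fp-a", "US"),  # same device as p1 -> duplicate
    Respondent("p3", "fp-b", "BR"),  # outside the survey country
]
accepted, rejected = screen_respondents(pool, "US")
```

In practice such checks run before the survey invite is honoured, so flagged panelists never reach the questionnaire.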

This multi-faceted, experience-driven approach has been developed to safeguard the survey process for our clients. Implementing these measures wherever available provides confidence in a set of research-ready respondents for future projects.

Representativity

Q: What are the considerations in reaching key demographics in a balanced way? For example, how do you ensure a national or category-purchase census distribution for projects when needed?

A: Kantar is open and honest with our clients about the ability of online sampling to produce a representative sample. Online panels do not always perfectly represent the national population; we know this, and so we apply local knowledge carefully to select the correct methodology for each case. When we deliver online research in countries where internet penetration is lower – which can also affect certain demographics, e.g. lower social classes in Brazil – we are clear about the limitations and work with our clients on an appropriate project plan.

This is one of the key points in the ‘speed, quality, cost’ trade-off, and Kantar collaborates openly to discuss the pros and cons of the approaches. For example, a 100% online project in Brazil may be right if the research objective is clearly set out and fits the needs, with the understanding that proportions of the population will be missed – but this might be balanced by the speed of fieldwork, the relative cost, or the option to append or work with other data sets.

Consistency is also imperative. Much of the work we manage, such as trackers, makes it essential to manage the panels, sources and surveys we work with to limit false trends in the data. Balancing a transition from offline to online requires careful management; our teams are well versed in this and available to help clients through the process, ensuring data is stable and consistent over time.

To that end, we frequently refresh our panels through continuous recruitment to help meet client needs, while maintaining the stability of trends in tracking studies.

Related to selecting the right methodology, online panels in some countries tend to under-represent certain audiences (rural, lower-income, older), and this must be accounted for in the upfront design. Likewise, Kantar can apply sample management at the balancing, click-through, and completed-quota levels to help each sample fully represent the target of interest.
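Quota-level sample management of the kind described above boils down to tracking completes against target cells so that under-represented groups keep fielding. A minimal sketch, assuming simple named quota cells (the cell names and `remaining_quota` helper are illustrative, not a real Kantar interface):

```python
# Illustrative quota management: compare completes so far against target
# cells; cells with remaining capacity stay open to new respondents.
from collections import Counter


def remaining_quota(targets, completes):
    """targets maps cell name -> target count; completes is a list of
    cell names, one per completed interview. Returns how many completes
    each cell still needs (0 means the cell is closed)."""
    done = Counter(completes)
    return {cell: max(0, target - done[cell]) for cell, target in targets.items()}


targets = {"urban_18_34": 200, "urban_55plus": 150, "rural_all": 100}
completes = ["urban_18_34"] * 200 + ["urban_55plus"] * 90 + ["rural_all"] * 30
open_cells = remaining_quota(targets, completes)
# urban_18_34 is closed; urban_55plus and rural_all are still fielding
```

A real system would apply the same idea further upstream too, e.g. at the invitation and click-through stages, so that slower-responding groups are over-invited early rather than chased at the end of fieldwork.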

Survey design

Q: Given the number of people online via mobile devices these days, what steps are in place to make sure the design of the questionnaire and survey is optimised?

A: Kantar recognises the shift in how consumers interact online and the importance of reaching today’s audience in the right way. A simple rule we fully advocate is short (≤15 minutes) and mobile-friendly surveys, so that all respondents can complete them on their device of choice. However, if the content is truly engaging, 20 minutes can be achieved with careful planning.

In recent years we have conducted extensive research-on-research and validation, which has resulted in the development of our QuestionArts Intelligent Components tools. Our expert teams and these survey design approaches have won 34 awards and are proven to lead to more actionable, reliable data.

Respondent engagement

Q: You mention engagement here, how do you ensure this and how does this impact the quality of respondent data?

A: Simply put, responsive design that optimizes the experience for each respondent on their device of choice is fundamental to engaging respondents for honest, thoughtful answers.

Kantar has a set of mobile-first criteria, validated against respondent engagement and dropout rates – this includes short surveys, a low number of list attributes requiring scrolling, a controlled number of open-ends and no traditional grid questions that won’t display on all devices.

This is then maximised when dynamic and thoughtful survey design is implemented. We advise on how to ask questions in a smart, effective, and engaging manner and which question formats to use. When this is applied the quality of data improves substantially, allowing respondents to share openly in an engaging, rewarding environment.

We have layers of checks in place and can build in custom checks to catch speeding, cheating and hyperactive respondents, and we can develop and test survey-specific quality checks.
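Two of the most common in-survey checks mentioned here – speeding and straight-lining – can be sketched as simple heuristics. The thresholds below (a third of the median duration, all-identical grid answers) are illustrative assumptions, not Kantar’s actual rules:

```python
# Illustrative in-survey quality checks: flag speeders (completion time far
# below the median) and straight-liners (identical answers down a grid).
import statistics


def flag_speeder(duration_s, all_durations_s, fraction=0.33):
    """Flag a complete whose duration is below a fraction of the median."""
    return duration_s < fraction * statistics.median(all_durations_s)


def flag_straightliner(grid_answers):
    """Flag a grid where every row received the same answer."""
    return len(grid_answers) > 1 and len(set(grid_answers)) == 1


durations = [600, 540, 720, 610, 90]  # seconds; one suspicious complete
speeders = [flag_speeder(d, durations) for d in durations]
# only the 90-second complete is flagged (median is 600s, threshold ~198s)

straightliner = flag_straightliner([3, 3, 3, 3, 3])  # True
```

In production such flags are usually combined into a composite score rather than used individually, so a single borderline signal does not disqualify an otherwise engaged respondent.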

Get in touch