What are response rates like with remote data collection?
Written by James B. Tidwell

Not everyone you contact will agree to take part in data collection. With remote data collection, the drop-off between those you wish to contact and those who actually complete a survey can happen for a number of reasons:

  1. Coverage bias: Some people may not have access to a mobile phone, so these people are excluded from the sample.

  2. Ineligibles: Some phone numbers may be disconnected and others may be turned off. Other potential respondents may need to travel away from their homes to get network access and so may be unavailable, or may be unwilling to view a message/answer a call.

  3. Excluded: Some people you reach may be excluded from data collection because they do not meet the required characteristics, for example, if you reach a single male in a study targeting adult females.

  4. Refusals: Some respondents may view a message/answer a call and then refuse to participate in the survey.

  5. Partial interviews: Some respondents may terminate the call intentionally or lose connection due to network connectivity or phone battery issues.

Those who do complete the survey are considered ‘complete interviews’. Note that for some modes of data collection, it may be difficult to tell the difference between some of these categories. For example, it may not be possible to separate ‘ineligibles’ who never saw a text message from ‘refusals’ who saw it and chose not to respond.

We can calculate the response rate as the number of complete interviews divided by everyone you attempted to contact who was not screened out of the study: that is, complete interviews divided by the sum of complete interviews, partial interviews, refusals, and ineligibles.
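
For example (using purely hypothetical numbers, not taken from any of the studies below): if you attempt 1,000 phone numbers and 350 are disconnected or never answered (ineligibles), 150 refuse, 100 start the survey but do not finish (partial interviews), and 400 complete it, the response rate would be 400 / (350 + 150 + 100 + 400) = 40%.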

This definition focuses on minimising refusals and on reducing ineligibles caused by challenges with making contact; it does not reflect coverage bias. Response rates will vary significantly based on the following factors:

  • The population being studied

  • The number of ways you have to contact a particular respondent

  • How recently the phone numbers were collected

  • The number of contact attempts you make and the times of day or days of the week when you try to reach the respondent.

Response rates are important not only to ensure that resources are used efficiently, but also because low response rates can bias the data.

Below we describe four examples of remote data collection via voice-based approaches using mobile phones in low- and middle-income countries, and the response rates in each setting. Note that exposing people to messages or conducting brief surveys via SMS may result in higher response rates than those described below, but the data may be more biased due to literacy issues:

  • Ghana - This was an IVR-based, random-digit dialling study conducted prior to the COVID-19 outbreak. The data collection team managed to get 81% of those reached via mobile phone to complete at least half of the survey. Reportedly, this was not much different from response rates during previous rounds of face-to-face data collection. However, the remote data collection did not reach a representative sample: these refusals, combined with the coverage bias described above, resulted in fewer women, rural residents, and older people responding.

  • Bangladesh - This study was about COVID-19 but used a pre-existing sample population contacted by live interviewers. A range of measures were put in place to encourage high response rates. For example, multiple household phone numbers were collected as well as the phone numbers for neighbours. Three attempts were made to call each number and calls were made at different times of day. Overall response rates of 80-85% were seen.

  • Kenya - This study was about COVID-19 and used an existing list of phone numbers collected by organisations working in the area, with data collection by live interviewers. Only one phone number was available per household and each number was called only once; this resulted in an overall response rate of 66%.

  • India - This was a random-digit dialling survey using live interviewers that took place before the COVID-19 pandemic. 22% of the people who answered the call agreed to participate and completed the survey. However, this figure does not include people who did not pick up or phone numbers that were inactive or disconnected, so the overall response rate would be lower.

To increase response rates as much as possible, consider the following techniques:

  • Budget in time to make multiple attempts to contact each respondent.

  • Try reaching out at different times of day, such as evenings or weekends, when people may be more likely to answer the phone or have time to respond to your survey.

  • Consider testing and refining your introduction, especially if you are reaching out via SMS, where messages have character limits.

  • Try to enhance credibility by mentioning which organisation is conducting the survey and make sure to explain how the respondent will be helping others if they agree to participate (e.g. helping the government or organisations to understand the needs of people like them).

  • Also mention how long the survey will take and that you respect and appreciate their time.

  • Tell the person that if they are busy, you are happy to try again at another convenient time.

  • Consider asking trusted community members, such as community leaders or health workers, to support the study and encourage people to respond to messages or calls. This may be important if community members do not trust the request for data.

See this guide by Qualtrics for general advice on increasing response rates to phone surveys. The World Bank also has a detailed blog post on this issue here: Mobile Phone Surveys (Part 2): Response, Quality, and Questions.

Want to learn more about remote data collection?

Editor's Note

Authors: Fiona Majorin, Julie Watson and James B. Tidwell
Reviewers: Lauren D’Mello-Guyett, Poonam Trivedi, Tracy Morse, Erica Wetzler, Michael Joseph, Holta Trandafili
Last update: 15.06.2020

