IBM’s e-business Innovation Centre in Toronto ran an online business-to-business survey between 2000 and 2001, and after tweaking a number of variables they managed to double their response rates in the second year of running the survey.
Here are some of the lessons they learnt:
- Survey length – Keep the survey to a minimum. In 2000 the survey had 37 questions. For 2001, we condensed it to 20–24 questions and greatly reduced the number of mouse clicks required to complete it.
- Be honest about how long the survey will really take – Our first draft survey would have taken much longer than our promised 10 minutes to complete. We did not want to annoy respondents (and so hurt our credibility and response rates), so we shortened the survey to ensure it took no more than the 10 minutes we promised.
- Provide value, value, value – In our experience, online survey response rates increase dramatically when participants gain value from responding. For the 2001 survey, we offered multiple, relevant incentives for responding: a copy of our final results, additional learnings on execution, and a contest component as an added incentive.
- Send the survey mid-week, during mid-afternoon – Most e-mail users start their Monday mornings clearing their mailboxes of non-corporate and personal e-mail. The likelihood of your e-mail being read increases when invitations go out mid-week, after 12pm. Other e-mail marketing tactics (e.g. testing the sender name and subject line) can also contribute to higher response rates.
- Use one reminder e-mail after the survey invitation – It is standard practice in online survey execution to send one reminder e-mail following the initial invitation. Our reminder e-mail generated 15% more responses.
- For our target market, fax invitations were not effective – Of 1,811 fax invitations, we received only 32 responses, a 1.8% response rate.
- Allow for some open-ended questions – Give customers the opportunity to provide open-ended answers instead of just selecting “other”. It can be disappointing at the close of a survey to discover a very high “other” response rate: it indicates there is an insight not captured by the options in the closed-question format. To prevent this, we added some open-ended questions in which respondents could articulate what their “other” answer meant. In report writing, these open-ended responses can be used to confirm or flesh out a finding.
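As a quick sanity check on the figures above, response rate is simply responses divided by invitations sent. A minimal sketch (the function name and numbers used here are illustrative, taken from the fax figures quoted above):

```python
def response_rate(responses, invitations):
    """Response rate as a percentage of invitations sent."""
    return 100.0 * responses / invitations

# Fax channel: 32 responses out of 1,811 invitations.
print(f"{response_rate(32, 1811):.1f}%")  # → 1.8%
```

The same calculation applied to any channel (e-mail, fax, reminder wave) lets you compare their effectiveness on a like-for-like basis.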