At GlobalGiving, we believe that collecting and using feedback from constituents is an important part of the development process. We have been working to discover tools and practices that make it easy for nonprofits to listen to those voices. One of the tools we’re exploring is the Net Promoter System.
You may have heard us talk about the Net Promoter System (NPS) before. You have probably even answered the NPS question before! The Net Promoter System is a tool developed by Bain & Company for the for-profit sector to measure how loyal a customer is to a company, and consequently serves as an indicator of future profit. The NPS score is based on a 0-10 scale response to the following question: How likely is it that you would recommend our company/product/service to a friend or colleague? This question divides customers into three categories: Promoters (who give a score of 9 or 10), Passives (who give a score of 7 or 8), and Detractors (who give a score of 0-6). The score is then determined by subtracting the percentage of Detractors from the percentage of Promoters. Passives are ignored in the equation.
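The calculation above is simple enough to express in a few lines. Here is a minimal sketch in Python (the function name and example scores are ours, not part of the official NPS methodology):

```python
def nps(scores):
    """Compute a Net Promoter Score from a list of 0-10 responses.

    Promoters score 9-10, Detractors score 0-6, and Passives (7-8)
    are ignored. The result is %Promoters - %Detractors, so it
    ranges from -100 to +100.
    """
    if not scores:
        raise ValueError("no responses to score")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# 5 Promoters, 3 Passives, 2 Detractors out of 10 responses:
# 50% - 20% = a score of 30
print(nps([10, 9, 9, 10, 9, 8, 7, 7, 5, 3]))  # 30.0
```

Note that Passives still count toward the total number of responses; they are only excluded from the numerator.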
The score is useful in two distinct ways: it can be compared to other companies within a specific industry (which gives you a basic ranking system), but it is more powerful when companies use the feedback from the common follow-up question (“Tell us more about why you chose this score”) to make changes that move more of their customers into the Promoter bucket.
The NPS has been used for over a decade in the for-profit sector, and is considered by many to be the single most useful question a company can ask its customers.
Similarly, we think this could be a powerful, inexpensive, and easy tool for nonprofits to use to understand how their ‘customers’ feel about the work they’re doing. We wanted to test this by asking the question via a short SMS survey to a group of constituents served by GlobalGiving nonprofit partners.
Designing the Pilot
In June 2014, we began exploring what a pilot experiment around NPS via an SMS survey would look like. We decided to use the FrontlineSMS platform to send out the surveys. We then needed to decide where and with whom it would make most sense to run this pilot. We considered several factors, such as familiarity with SMS surveys, mobile use among the targeted population, and whether potential GlobalGiving partners were already using digital communications with those they serve.
In October 2014, after running an application process with Philippines-based GlobalGiving nonprofit partners, we selected three organizations to help us run this pilot: Mercy in Action, HOST-NGO, and International Disaster Volunteers. One of the reasons we chose to run this pilot in the Philippines was that one of our staff members would be traveling there for site visits and could provide in-person support before the launch of the surveys.
What We Did
Mercy in Action, HOST, and IDV provided us with approximately 400 phone numbers from people who had recently taken part in their services. Each of these organizations also helped us translate the following survey questions into the local languages most appropriate for their constituents:
- Hello from [insert organization name]! Will you take a satisfaction survey on our services? All answers will remain anonymous. Please respond Yes or No.
- On a scale of 0-10, 0 NEVER and 10 ALWAYS, how likely are you to recommend us to your family and community?
- Thank you! Your response is anonymous. Please explain your answer.
- Thank you for your feedback!
We used the FrontlineSMS platform to send these SMS surveys. We chose to send the survey at 5pm local time, which generally garners the highest response rate for SMS surveys. We used a survey logic function that allowed respondents to opt out of the survey, or sent them the next question based on their answer.
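The branching logic behind that flow is straightforward: a “No” at the opt-in question ends the survey, and any other reply advances it. The sketch below is our own illustration of that logic, not the actual FrontlineSMS configuration:

```python
# The four survey messages, in order (placeholder text abbreviated).
QUESTIONS = [
    "Hello from [org]! Will you take a satisfaction survey on our "
    "services? All answers will remain anonymous. Please respond Yes or No.",
    "On a scale of 0-10, 0 NEVER and 10 ALWAYS, how likely are you to "
    "recommend us to your family and community?",
    "Thank you! Your response is anonymous. Please explain your answer.",
    "Thank you for your feedback!",
]

def next_message(step, reply):
    """Given the index of the question just answered and the reply,
    return the next outgoing message, or None if the survey is over."""
    # An opt-out (anything other than "Yes") at question 0 ends the survey.
    if step == 0 and reply.strip().lower() != "yes":
        return None
    if step + 1 < len(QUESTIONS):
        return QUESTIONS[step + 1]
    return None  # final thank-you has already been sent
```

In the real platform, each of these transitions was configured as a keyword-triggered rule rather than code, which is what made exact keyword matching so important (more on that below).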
How It Turned Out
While we were able to successfully send out the survey to nearly 400 recipients (people identified by our partner nonprofits in the Philippines), we had a much lower response rate than we hoped. We have a few hypotheses about why this might be:
- Due to technical issues getting the SMS platform settings correct, we were not able to send the survey out immediately. We think the delay between accessing services and receiving a survey on those services contributed to a lower response rate.
- We had not given our nonprofit partners good materials for informing their constituents about the surveys. While all three of the partners in this pilot already do a great job of collecting and responding to feedback from their constituents, we now better understand the importance of educating the recipient group when implementing a new kind of survey. In this case, the unfamiliarity of receiving an SMS survey may have deterred the constituents we contacted from responding.
What We Learned
1. Timeliness of the survey: it is important for the survey to be conducted in very close proximity to the last service or contact. Imagine yourself taking a satisfaction survey on your visit to a store or restaurant immediately after your visit, versus several months later. While you may remember key details from your visit even months later, the fresher the experience, the more accurate the data will be, and the more likely it is that you will take the time to answer the survey.
2. Platform: We used FrontlineSMS to run this survey, and it proved critical that we had developed a good relationship with their support team prior to, during, and after the survey. We ran into a few technical issues along the way, and FrontlineSMS was able to provide timely support as we juggled many moving pieces to launch the survey and collect responses. It turns out that developing and writing a complex SMS survey in several languages is hard!
3. Recipient familiarity with digital communications: We have learned that these types of SMS surveys will likely be most successful when added onto an existing framework of digital communications. This means that if an organization is already regularly sending SMS or email to their constituent group, then it’s best to pair an NPS survey with those regular communications. Otherwise, a more extensive information campaign to their constituents is needed before beginning a new survey. This makes sense: you are more likely to respond to a survey about a service if you know that it’s coming, and if you know how it will be used.
4. Platform flexibility: While we were overall very pleased with the FrontlineSMS platform, there are some inherent stumbling blocks in using an automated survey. In our case, respondents had to use a specific keyword in order to trigger the correct follow-up questions. Even though we prompted respondents with the phrase they needed to use (for example, saying “Please respond YES or NO”), misspellings, extra words, or use of a different response with a similar meaning were not recognized by the platform. For example, if the keyword for triggering the next question is “Yes”, phrases like “yeah”, “yea”, “sure”, “yess”, “ys”, or “yes I’d be happy to” would not be accepted. We found that in many cases, respondents tried participating in the survey, but were unable to do so because the platform didn’t recognize their response.
5. Speaking their language: We ran this survey in four languages – none of which are spoken by the GlobalGiving staff overseeing the pilot. We relied on the help of the participating nonprofit partners to translate the initial questions. However, we may have been better able to coach respondents through the survey (for example, if they were blocked at a specific question because they didn’t use the necessary keyword) if we spoke the language, or had a stronger plan in place for partnering with a translator to provide support.
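The keyword-matching problem in point 4 is a general one, and one way future tools could loosen it is by normalizing replies before matching. The sketch below is purely illustrative (the variant list and function are our own assumptions, not a FrontlineSMS feature):

```python
import re

# Affirmative variants we might accept, including the misspellings
# and synonyms we actually saw respondents attempt.
YES_VARIANTS = {"yes", "yess", "ys", "yeah", "yea", "sure"}

def matches_yes(reply):
    """Return True if a free-text reply plausibly means 'yes'.

    Looks only at the first word, lowercased and stripped of
    punctuation, so replies like "Yes, I'd be happy to" still match.
    """
    reply = reply.strip()
    if not reply:
        return False
    first_word = re.sub(r"[^a-z]", "", reply.lower().split()[0])
    return first_word in YES_VARIANTS

print(matches_yes("yes I'd be happy to"))  # True
print(matches_yes("Yeah!"))                # True
print(matches_yes("No thanks"))            # False
```

Even a simple normalization pass like this would have let many of the attempted responses through, though any such list would need to be built per language with the help of local partners.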
Our Next Steps
Overall, we found this pilot to be a very valuable learning experience. As mentioned at the beginning of this post, we are trying to discover and hone the best and most useful tools for collecting and responding to feedback. These tools have to be agile enough to be adapted to many situations, simple enough that our nonprofit partners can use them sustainably in even the most resource-constrained settings, and reliable enough that our partners feel comfortable making programmatic changes based on the results.
While there are still many questions surrounding the use of NPS in the nonprofit space, we think there is enormous potential in continuing to experiment with both the content of these surveys and the delivery method. Taking the lessons we learned from this pilot, we are now exploring how we might use different media (email, smartphone apps, etc.) in addition to SMS to ask the NPS question. We are looking for opportunities to add the NPS question onto existing digital communications systems, as a way for our nonprofit partners to expand the range of feedback they receive from their constituents. Stay tuned as we design phase two of this project.
Most of all, thank you to Mercy in Action, HOST-NGO, and International Disaster Volunteers for piloting this experiment with us. We applaud their commitment to listening to those they serve and their curiosity in discovering new ways to collect and analyze feedback from their constituents.
By: Sarah Hennessy