How DI Listens and Learns from Feedback

Posted on: September 13, 2018
By Chansocheata POUM, Former Innovation Program Assistant, Development Innovations

At Development Innovations (DI), we embrace a results- and learning-oriented work culture. Along the way, we have reshaped and refined our objectives and key activities to make sure we serve the right audiences in the specific areas where they need support. The project itself has improved because we learned from past experience, listened to feedback, and used it strategically to guide the design of new activities.

As the project finished its fifth year, DI wanted to learn what worked well and what did not, and to draw lessons from the DI story. In April 2018, DI engaged two consultants to conduct research to tell DI's story through a Results and Learning Assessment. They conducted interviews and focus groups with different DI partners and service providers and analyzed the results and trends. I helped these consultants conduct the assessment across Phnom Penh and learned how these assessments work in real life.

When I was first assigned to this project, I did not have any idea what it was. Looking back, I now see the bigger picture of the whole process, and I can see why it is important to give partners, other NGOs, companies, and individuals the opportunity to give feedback on their ideas and learnings.

The research project had three main parts: Preparation, Data Collection, and Data Analysis and Reporting.

I – Preparation

Plan For Your Objective: Thorough preparation helps the work get done effectively and efficiently. In this stage, we identified the desired outcome and made a strategic plan for how to achieve it. DI's team took two weeks to work on this prior to the arrival of the consultants. DI also shared relevant documents with the consultants, like evaluation reports, so they could become familiar with DI's work and the Cambodian context.
And, of course, the meeting arrangements: the DI team identified partners from different sectors it had worked with since the beginning of the project for in-depth interviews, and planned a schedule for the consultants to meet with them.

Involve Everyone: Preparation was challenging because the consultants were not familiar with the Cambodian context at first. The interview protocol needed to reflect the sectoral differences among partners, so project staff had to brief the consultants clearly on each partner and its context.

II – Data Collection

I observed four important elements of the process that drove the findings we needed:

Choose the right person to conduct the interview: To make sure we gathered unbiased data, we avoided sending members of the DI team to conduct the interviews. By sending the external consultants and me, a Young Innovator who had been with DI for only a few months, we relieved our interviewees of the pressure of responding to partners and colleagues, so we could get the most authentic answers from them.

Be flexible: A fixed interview script is a barrier to getting truthful answers from interviewees. Starting the conversation with introductions from both parties helps release the tension or pressure on interviewees. The consultants asked questions differently depending on the personality and engagement of the interviewee. One thing I found worked really well was letting the interviewee give their views on a broad topic first; we could then ask follow-up questions, referring back to things they had mentioned. In many cases, we came out of an interview with findings and insights that surprised us.

Be positive and open-minded: As interviewers, we had to keep in mind that interviewees' responses might not always be something we wanted to hear, but rather something we needed to hear in order to write a true story.
When gathering negative answers, or discussing failures, the consultants always tried to draw out the lesson learned from the interviewees. "We want to know what happened. Everything that happened, not just the successes. We also learn from what didn't work," said DI's Chief of Party, Kate Heuisler.

Seek clarity: Never assume or put words in people's mouths! If an answer was not clear, we asked again and again. Instead of asking "Do you think DI has contributed a lot to this great success?", ask "How did DI contribute to this success, from your perspective?" Sometimes I would jump in to help translate between the consultants and local partners. If things remained unclear, we would ask the interviewee to give specific examples or elaborate further. To ensure the quality of the data, we confirmed the information precisely before writing it down in the meeting notes.

III – Data Analysis and Reporting

Look for the common themes: After collecting the data, the consultants began the analysis by clustering the responses and identifying themes. They started by looking for patterns across the interviews and categorizing the interviewees by their engagement with DI. Grantee? Service provider? Beneficiary? They then traced what worked well, and what did not, across the different types of partners. From there we drew key results, factors for success, and lessons learned. It was an abstract process that required a lot of insight into, and understanding of, each initiative partners had worked on with DI.

Use the Research! Working on this process was a great opportunity for me to learn about DI's work from a different perspective: our partners'. More importantly, I was able to observe and learn from the experts who do this kind of work all the time: the consultants. In May 2018, when DI held our Strategic Team Building session, we got to see the findings from this study.
The key results and success factors we learned from this assessment are now at the center of how we design next year's program activities. Results, success factors, and lessons learned are at the core of DI's work, and these are practices DI intends to keep at the center of its design process in the future.