Meet Our #OpenForGood Award Winner: An Interview with Lee Alexander Risby, Head of Effective Philanthropy & Savi Mull, Senior Evaluation Manager, C&A Foundation


Lee Alexander Risby

This post is part of the Glasspockets #OpenForGood series, produced in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up what they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood. View more posts in the series.

C&A Foundation is a European foundation that supports programs and initiatives to transform fashion into a fair and sustainable industry that enables everyone – from farmer to factory worker – to thrive. In this interview, Lee Alexander Risby and Savi Mull share insights with Glasspockets’ Janet Camarena about how the foundation’s practices support learning and open knowledge.

Glasspockets: Congratulations on being one of our inaugural recipients of the #OpenForGood award! The award was designed to recognize those foundations that are working to advance the field by sharing what they are learning. Can you please share why you have prioritized knowledge sharing at the C&A Foundation and how this practice has helped you to advance your work?


Savi Mull

Savi Mull: For almost five years, C&A Foundation has been dedicated to transforming the fashion industry into a force for good. A large part of that work includes instilling transparency and accountability in supply chains across the industry. From the start, we also wanted to lead by example by being transparent and accountable as an organization, sharing what we were learning whilst on this journey, being true to our work and helping the rest of the industry learn from our successes and failures.

Lee Alexander Risby: Indeed, from the beginning, we made a commitment to be open about our results and lessons by publishing evaluations on our website and dashboards in our Annual Reports. After all, you cannot encourage the fashion industry to be transparent and accountable and not live by the same principles yourself. Importantly, our commitment to transparency has always been championed both by our Executive Director and our Board.

Savi: To do this, over the years we have put many processes in place.  For example, internally we use after-action reviews to gather lessons from our initiatives and allow our teams to discuss honestly what could have been done better in that program or partnership.  We also do third party, external evaluations of our initiatives, sharing the reports and lessons learned. This helps us and our partners to learn, and it informs initiatives and strategies going forward.

The Role of Evaluation Inside Foundations

GP: Your title has the word “evaluation” in it, and increasingly we are seeing foundations move toward this kind of staffing structure, with staff dedicated to evaluation and learning. For those foundations that are considering adding such a unit to their teams, what advice do you have about the structures needed to create a culture of learning across the organization and avoid creating one more silo?

SM: I believe it is essential to have this type of function in a foundation to drive formal learning from and within programs. But at the same time, it is an ongoing process that cannot be driven by one function alone. All staff need to be responsible for the learning that makes philanthropy effective – not just evaluators.

LAR: To begin, we were deliberate in building a team of evaluation professionals to promote accountable learning. We started hiring slowly and built the team over time. What I looked for with each new member of the team, and am always looking for, is an evaluator with more than just technical skills; they also need the influencing, listening, communication, and negotiating skills to help others learn. Evaluations have little effect without good internal and external communication.

“For us, it was important to be a critical friend, listener, and enabler of learning and not the police.”

The evaluation function itself has also evolved over the last five years. It started off as a monitoring, evaluation and learning (MEL) function and is now Effective Philanthropy. From the start, the function was not set up as an independent department but created to help programmatic teams design appropriate monitoring and evaluation for their programs, and to act as facilitators and advisors on strategy. However, it has not always been a straightforward process from the inside. In the first years, we had to spend a lot of time explaining and persuading staff of the need for evaluation, transparency and learning, and the benefits of doing so. We wanted to avoid a strong independent evaluation function, as that can reduce learning by placing too much emphasis on accountability. For us, it was important to be a critical friend, listener, and enabler of learning and not the police.

SM: So, the first bit of advice is that evaluators should be supportive listeners, assisting programmatic teams throughout the design and implementation phases to get the best results possible. They should not come in just at the end of an initiative to do an evaluation.

LAR: The second piece of advice is on the positioning, support, and structure of evaluation within a foundation. Firstly, it is critical to have the buy-in of the leadership and board for both evaluation and transparency. And secondly, the evaluation function must be part of the management team and report to the CEO or Executive Director. This gives reporting and learning the appropriate support structure and importance.

The third piece of advice is to consider not creating an evaluation function, but an effective philanthropy function. Evaluation is done for learning, and learning drives effectiveness in grant-making for better results and long-term impacts on systems.

SM: The final piece of advice is to take guidance from others outside your organization. The whole team has consulted broadly with former colleagues and mentors from across the evaluation community as well as experienced philanthropic professionals. Remember you are part of a field with peers whose knowledge and experience can help guide you.

Opening Up Pain Points

GP: One of the reasons the committee selected C&A Foundation to receive the award is because of your institutional comfort level with sharing not just successes, but also being very forthright about what didn’t work. We often hear that foundation boards and leaders are worried about reputational issues with such sharing. What would you say to those leaders about how opening up these pain points and lessons has affected C&A Foundation’s reputation in the field, and why it’s worth it?

LAR: I would say this. The question for foundation boards and leaders is straightforward: do you want to be more effective and have an impact? The answer to that will always be yes, but it is dependent on learning and sharing across the organization and with others. If we do not share evaluations, research or experiences, we do not learn from each other and we cannot be effective in our philanthropic endeavors.

"There is a benefit to being open, you build trust and integrity – success and failure is part of all of us."

The other question for boards and leaders is: who does philanthropy serve? For us, we want to transform the fashion industry, which is made up of cotton farmers, workers in spinning mills and cut and sew factories, consumers and entrepreneurs, to name a few – they are our public. As such we have the duty to be transparent to the public about where we are succeeding and where we have failed and how we can improve. We do not think there is a reputation risk. In fact, there is a benefit to being open, you build trust and integrity – success and failure is part of all of us.

SM: Adding to what Lee has said, being open about our failures not only helps us but the entire field. Some of our partners have felt reticent about our publishing evaluations, but we always reassure them and stress from the beginning of an evaluation process that it is an opportunity to understand how they can improve their work and how we can improve our partnership, as well as a chance to share those lessons more broadly.

Learning While Lean

GP: Given the lean philanthropy staffing structures in place at many corporate foundations, do you have any advice for your peers on how those without a dedicated evaluation team might still be able to take some small steps to sharing what they are learning?

SM: Learning is a continuous process. In the absence of staff dedicated to evaluation, take baby steps within your power, such as implementing after-action reviews, holding thematic webinars, or doing quick summaries of lessons from grants and/or existing evaluations from others. If the organization’s leadership endorses learning, these small steps are a good place to start.

GP: And speaking of lean staffing structures, a concern we often hear is that a funder creating a culture of learning leads to an increased burden on grantees who are then asked for robust evaluations and outcomes measures that no one is willing to pay for. Does C&A Foundation include funding for the evaluations and reporting or other technical assistance to mitigate the burden on grantees?

SM: The foundation has a Monitoring and Evaluation Policy that lays out the roles of the programmatic staff and partners as well as of the dedicated Effective Philanthropy team. C&A Foundation partners are generally responsible for the design and execution of a self-evaluation, to be submitted at the end of the grant period. External evaluation budgets are covered by the foundation and do not pose a financial burden on partners at all. They are included in the overall cost of an initiative, and when needed we have an additional central evaluation fund that is used to respond to the programmatic teams’ and partners’ ad hoc demands for evaluations and learning.

The Effective Philanthropy team does provide technical assistance to partners and foundation staff upon request. The guidance ranges from technical inputs on theory of change development to the design of baseline and mid-line data collection exercises. The theory of change work has been really rewarding for partners and ourselves. We all enjoy that part of the work.

Learning Leads to Effectiveness

GP: Learning is a two-way street, and foundations are both producers and consumers of knowledge. Let’s close this interview by hearing about a noteworthy piece of knowledge you recently learned thanks to another foundation or organization sharing it, and how it helped inform your work.

LAR: In moving from a more traditional MEL approach to effective philanthropy, we looked at the work of other foundations. This included learning from the William and Flora Hewlett Foundation, the Rockefeller Foundation, and others. We had discussions with a number of peers in the field. We also asked Nancy MacPherson (formerly Managing Director of Evaluation at Rockefeller) and Fay Twersky (Director of Effective Philanthropy at Hewlett) to review our Effective Philanthropy strategy when it was under development. Their feedback and advice helped a lot. In the end, we decided to begin building out the function in a similar way to the Hewlett Foundation, but there are some differences. For example, our evaluation practice is currently positioned at a deeper initiative level, which reflects a field context in which there is a significant evidence gap across the fashion industry that needs to be filled. Concomitant to this is our emphasis on piloting and testing, which goes hand-in-hand with the demand for evaluative thinking, reporting, and learning.

Our team has also been influenced by our own successes and failures from previous roles. That has also inspired us to embrace a slightly different approach.

SM: In terms of where we are at the moment, we still oversee performance monitoring and evaluation, and support the program teams in developing theories of change and KPIs; but we are also building out an organizational learning approach and are in the process of hiring a Senior Learning Manager. Lastly, we are piloting our organizational and network effectiveness work in Brazil, which is being led by a colleague who joined the foundation last year.

LAR: We are also in the midst of an Overall Effectiveness Evaluation (OEE) of C&A Foundation’s first 5-year strategy. In general, this is not a type of evaluation that foundations use much. As well as looking at results, the evaluators are evaluating the whole organization, including Effective Philanthropy. For me as an evaluator, it has been really rewarding to be on the other side of a good question.

We are learning from the OEE as we go along, and we decided to create ongoing opportunities for reporting and feedback from the process rather than waiting until the very end for a report. This means that program staff can be engaged in proactive discussions about performance and emerging lessons in a timely way. The OEE is already starting to play a vital role in informing the development of the next 5-year strategy and our organization. But you will surely hear more on that evaluation process later, as it will be published. There is always room for improvement and learning never stops.

--Lee Alexander Risby and Savi Mull
