For several reasons, I suspect that this is the “leading edge” of MLE for social change, rather than a reflection of MLE practice more generally. (A principal reason is that fully collaborative/participatory processes are generally more expensive than traditional evaluations.) But it is a “signal of change” worth watching.
I note this trend here because most of what is reported in this document is not advocacy/policy change “evaluation” as we would have discussed it 10 years ago – or even evaluation capacity building. Instead, these presentation snapshots reflect evaluators’ more consistent and ongoing engagement with project implementers – much as might occur in business, military, and similar arenas, where the tactics to be assessed are complex and rapidly changing.
Measurement and learning in nonprofit organizations
(1) Leadership must model strategic use of data and analysis if they want staff to do so. (Presentation by internal evaluator at large nonprofit service provider)
Nonprofit leadership must model and communicate to staff the importance of gathering and analyzing data to make strategic decisions if it wants staff to use measurement and learning resources in their own work. Leadership can improve its own decision making while at the same time “creating demand” for MLE by calling for data-driven reports and incorporating MLE into staff management and training. Examples of the latter include building training on MLE’s use and usefulness into staff onboarding; establishing an expectation that MLE staff are consulted early in program/project development (just as communications and advancement staff are); and incorporating MLE expectations into performance reviews for team leaders.
(2) To make the most of an organization’s measurement and learning, it’s important to train staff to be effective “consumers” of MLE, and of MLE staff expertise. (Presentation by internal museum evaluator/external museum evaluation consultant)
Organizations (and MLE experts) can’t assume that organizational staff will know how to use data and analysis to improve their strategies and tactics, or that they will know how to work most effectively with MLE staff. Staff should be supported in learning how MLE is used in organizations like theirs, and the benefits of doing so. They should also learn how to use evaluative thinking to design strong questions: identifying strategy-moving questions is a critical starting point for any MLE work, and only people intimately involved with a program can do this well. Once good questions are in place, internal or external MLE experts can help staff design the most effective ways to gather and analyze strategy-relevant data.
Supporting advocacy movements
(3) Networks are essential to build movements. They need careful nurturing. (Presentation by internal evaluator for an international human rights funder)
Networks are empirically correlated with social movement success, especially in places with high levels of political repression. In such environments, external funding can be divisive and is almost always viewed with suspicion. It’s important to build long-standing relationships of trust and to avoid disrupting network power dynamics as much as possible.
(4) Movements require patient capital. (Presentation by internal evaluator for an international human rights funder and its government-sector funding partner)
Human rights organizing and advocacy is often “two steps forward and one step back.” It’s important to set expectations with foundation trustees so that the funder and program implementers are aligned on a realistic pace of change.
Building capacity: Individual, organizational
Capacity building (known by different terms in different programs) was an important topic at #AEA23.
(5) Leverage social capital when supporting people to build their individual capacities. (Presentation by a community foundation’s internal evaluation manager)
Supporters of traditionally marginalized populations should think beyond financial resources when supporting them to achieve success. One foundation in Michigan supports young Black men, starting in middle school, to achieve educational excellence. Many of these youth are the first in their families to apply to college. In addition to providing scholarship dollars, the foundation connects them with college students who are trained to provide practical advice, mentorship, and fellowship that these aspiring college students might lack. The foundation also provides emergency funds, so that the lack of a computer or course book does not become an insuperable barrier to educational success.
(6) People being served should participate in planning and evaluating organizational capacity building.
Planning. (Presentation by international human rights funder’s internal evaluation manager)
One organization that funds capacity building in South Asia convenes an advisory team of country-level implementers/grantees and supports it, through a facilitated process, to identify important gaps in grantees’ skills and knowledge. The advisory team then advises on the capacity-building topics to be covered at a grantee convening, and consults with the funder and in-country funder staff (who are former implementers themselves) to identify grantee participants. In addition to measurement and learning sessions, the convening incorporates “white space” for grantees to develop and hold strategy conversations, which enhances the convening’s value to grantees.
Evaluating. (Presentation by an international development funder and its external evaluator)
The evaluators of a social change capacity-building initiative (implemented in dozens of sites across the world) use a storytelling/narrative technique to elicit evaluative information from the program’s beneficiaries. According to the external evaluators, storytelling/narrative has had several benefits. First, telling stories about the program is familiar and comfortable for a beneficiary group that might otherwise be intimidated by formal interviews conducted by evaluation “experts.” Second, guiding participants to analyze their own experiences lets them place their program reflections in the context/environment as they see it – including what happened before the funded program, and afterwards. It also sheds light on how personal and organizational networks are strengthened through the program.
This project uses an advisory team of grantees as a resource and a trust-building mechanism: the team provides input into the design of the interview guide, selects and makes introductions to interviewees, and reviews stories and gives feedback. At the early stage of the evaluation described at #AEA23, the stories had already proven valuable to foundation staff in preparing for site visits, in addition to providing data for the larger-scale evaluation.
Final note: The conference presenters whose work is noted on this page observed that collaborative/participatory approaches take time, including additional time for evaluators to analyze and write up data.