Monitoring and Evaluation

ActKnowledge’s evaluation practice includes specialties in coalitions and advocacy, community schools, and public space. In our Theory of Change-based methodology, evaluation forms the cornerstone of an organization’s ability to track progress on its outcomes. For specific examples, see Clients and Publications.

We approach our evaluation work with the following principles and assumptions:

1. Methodology and Design: In most cases we use a participatory, mixed-methods approach. Our theory-based evaluation methodology leads participants in thinking about and measuring outcomes. Viewed through the lens of the scientific method and of critical analysis dedicated to understanding what works, what doesn’t, and why, the grounded experiences of participants become systematized into meaningful, objective lessons. ActKnowledge’s expertise ensures that the best measures are selected and implemented in ways that yield the most valid and reliable data. For scientific rigor, we triangulate methods to guard against error, and we collaboratively develop a rich balance of qualitative and quantitative measures appropriate to the needs of the evaluation.


2. Data Collection and Analysis: We believe that those closest to the work have key roles to play in defining their impact, and that their voices matter in the larger framework of policy change and social movement-building. We work to involve research participants in conducting the research (having students, for example, interview one another, or staff participate in data analysis) without loss of rigor. Through our participatory approach, we aim to build a reflective culture within the organizations we work with, helping them understand the value of their work.


3. Findings: Our methodological approach aligns findings with analysis: findings must be supported by the data, and sufficient quantitative data are presented to substantiate any qualitative conclusion. We recognize explicitly that no research can be free of the values that researchers and other stakeholders bring to the task. We therefore take care both to articulate the values from which we operate and to employ rigorously scientific methods.


4. Recommendations: We believe that the success of an evaluation can be measured by its utility in serving specific and intended purposes: informing program strategy and development decisions, making a case for support, providing lessons learned and best practices to the field, and adding to the knowledge base of what works. Recommendations add much value toward this end; balanced, insightful recommendations help audiences connect the data and findings with the issues they face and the decisions to be made.