Program Evaluation and Performance Measurement

Most frequently, Dr. Raymond works in close collaboration with program staff to develop and implement evaluation and performance measurement plans. This collaborative approach builds staff capacity to conduct future evaluation activities. With support and assistance from Dr. Raymond as an “evaluation coach,” program staff can effectively perform many evaluation activities themselves, yielding useful information at reduced cost.

Past evaluations include projects in: community/civic engagement, early childhood, education, health, human services, STEM, visitor studies, and youth development.

Typical Evaluation Process

1. Determine the context for the evaluation.

  • Identify the purpose(s) of the evaluation (e.g., program improvement, accountability, progress monitoring, assessing processes and/or outcomes, communications).
  • Identify stakeholder (e.g., clients, funders, Board, staff) interests, expectations, and mandates regarding the evaluation.
  • Discuss available resources and constraints in conducting the evaluation.
  • Discuss who will be involved in the evaluation and in what roles.
  • Discuss how the information from the evaluation will be used and communicated.

2. Describe the program.

  • Description of need and program context.
  • Description of the program conceptual theory, assumptions, and design.
  • Development of program logic model.

3. Develop the evaluation plan.

Develop the evaluation questions to be answered. Common questions include:

  • Who participates in the program?
  • To what extent are program outcome and process objectives achieved? (outputs and outcomes)
  • To what extent are participants and program partners satisfied with the program? (stakeholder satisfaction)
  • To what extent was the program implemented as planned? What changes were made and why? (quality of process)
  • In what ways can the program be improved to further support achievement of program goals and objectives?

Then complete the remaining elements of the plan:

  • Specify desired outputs and outcomes (including those of interest to clients, funders, Board, staff, regulators, etc.).
  • Develop appropriate objectives/indicators.
  • Determine appropriate data sources, data collection methods, data management processes, and the evaluation timeline (including periodic progress meetings).
  • Develop or modify evaluation instruments, as needed.

4. Collect data – including training staff/volunteer data collectors, if needed.

5. Analyze data and interpret the results.

6. Communicate the results – including presentations and reports.

7. Make program modifications – if indicated by evaluation results.

For a complimentary consultation, contact Dr. Raymond at 305-774-7056 or catherine@raymondconsulting.com.