ACSA Partner Content

Step-by-Step Guide to Program Evaluations

The Association of California School Administrators is the largest umbrella organization for school leaders in the United States, serving more than 17,000 California educators.


PLANNING TOOL: STEP-BY-STEP GUIDE TO PROGRAM EVALUATIONS

P 202.559.0050 | www.hanoverresearch.com | @hanoverresearch

In order to maximize investments in programs, resources, and policies, school districts must continuously examine how these initiatives are impacting student learning and identify opportunities for improvement. District leaders can use this Step-by-Step Guide to Program Evaluations as a tool to assess the district's readiness to complete a program evaluation.

© 2018 Hanover Research K12IG0418. For more information regarding our services, contact info@hanoverresearch.com.

STAGE 1: PRIORITIZATION

Status options for each step: Not Complete / Somewhat Complete / Complete / Unsure.

1. Create a list of major programs and initiatives currently implemented.
2. Categorize programs and initiatives based on target outcomes, served population, and/or focus areas.
3. Establish criteria (e.g., state mandates, relevance to strategic plan) for determining which programs and initiatives will be evaluated.
4. Remove programs and initiatives that are low-priority, small in scope, or "hot topics" (i.e., those with attributes that will prevent meaningful change) from consideration.
5. Choose programs and initiatives to evaluate that strongly align to strategic goals, reach large populations, and/or are resource-intensive.
6. Identify outcomes to measure in the evaluation.

Guiding questions: What are our objectives for this process? How can we better align programs to our strategic goals? What target outcomes can be measured in the evaluation?

STAGE 2: PLANNING

Status options for each step: Not Complete / Somewhat Complete / Complete / Unsure.

7. Build staff and organizational capacity to perform effective and accurate evaluations.
8. Promote stakeholder buy-in and engage relevant stakeholders to help support planning and evaluation.
9. Increase familiarity with program evaluation standards, such as those published by the Joint Committee on Standards for Educational Evaluation (JCSEE), to guide planning.
10. Set goals for the program evaluation process.
11. Create a logic model for the evaluation's expected outcomes.
12. Design evaluation protocols based on logic models.
13. Select multiple relevant instruments and methods to collect and analyze data.
14. Determine a timeline to complete the evaluation, including checkpoints to collect formative results (if necessary).

Guiding questions: What are our goals for the evaluation? How can we get stakeholder buy-in? When will we need to use the results of our evaluation?

STAGE 3: EVALUATION

Status options for each step: Not at All Prepared / Somewhat Prepared / Prepared / Unsure.

16. Collect and synthesize data via multiple instruments and methods.
17. Analyze data to determine outcomes resulting from the program or initiative.
18. Develop findings based on analyzed data.
19. Communicate findings to program administrators and school and district leadership.
20. Communicate findings broadly to other relevant stakeholders.
21. Create an action plan based on program evaluation findings.
22. Implement the established action plan to improve program functionality or replace ineffective programming with an alternative.
23. Communicate additional findings after implementing the action plan to program administrators and school and district leadership.

Guiding questions: What can we learn from the data? How will we communicate our findings to stakeholders? What are our next steps for program improvement?

Sources:
https://www2.ed.gov/about/offices/list/oese/sst/evaluationmatters.pdf
http://www.eval.org/p/cm/ld/fid=103
http://sdp.cepr.harvard.edu/files/cepr-sdp/files/program_evaluation.pdf?m=145011180
https://www.cdc.gov/eval/framework/index.htm
http://mps.milwaukee.k12.wi.us/MPS-English/CIO/Research--Development/LogicModelingHandbook.pdf
https://www.educationworld.com/a_curr/school-program-evaluation-basics.shtml
https://www.educationworld.com/sites/default/files/GPP-Evaluation-Worksheet.pdf
https://www.energy.gov/eere/analysis/program-evaluation-why-what-and-when-evaluate
http://www.dmeforpeace.org/sites/default/files/Volkov%20and%20King_Checklist%20for%20Building%20Organizational%20Evaluation%20Capacity.pdf
http://www.cura.umn.edu/sites/cura.advantagelabs.com/files/publications/35-3-King-Volkov.pdf
