Select Campus Evaluation Support

Roudy Hildreth, Associate Director, CU Engage
Faculty Affiliate, Educational Foundations, Policy & Practice, CU Boulder School of Education; Puksta Scholars Coordinator

Prior to joining CU Engage in January 2015, Roudy served as Associate Professor and Distinguished Teacher of Political Science at Southern Illinois University Carbondale, where he also co-directed the Center for Service-Learning and Volunteerism. Roudy has published numerous scholarly articles and book chapters on topics such as community-based pedagogy, democratic theory, the political philosophy of John Dewey, youth civic engagement, and qualitative research. In addition, Roudy has worked with the Public Achievement program nationally and internationally since 1996.

CU Resources Online

William (Bill) Penuel, Professor of Learning Sciences at CU Boulder, created and taught a free online course on evaluation. The course provides rich examples and resources and addresses evaluation through the exploration of practical issues in youth-serving programs and other contexts. Bill has conducted dozens of small- and large-scale evaluations of education projects in mathematics, science, technology, and literacy. His expertise is relevant to evaluation in the areas of study design, assessment design, and implementation research. He serves on the editorial board of the American Journal of Evaluation.

CU System: Network Evaluation and Research. The PARTNER team, led by Professor Danielle Varda at CU Denver, provides facilitation, evaluation, analysis, and assessment tool building. The team specializes in network evaluation and in systems evaluation and research from an inter-organizational systems perspective. The PARTNER tool collects social network data and provides a way to analyze those data, including ways to assess attribution, perceptions, agreement, and interrelationships. The PARTNER process supports putting data into practice: the team uses a community-based participatory research approach to work closely with communities when designing, assessing, and analyzing networks. Professional development is provided through network leadership and evaluation trainings and an annual workshop.
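To give a concrete sense of what network evaluation data can look like, the sketch below is a minimal, hypothetical illustration only; it is not the PARTNER tool and does not use its API. It builds a small partner network in Python with the networkx library, using made-up organization names and ties, and computes two common network measures: whole-network density and member-level degree centrality.

```python
# Minimal sketch (not the PARTNER tool): two common network-evaluation
# measures computed from hypothetical partner-survey responses.
# Organization names and ties below are illustrative assumptions.
import networkx as nx

# Each tuple is a reported working relationship between two partner organizations.
reported_ties = [
    ("Health Dept", "Food Bank"),
    ("Health Dept", "School District"),
    ("Food Bank", "Neighborhood Assn"),
    ("School District", "Neighborhood Assn"),
]

network = nx.Graph()
network.add_edges_from(reported_ties)

# Density: the share of possible ties that partners actually report (0 to 1).
print("Network density:", nx.density(network))

# Degree centrality: how connected each organization is relative to the others.
for org, score in nx.degree_centrality(network).items():
    print(f"{org}: {score:.2f}")
```

Density and centrality are standard social network measures; in an actual evaluation the tie list would come from partner survey responses rather than hard-coded values, and would typically be paired with perception and agreement data.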

Additional Resources Online

The 2010 User-Friendly Handbook for Project Evaluation, NSF Division of Research on Learning in Formal and Informal Settings, National Science Foundation. The handbook offers support with formative, implementation, progress, and summative evaluation and reflects an emphasis on the inherent interrelationships between evaluation and program implementation. The handbook is written primarily for project directors and principal investigators, with the aim of helping project personnel get the most from their evaluation efforts.

From Soft Skills to Hard Data. Reviews ten youth outcome measurement tools that are appropriate for use in after-school and other settings.

NSF Framework for Evaluating Impacts of Informal Science Education Projects Report. The report shares a framework for summative evaluation, NSF-established “impact categories,” and an online survey tool to help gather information about the impacts of projects. The book is designed to support selection of the most rigorous design appropriate for the work and shares example evaluation strategies for a range of projects in informal science education. Refer to “How to Read This Book” on p. 16.

Michigan State University – Points of Distinction: A Guidebook for Planning and Evaluating Quality Outreach. Helpful tools to assist academic units, faculty leaders, and the higher education community in planning, monitoring, evaluating, and rewarding outreach efforts.

National Co-ordinating Centre for Public Engagement (NCCPE). Provides evaluation resources, examples, and tools to assess the impact of your outreach and engagement project.

Select Evaluation Methods

Developmental evaluation. A collaborative, non-traditional approach, developmental evaluation: 1) aims to support innovation through the development of principles that are context-dependent rather than models to be replicated or “best practices”; 2) recognizes goals as evolving rather than predetermined; and 3) works to engage and learn about complex interdependencies and connections rather than to identify cause-and-effect relationships.

Empowerment evaluation. Empowerment evaluation can be applied to individuals, organizations, or communities, but the focus is usually on programs. Empowerment evaluation provides tools so that participants can monitor and evaluate their own practice, use this process for continued reflection and improvement, and accomplish their goals.