10-Step Evaluation for Training and Performance Improvement
November 2018 | 352 pages | SAGE Publications, Inc
Written with a learning-by-doing approach in mind, 10-Step Evaluation for Training and Performance Improvement gives students actionable instruction for identifying, planning, and implementing a client-based program evaluation. The book introduces readers to multiple evaluation frameworks and uses problem-based learning to guide them through a 10-step evaluation process. As students read the chapters, they produce specific deliverables that culminate in a completed evaluation project.
 
Table of Contents

List of Tables
List of Figures
List of Exhibits
Preface
About the Author

Introduction
  Performance Improvement and Evaluation
  What Is Evaluation?
  What Is Not Evaluation?
  How Does Evaluation Compare With Research?
  Program Evaluation in the HPI Context
  Evaluation Is Often Neglected
  Different Evaluation Designs Used in Program Evaluation
  Descriptive Case Study Type Evaluation Design
  Frameworks for Conducting Evaluations in the HPI Context
  The 10-Step Evaluation Procedure
  Chapter Summary
  Chapter Discussion

Chapter 1. Identify an Evaluand (Step 1) and Its Stakeholders (Step 2)
  Identify a Performance Improvement Intervention as an Evaluand
  Use the 5W1H Method to Understand the Intervention Program
  Ask Why the Intervention Program Was Implemented
  Check If Program Goals Are Based on Needs
  Sell Evaluation to the Client
  Identify Three Groups of Stakeholders
  Chapter Summary
  Chapter Discussion
  Now, Your Turn—Identify an Evaluand and Its Stakeholders

Chapter 2. Identify the Purpose of Evaluation (Step 3)
  Differentiate Evaluation From Needs Assessment
  Gather Information About the Evaluation Purpose
  Assess Stakeholders’ Needs for the Program and the Evaluation
  Determine If the Evaluation Is a Formative or Summative Type
  Determine If the Evaluation Is Goal Based or Goal Free
  Determine If the Evaluation Is Merit Focused or Worth Focused
  Keep in Mind Using a System-Focused Evaluation Approach
  Write an Evaluation Purpose Statement
  Chapter Summary
  Chapter Discussion
  Now, Your Turn—Identify the Purpose of Evaluation

Chapter 3. Assess Evaluation Feasibility and Risk Factors
  Incorporate Macro-Level Tasks Into Micro-Level Steps
  Assess Feasibility of the Evaluation Project
  List Project Assumptions
  Estimate Tasks and Time Involving Stakeholders
  Assess Risk Factors for the Evaluation Project
  Chapter Summary
  Chapter Discussion
  Now, Your Turn—Assess Feasibility and Risk Factors

Chapter 4. Write a Statement of Work
  Prepare a Statement of Work for the Evaluation
  Determine Sections to Be Included in a Statement of Work
  Develop a Gantt Chart
  Review a Sample Statement of Work
  Now, Your Turn—Write a Statement of Work

Chapter 5. Develop a Program Logic Model (Step 4)
  Apply a Theory-Based, If–Then Logic to Developing a Program
  Review United Way’s Program Outcome Model
  Review Kellogg Foundation’s Program Logic Model
  Review Brinkerhoff’s Training Impact Model Compared to the Four-Level Training Evaluation Framework
  Compare Elements Used in Different Frameworks
  Develop a Program Logic Model
  Develop a Training Impact Model
  Chapter Summary
  Chapter Discussion
  Now, Your Turn—Develop a Program Logic Model or a Training Impact Model

Chapter 6. Determine Dimensions and Importance Weighting (Step 5)
  Think About Dimensions of the Evaluand to Investigate
  Start With the Stakeholders’ Needs
  Relate the Purpose of Evaluation to the Program Logic Model Elements
  Incorporate Relevant Theoretical Frameworks and Professional Standards
  Write Dimensional Evaluation Questions
  Determine Importance Weighting Based on Usage of Dimensional Findings
  Recognize a Black Box, Gray Box, or Clear Box Evaluation
  Finalize the Number of Dimensions
  Chapter Summary
  Chapter Discussion
  Now, Your Turn—Determine Dimensions and Importance Weighting

Chapter 7. Determine Data Collection Methods (Step 6)
  Determine Evaluation Designs for Dimensional Evaluations
  Select Data Collection Methods That Allow Direct Measures of Dimensions
  Apply Critical Multiplism
  Triangulate Multiple Sets of Data
  Select Appropriate Methods When Using the Four-Level Training Evaluation Model
  Select Appropriate Methods When Using Brinkerhoff’s Success Case Method
  Review an Example of Data Collection Methods
  Use an Iterative Design Approach
  Assess Feasibility and Risk Factors Again
  Conduct Formative Meta-Evaluations
  Chapter Summary
  Chapter Discussion
  Now, Your Turn—Determine Data Collection Methods

Chapter 8. Write an Evaluation Proposal and Get Approval
  Determine Sections to Be Included in an Evaluation Proposal
  Review a Sample Evaluation Proposal
  Now, Your Turn—Write an Evaluation Proposal

Chapter 9. Develop Data Collection Instruments I—Self-Administered Surveys (Step 7)
  Comply With IRB Requirements
  Use Informed Consent Forms
  Determine Materials to Be Developed for Different Data Collection Methods
  Distinguish Anonymity From Confidentiality
  Develop Materials for Conducting Self-Administered Surveys
  Determine Whether to Use Closed-Ended Questions, Open-Ended Questions, or Both
  Ask Specific Questions That Measure the Quality of a Dimension
  Design Survey Items Using a Question or Statement Format
  Recognize Nominal, Ordinal, Interval, and Ratio Scales
  Decide Whether to Include or Omit a Midpoint in the Likert Scale
  Decide Whether to Use Ascending or Descending Order of the Likert Scale Options
  Follow Other Guidelines for Developing Survey Items
  Develop Survey Items That Measure a Construct
  Test Validity and Reliability of a Survey Instrument
  Conduct Formative Meta-Evaluations
  Chapter Summary
  Chapter Discussion
  Now, Your Turn—Develop Survey Instruments

Chapter 10. Develop Data Collection Instruments II—Interviews, Focus Groups, Observations, Extant Data Reviews, and Tests (Step 7)
  Determine Whether to Use a Structured, Unstructured, or Semi-Structured Interview
  Develop Materials for Conducting Interviews or Focus Groups
  Solicit Interview Volunteers at the End of a Self-Administered Web-Based Survey
  Develop Materials for Conducting Observations
  Develop Materials for Conducting Extant Data Reviews
  Develop Materials for Administering Tests
  Conduct Formative Meta-Evaluations
  Chapter Summary
  Chapter Discussion
  Now, Your Turn—Develop Instruments for Conducting Interviews, Focus Groups, Observations, Extant Data Reviews, and Tests

Chapter 11. Collect Data (Step 8)
  Follow Professional and Ethical Guidelines
  What Would You Do?
  Use Strategies to Collect Data Successfully and Ethically
  Use Strategies When Collecting Data From Self-Administered Surveys
  Use Strategies When Collecting Data From Interviews and Focus Groups
  Use Strategies When Collecting Data From Observations and Tests
  Use Strategies to Ensure Anonymity or Confidentiality of Data
  Conduct Formative Meta-Evaluations
  Chapter Summary
  Chapter Discussion
  Now, Your Turn—Collect Data

Chapter 12. Analyze Data With Rubrics (Step 9)
  Use Evidence-Based Practice
  Keep in Mind: Evaluation = Measurement + Valuation With Rubrics
  Apply the Same or Different Weighting to the Multiple Sets of Data
  Analyze Structured Survey Data With Rubrics
  Analyze Unstructured Survey or Interview Data With Rubrics
  Analyze Semi-Structured Survey or Interview Data With Rubrics
  Analyze Data Obtained From Observations, Extant Data Reviews, and Tests With Rubrics
  Determine the Number of Levels and Labels for Rubrics
  Triangulate Results Obtained From Multiple Sources for Each Dimension
  Conduct Formative Meta-Evaluations
  Chapter Summary
  Chapter Discussion
  Now, Your Turn—Analyze Data With Rubrics

Chapter 13. Draw Conclusions (Step 10)
  Revisit Formative or Summative Use of Evaluation Findings
  Develop a Synthesis Rubric
  Draw Evidence-Based Conclusions and Recommendations
  Conduct Formative Meta-Evaluations
  Chapter Summary
  Chapter Discussion
  Now, Your Turn—Draw Conclusions and Make Recommendations

Chapter 14. Write a Final Report and Conduct a Summative Meta-Evaluation
  Extend the Evaluation Proposal to a Final Report
  Present Dimensional Results in the Evaluation Results Section
  Present Supporting Information in Appendices
  Present Conclusions
  Report the Findings Ethically
  Conduct a Summative Meta-Evaluation
  Report Limitations
  Write an Executive Summary
  Present the Final Report to Stakeholders
  Follow Up With Stakeholders
  Present Complete Sections in a Final Report
  Now, Your Turn—Write a Final Report

Appendix A. A Summary of the Frameworks Used
Appendix B. Evaluation Development Worksheets
Appendix C. Survey Questionnaire Make
Appendix D. A Sample Survey Questionnaire Measuring Multiple Dimensions, Sample Rubrics, and Reliability Testing With IBM® SPSS® Statistics
Appendix E. Experimental Studies and Data Analysis With t-Tests Using Excel
Glossary
References
Index

Supplements

Instructor Teaching Site

Password-protected Instructor Resources include:

· Chapter quizzes, with pre-written, editable multiple-choice and short-answer questions, help assess students’ progress and understanding.
· Editable, chapter-specific Microsoft® PowerPoint® slides offer ease and flexibility in creating a multimedia presentation for your course.
· A sample syllabus provides a suggested model for structuring your evaluation course.
· All figures and tables from the book are available for download.


Student Study Site

The open-access Student Study Site includes downloadable versions of the templates and worksheets in the book:

· Sample Statement of Work for a real evaluation (Ch. 4)
· Gantt chart template for planning and scheduling (Ch. 4)
· Sample evaluation proposal and final report (Ch. 8)
· Worksheet for identifying, planning, and conducting an evaluation project (Appendix B)


“This is a very well written book. It is easy to read, follow, and the application of the material from chapter to chapter is well constructed.”

Charles E. Moreland
Barry University

“Yonnie Chyung has clearly and concisely discussed a ten-step process for evaluation that will appeal to scholarly practitioners across multiple disciplines. Incorporated throughout the text are user-friendly examples, tables, and samples.”

Jennifer Fellabaum-Toston
University of Missouri

“10-Step Evaluation for Training and Performance Improvement provides tools for practitioners, students, professors, evaluators, and so many more to address questions as they relate to practical program evaluation. This text offers a solid theoretical framework while offering practicality and readability to its audiences. The tools provided within the text share a best practice point of view that are easily adaptable to many situations and various environments.”

Mary Leah Coco
Louisiana Department of Transportation and Development

"This book was an exceptional point-by-point, systematic process for my students to develop project-based learning cases of their own. Overall, it was a practical application to program evaluation."

Dr. Suzanne Ensmann
The University of Tampa

Sample Materials & Chapters

Preface

Introduction



Purchasing options


ISBN: 9781544323961
£103.00
