Training Evaluations

The Office of Learning and Workforce Development (HC-20) is working to measure the effectiveness and efficiency of our training solutions. This effort ensures that DOE complies with 2009 OPM regulations, which require agencies to evaluate their training programs annually to determine their effectiveness (5 CFR 410.202). Our goals are to demonstrate the contribution of training to agency mission effectiveness, strengthen partnerships with our key stakeholders, and show clear connections between effective use of funding and agency outcomes. To accomplish this, we have created a comprehensive strategy and standards for evaluating training that demonstrate alignment between the learning experience and the intended outcomes and objectives. Our strategy includes the following four components: monitor, track, and report; develop tools; continuous improvement; and sharing results with the DOE training community. Our standards for evaluating training are based on Kirkpatrick's Levels of Evaluation.

  • Strategy Component 1: The first component of our strategy is to monitor, track, and report on mandatory supervisory, general mission-critical competency (MCC), and performance management courses. This involves running Level 1 and Level 3 Training Evaluation Summaries for upcoming courses and conducting Training Program Evaluations for our training programs.
  • Strategy Component 2: The second component is to develop tools to help measure the effectiveness and efficiency of our training solutions. The tools in development include a CHRIS Training Evaluation Report; a revised CHRIS User Manual; a standardized Level 1 Training Evaluation Summary; a standardized Level 3 Training Evaluation Summary for employees and supervisors; a standardized process for Training Program Evaluation; and a Training Program Evaluations Guide.
  • Strategy Component 3: The third component is the continuous improvement of our training evaluation solutions. We will recommend opportunities for improvement based on data collected from Level 1 and Level 3 Training Evaluations and Training Program Evaluations. We can then revise courses and programs for improved effectiveness and efficiency.
  • Strategy Component 4: The fourth component is sharing this information with our DOE training community. We will conduct briefings with HC-20 partners and provide status reports at the Monthly Training Managers meetings and Learning & Development Board of Directors meetings. Regular updates will also be provided on this page, the DOE Virtual University, and The Collaborator.

What we are producing:

Course Evaluations will be performed on mandatory supervisory, general mission-critical competency (MCC), and performance management courses:

  • Provide Level 1 Training Evaluation Summaries on our Course Evaluation page two months prior to upcoming courses
  • Enable Training Administrators to pull Level 1 Training Evaluation Summaries from CHRIS
  • Conduct Level 3 Training Evaluations three months after course completion
  • Provide Level 3 Training Evaluation Summaries
  • Analyze data collected from Level 1 and Level 3 Training Evaluation Summaries to identify opportunities for course improvement

Conduct Training Program Evaluations on the following programs:

  • Leadership Transition Program
  • DOE STEM Mentoring Program
  • Senior Executive Service Candidate Development Program (SESCDP)
  • 1st Time Supervision Development Program

Analyze data collected from Training Program Evaluations to identify opportunities for program improvement