This section lists a variety of useful evaluation resources, including the following:
- EERE evaluation resource documents
- Evaluation journals
- Evaluation data systems
- Evaluation training opportunities
EERE Evaluation Resource Documents
Listed below are evaluation resource documents from the Office of Energy Efficiency and Renewable Energy (EERE).
(1) Peer Review Guidance (EERE G 413.001). December 2020.
This document provides the minimum requirements for EERE Offices to develop and maintain In-Progress Peer Reviews as part of their program management process. Considerable flexibility is necessary in how EERE Offices conduct Peer Reviews; to ensure independence, quality, and rigor while still allowing that flexibility, minimum requirements serve as key guides for EERE's practice of Peer Review. This guidance provides information and examples useful for planning, conducting, and using Peer Reviews based on best practices. Best practices are those that are (1) used with the most success by EERE's own programs or by other institutions, or (2) identified as such by multiple widely recognized experts outside of EERE, including experts at the Government Accountability Office (GAO) and the Office of Management and Budget (OMB). The guidance describes EERE peer review minimum requirements in the following planning and implementation areas:
- Scope and Frequency of Review
- Timely Preparation
- Core Evaluation Criteria
- Independent External Reviewers
- Collection of Reviewer Data
- Production and Distribution of the Peer Review Reports (Reviewer Report and Peer Review Report)
- Maintenance of the Peer Review Record and Post-Review Feedback Evaluation
(2) Project Manager’s Guide to Managing Impact and Process Evaluation Studies. August 2015.
Prepared by: Yaw O. Agyeman (Lawrence Berkeley Laboratory) and Harley Barnes (Lockheed Martin)
This report provides a step-by-step approach to help managers of EERE evaluation projects create and manage objective, high-quality, independent, and useful impact and process evaluations. It provides information to help with the following:
- Determine why, what, and when to evaluate
- Identify the questions that need to be answered in an evaluation study
- Specify the type of evaluation(s) needed
- Hire a qualified independent third-party evaluator
- Monitor the progress of the evaluation study
- Implement credible quality assurance (QA) protocols
- Ensure the evaluation report presents accurate and useful findings and recommendations
- Ensure that the findings get to those who need them
- Ensure findings are put to appropriate use
(3) A Framework for Evaluating R&D Impacts and Supply Chain Dynamics Early in a Product Life Cycle. 2014.
Prepared by: Gretchen Jordan (360 Innovation LLC), Jonathan Mote (George Washington University), Rosalie Ruegg (TIA Consulting Inc.), Thomas Choi (Arizona State University), and Angela Becker-Dippmann (Pacific Northwest National Laboratory)
This report provides a framework for evaluating R&D investments aimed at speeding up the pace of innovation and strengthening domestic manufacturing and supply chains. The framework offers a view of the dynamics unfolding in the "black box of innovation" during the early phases of the product life cycle. This early period of focus can be contrasted with the long-term horizon of impact evaluation, which seeks to measure ultimate results.
The framework helps users understand, measure, and enhance the ingredients and early processes that will determine long-term impact. It adds analysis of product value chain networks to the evaluators' toolbox as a means of assessing early changes in a targeted product's domestic supply chain and value chain. The framework identifies core progress and impact metrics for analyzing changes in a product value chain, and it provides an approach for assessing DOE attribution in detail if warranted and feasible.
(4) Evaluating Realized Impacts of DOE/EERE R&D Programs. 2014 Final Report.
Prepared by: Rosalie Ruegg (TIA Consulting Inc.), Alan C. O'Connor (RTI International), and Ross J. Loomis (RTI International)
This document provides guidance for evaluators who conduct impact assessments to determine the "realized" economic benefits and costs, energy and environmental benefits, and other impacts of EERE R&D programs. The approach described in this guide focuses primarily on realized results and the extent to which they can be attributed to the efforts of an R&D program.
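The report itself specifies the assessment methods; purely as a rough illustration of the kind of retrospective economic calculation such a study involves, the sketch below discounts hypothetical benefit and cost streams to a base year and computes a net present value and benefit-to-cost ratio. The dollar figures, discount rate, and attribution share are assumptions for illustration, not values or procedures taken from the report.

```python
# Illustrative only: a simplified retrospective benefit-cost calculation of the
# general kind used in realized-impact assessments. All numbers are hypothetical,
# and the attribution share is an assumed input, not a method from the report.

def present_value(cash_flows, discount_rate, base_year):
    """Discount a {year: amount} stream back to the base year."""
    return sum(
        amount / (1.0 + discount_rate) ** (year - base_year)
        for year, amount in cash_flows.items()
    )

# Hypothetical realized benefits (e.g., energy cost savings) and program costs, in $M.
benefits = {2010: 0.0, 2011: 5.0, 2012: 12.0, 2013: 20.0, 2014: 25.0}
costs = {2010: 8.0, 2011: 6.0, 2012: 4.0, 2013: 2.0, 2014: 1.0}

discount_rate = 0.07      # assumed real discount rate
attribution_share = 0.6   # assumed share of benefits attributable to the program

pv_benefits = attribution_share * present_value(benefits, discount_rate, base_year=2010)
pv_costs = present_value(costs, discount_rate, base_year=2010)

print(f"PV of attributable benefits: ${pv_benefits:.1f}M")
print(f"PV of program costs:         ${pv_costs:.1f}M")
print(f"Net present value:           ${pv_benefits - pv_costs:.1f}M")
print(f"Benefit-to-cost ratio:       {pv_benefits / pv_costs:.2f}")
```

In an actual realized-impact study, the benefit streams, discount rate, and attribution analysis would follow the guidance in the report rather than the simplified assumptions shown here.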
(5) Overview of Evaluation Methods for R&D Programs. 2007.
Prepared by: Rosalie Ruegg and Gretchen Jordan (Sandia National Laboratories)
This booklet introduces managers to a variety of methods for evaluating R&D programs. It provides an overview of 14 evaluation methods that have proven useful to R&D program managers in federal agencies. For each method, a brief definition is given, its uses are explained, its limitations are listed, examples of successful use by other R&D managers are provided, and references are supplied.
(6) Impact Evaluation Framework for Technology Deployment Programs: An approach for quantifying retrospective energy savings, clean energy advances, and market effects. 2007. (Main Report).
Prepared by: John H. Reed (Innovologie LLC), Gretchen Jordan (Sandia National Laboratories) and Edward Vine (Lawrence Berkeley National Laboratory)
This document describes a framework for evaluating the retrospective impact of technology deployment programs. Program managers and evaluators in federal, state, and local governments and in public entities and institutions are increasingly accountable for delivering and demonstrating the results of their programs. The impact framework assists program managers and evaluators in developing and implementing better and more cost-effective evaluations.
The seven-step process, generic templates, generic evaluation questions, and evaluation designs that make up the framework can be used to develop powerful and meaningful impact evaluations to refine programs, increase program effectiveness, make the tough decisions to drop ineffective program elements, and develop credible evidence that communicates the value of the program to stakeholders.
An important focus of the framework is increasing understanding of the linkages between program outputs and short-term and long-term outcomes (impacts). It simplifies and enriches the process of describing and developing measures of target-audience response to program outputs, designing sound evaluations, and taking credit for all effects that are attributable to the program. Created for EERE, the framework can be applied to most deployment programs in a broad array of disciplines.
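The framework itself defines the templates, evaluation questions, and designs; purely as a minimal sketch of the kind of retrospective savings calculation such an evaluation supports, the example below applies assumed net-to-gross (attribution) ratios to hypothetical gross energy savings. The program elements, savings values, and ratios are illustrative assumptions, not figures or prescriptions from the framework.

```python
# Illustrative only: compute program-attributable energy savings by applying
# assumed net-to-gross ratios to gross savings. Values and structure are
# hypothetical, not prescribed by the framework.

# Hypothetical gross annual savings (MWh) by program element.
gross_savings_mwh = {
    "lighting_retrofits": 12_000,
    "motor_upgrades": 8_500,
    "building_audits": 4_200,
}

# Assumed net-to-gross ratios accounting for free riders and spillover.
net_to_gross = {
    "lighting_retrofits": 0.70,
    "motor_upgrades": 0.85,
    "building_audits": 0.60,
}

net_savings = {
    element: gross * net_to_gross[element]
    for element, gross in gross_savings_mwh.items()
}

for element, savings in net_savings.items():
    print(f"{element}: {savings:,.0f} MWh attributable to the program")
print(f"Total net savings: {sum(net_savings.values()):,.0f} MWh")
```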
(7) Impact Evaluation Framework for Technology Deployment Programs: An approach for quantifying retrospective energy savings, clean energy advances, and market effects. (An Overview and Example). 2007.
Prepared by: John H. Reed (Innovologie LLC), Gretchen Jordan (Sandia National Laboratories) and Edward Vine (Lawrence Berkeley National Laboratory)
This is a twelve-page overview and an application example for the larger report "Impact Evaluation Framework for Technology Deployment Programs: An approach for quantifying retrospective energy savings, clean energy advances, and market effects."
(8) "Value of Program Evaluation" Case Study Series: Sponsored by DOE. 2009.
- Introduction to the "Value of Program Evaluation" Case Study Series, November 2009
- "DOE Hydrogen Program Saved Nearly $30 Million by Investing in Annual In-Progress Peer Reviews" (U.S. DOE, Energy Efficiency & Renewable Energy)
- "Evaluation Helps Program Increase Sales of Energy Saving Light Bulbs Among Women" (ENERGY STAR® Products Program, Wisconsin Focus on Energy)
- "Evaluation Helps Pesticide Program Finish Project Four Years Sooner Than Estimated" (Office of Pesticide Programs, U.S. EPA)
- "Evaluation Prompts ENERGY STAR Program to Replace Web Tool, Saving 90 Percent of Annual Costs" (ENERGY STAR Residential Program, U.S. EPA)
- "Programs Streamline Process, Add Customers More Quickly After Implementing Evaluation Recommendations" (Empower Programs, Hydro-Québec)
(9) A Proposed Methodology to Determine the Leverage Impacts of Technology Deployment Programs. 2008.
Prepared by: James L. Wolf, for the U.S. Department of Energy, Washington, D.C.
This report describes a leverage-estimation methodology relevant and useful to technology deployment programs in EERE. The recommended methodology is intended to support the development of new standards for calculating leverage that would be deemed valid and defensible not only within EERE, but also across technology deployment programs in the public sector generally.
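The report defines its own recommended methodology; purely as an illustration of the underlying idea, the sketch below computes a simple leverage ratio as non-federal dollars invested per federal program dollar, using hypothetical figures. Both the figures and this simple definition of leverage are assumptions for illustration, not the report's recommended standard.

```python
# Illustrative only: a simple leverage ratio (non-federal dollars per federal dollar).
# Figures are hypothetical; the report's recommended methodology may define and
# attribute leverage differently.

federal_program_spending = 10.0       # $M of EERE program funds
cost_share = 4.0                      # $M of required partner cost share
follow_on_private_investment = 26.0   # $M of subsequent private investment judged attributable

non_federal_total = cost_share + follow_on_private_investment
leverage_ratio = non_federal_total / federal_program_spending

print(f"Non-federal investment: ${non_federal_total:.1f}M")
print(f"Leverage ratio: {leverage_ratio:.1f} non-federal dollars per federal dollar")
```

The key judgment in any leverage calculation of this kind is which non-federal investments can defensibly be attributed to the program, which is the central question the report's methodology addresses.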
(10) Stage Gate Review:
- Advanced Manufacturing Office's Stage-Gate Innovation Management Guidelines. 2007. Prepared by The National Renewable Energy Laboratory (NREL)
- Stage Gate Management in the Biomass Program. 2005. Revision 2. Prepared by The National Renewable Energy Laboratory (NREL)
Note: The Stage Gate guidance documents were prepared by NREL for the EERE Industrial Technologies Program (now Advanced Manufacturing Office) and Biomass (now Bioenergy) Program (sponsors).
(11) Quality Assurance for General Program Evaluation Studies. 2006.
The purpose of this document is to ensure that general program evaluation studies are performed well and have credibility with program managers, EERE senior management, the Office of Management and Budget (OMB), Congress, and other stakeholders. It applies to general program evaluation studies (including outcome, impact, and cost-benefit evaluations; market assessments; and any non-peer-review process evaluations) sponsored by EERE programs and offices.
Evaluation Journals
- American Journal of Evaluation
- Evaluation: The International Journal of Theory, Research and Practice
- Evaluation and Program Planning
- Evaluation Review: A Journal of Applied Social Research
- New Directions for Evaluation
Evaluation Data Systems
The U.S. Department of Energy (DOE) has several database tools available to help programs analyze the knowledge creation and dissemination accomplishments associated with Science and Technology (S&T) and Research and Development (R&D) programs. The Visual Patent Search tool was developed by DOE's Office of Energy Efficiency and Renewable Energy (EERE) and is now managed by DOE's Office of Technology Transitions. The DOE Patents database was developed by DOE's Office of Scientific and Technical Information (OSTI). A simple example of analyzing such patent records follows the list below.
- Visual Patent Search: In 2011, DOE announced the launch of the Visual Patent Search website. The site provides a facilitated search of published U.S. patent applications and issued U.S. patents created using Department of Energy funding.
- DOE Patents: In 2007, DOE announced the launch of the DOE Patents website, hosted by OSTI, which allows search and retrieval of information from a collection of more than 20,000 patent records.
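Neither database prescribes a particular analysis workflow; as a minimal sketch of the kind of knowledge-output analysis mentioned above, the example below tallies patent records by year from a downloaded export. The file name and column names are hypothetical placeholders, not the actual export format of either system.

```python
# Illustrative only: tally DOE-funded patent records by year from a CSV export.
# "doe_patents_export.csv" and its column names ("patent_number", "issue_year")
# are hypothetical placeholders, not the actual export schema of either database.

import csv
from collections import Counter

counts_by_year = Counter()
with open("doe_patents_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        counts_by_year[row["issue_year"]] += 1

for year in sorted(counts_by_year):
    print(f"{year}: {counts_by_year[year]} patents")
```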
Evaluation Training Opportunities
- American Evaluation Association (AEA) training workshops
- Claremont Evaluation Center’s Professional Development Workshop Series in Evaluation and Applied Research Methods
- The EnCompass Learning Center
- University programs (graduate programs around the U.S.)
- Eastern Evaluation Research Society annual conference workshops
- EPA logic model training
- International Energy Program Evaluation Conference (IEPEC) training workshops
- OMB and GSA Office of Evaluation Sciences (OES) Evaluation and Evidence Training Series
- The Evaluators’ Institute, Washington, DC; Claremont, CA; and Chicago, IL.
- Washington Evaluators brownbag seminars