This section provides a variety of useful evaluation resources, including the following:

EERE Evaluation Resource Documents

Listed below are evaluation resource documents from the Office of Energy Efficiency and Renewable Energy (EERE).

(1) EERE Impact Evaluation Method Guide for Justice40, Equity, and Workforce Diversity Goals. December 2023.

Prepared by: EERE Strategic Analysis Office and Booz Allen Hamilton

This method guide describes an approach to estimating the impact of EERE's multi-billion-dollar investments in Justice40; diversity, equity, inclusion, and accessibility (DEIA); and workforce goals on populations in disadvantaged communities and at minority-serving universities. The Guide describes the EERE portfolios of interest, key evaluation questions and metrics of interest, treatment and comparison groups, potential limitations, and recommendations for future evaluators who may conduct impact evaluation studies. It recommends that the study based on this method be a longitudinal quasi-experimental evaluation: the evaluation contractor conducting the impact study is expected to implement a quasi-experimental design using matched comparison groups and pre-post difference-in-differences (DID) estimation, which is believed to be the most feasible research design. Evaluation questions would then be answered through statistical analysis such as comparison of means, significance testing, ANOVA, and regression.
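To make the recommended design concrete, here is a minimal sketch of a two-period difference-in-differences estimate, assuming a pandas/statsmodels workflow with hypothetical data and variable names (outcome, treated, post); it illustrates the technique, and is not taken from the Guide itself.

```python
# Minimal difference-in-differences (DID) sketch. The data and variable
# names are hypothetical; a real study would use the matched treatment
# and comparison groups described in the Guide.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: outcome observed before (post=0) and after (post=1)
# the intervention for treated (treated=1) and comparison (treated=0) units.
df = pd.DataFrame({
    "outcome": [10.0, 11.0, 10.5, 12.0, 10.2, 10.9, 13.5, 14.2],
    "treated": [0, 0, 1, 1, 0, 0, 1, 1],
    "post":    [0, 0, 0, 0, 1, 1, 1, 1],
})

# "treated * post" expands to treated + post + treated:post; the coefficient
# on the interaction term is the DID estimate (the pre-post change in the
# treated group minus the pre-post change in the comparison group).
model = smf.ols("outcome ~ treated * post", data=df).fit()
print(model.params["treated:post"])
```

Comparison of means, significance tests, ANOVA, and richer regression specifications (for example, adding covariates from the matching step) would follow the same general pattern.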

(2) Peer Review Guidance (EERE G 413.001). December 2020.

This document provides the minimum requirements for EERE Offices to develop and maintain In-Progress Peer Reviews as part of their program management process. Considerable flexibility is necessary in how EERE Offices conduct Peer Reviews; to ensure independence, quality, and rigor while still allowing that flexibility, minimum requirements are set as key guides for EERE's practice of Peer Review. This guidance provides information and examples useful for planning, conducting, and utilizing Peer Reviews based on best practices. Best practices are those that are (1) utilized with the most success by EERE's own programs or by other institutions, or (2) identified as such by multiple widely recognized experts outside of EERE, including experts at the Government Accountability Office (GAO) and the Office of Management and Budget (OMB). The guidance describes EERE peer review minimum requirements in the following planning and implementation areas:

  • Scope and Frequency of Review
  • Timely Preparation
  • Core Evaluation Criteria
  • Independent External Reviewers
  • Reviewer Data Collection
  • Production and Distribution of the Reviewer Report and Peer Review Report
  • Peer Review Record Maintenance and Post-Review Feedback Evaluation

(3) Project Manager’s Guide to Managing Impact and Process Evaluation Studies. August 2015.

Prepared by: Yaw O. Agyeman (Lawrence Berkeley National Laboratory) and Harley Barnes (Lockheed Martin)

This report provides a step-by-step approach to help managers of EERE evaluation projects create and manage objective, high-quality, independent, and useful impact and process evaluations. It provides information to help with the following:

  • Determine why, what, and when to evaluate
  • Identify the questions that need to be answered in an evaluation study
  • Specify the type of evaluation(s) needed
  • Hire a qualified independent third-party evaluator
  • Monitor the progress of the evaluation study
  • Implement credible quality assurance (QA) protocols
  • Ensure the evaluation report presents accurate and useful findings and recommendations
  • Ensure that the findings get to those who need them
  • Ensure findings are put to appropriate use

(4) A Framework for Evaluating R&D Impacts and Supply Chain Dynamics Early in a Product Life Cycle. 2014.

Prepared by: Gretchen Jordan (360 Innovation LLC), Jonathan Mote (George Washington University), Rosalie Ruegg (TIA Consulting Inc.), Thomas Choi (Arizona State University), and Angela Becker-Dippmann (Pacific Northwest National Laboratory)

This report provides a framework for evaluating R&D investments aimed at accelerating the pace of innovation and strengthening domestic manufacturing and supply chains. The framework described in this report provides a view of the dynamics unfolding in the "black box of innovation" during the early phases of the product life cycle. This early period of focus can be contrasted with the long-term period of impact evaluation, which seeks to measure ultimate results.

The framework helps users understand, measure, and enhance the ingredients and early processes that will determine long-term impact. It adds analysis of product value chain networks to the evaluators' toolbox as a means of assessing early changes in a targeted product's domestic supply chain and value chain. The framework identifies core progress and impact metrics for analyzing changes in a product value chain, and it provides an approach for assessing DOE attribution in detail if warranted and feasible.

(5) Evaluating Realized Impacts of DOE/EERE R&D Programs. Final Report. 2014.

Prepared by: Rosalie Ruegg (TIA Consulting Inc.), Alan C. O'Connor (RTI International), and Ross J. Loomis (RTI International)

This document provides guidance for evaluators who conduct impact assessments to determine the "realized" economic benefits and costs, energy and environmental benefits, and other impacts of EERE R&D programs. The approach described in this guide focuses primarily on realized results and the extent to which they can be attributed to the efforts of an R&D program.
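As a purely illustrative sketch of the discounted benefit-cost arithmetic that retrospective assessments of this kind typically involve (the cash flows, discount rate, and helper function below are assumptions, not taken from the guide):

```python
# Hedged sketch: net present value (NPV) and benefit-to-cost ratio for a
# hypothetical stream of program-attributed benefits and program costs.

def present_value(cash_flows, rate):
    """Discount annual cash flows (year 0 first) to present value."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

benefits = [0.0, 5.0, 12.0, 20.0, 25.0]  # hypothetical realized benefits ($M/yr)
costs = [10.0, 8.0, 2.0, 1.0, 1.0]       # hypothetical program costs ($M/yr)
rate = 0.07                               # illustrative real discount rate

pv_benefits = present_value(benefits, rate)
pv_costs = present_value(costs, rate)
print(f"NPV ($M): {pv_benefits - pv_costs:.1f}")
print(f"Benefit-to-cost ratio: {pv_benefits / pv_costs:.2f}")
```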

(6) Overview of Evaluation Methods for R&D Programs. 2007.

Prepared by: Rosalie Ruegg and Gretchen Jordan (Sandia National Laboratories)

This booklet introduces managers to a variety of methods for evaluating R&D programs. It provides an overview of 14 evaluation methods that have proven useful to R&D program managers in federal agencies. Each method is briefly defined, its uses are explained, its limitations are listed, examples of successful use by other R&D managers are provided, and references are given.

(7) Impact Evaluation Framework for Technology Deployment Programs: An approach for quantifying retrospective energy savings, clean energy advances, and market effects. 2007. (Main Report).

Prepared by: John H. Reed (Innovologie LLC), Gretchen Jordan (Sandia National Laboratories) and Edward Vine (Lawrence Berkeley National Laboratory)

This document describes a framework for evaluating the retrospective impact of technology deployment programs. Program managers and evaluators in federal, state, and local governments and in public entities and institutions are increasingly accountable for delivering and demonstrating results of their programs. The impact framework assists program managers and evaluators in developing and implementing better and more cost-effective evaluations.

The seven-step process, the generic templates, the generic evaluation questions, and the evaluation designs that make up the framework can be used to develop powerful and meaningful impact evaluations to refine programs, increase program effectiveness, make the tough decisions to drop ineffective program elements, and develop credible evidence that communicates the value of the program to stakeholders.

An important focus of the framework is increasing understanding of the linkages between program outputs and short-term and long-term outcomes (impacts). It simplifies and enriches the process of describing and developing measures of target-audience response to program outputs, designing sound evaluations, and taking credit for all effects that are attributable to the program. Created for EERE, the framework can be applied to most deployment programs in a broad array of disciplines.
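To illustrate the attribution step in the simplest terms, the sketch below applies a net-to-gross adjustment to a gross energy-savings estimate; the factor names and all values are hypothetical assumptions for illustration, not prescriptions from the framework.

```python
# Illustrative net-to-gross (NTG) adjustment for a deployment program.
# All names and values here are hypothetical.
gross_savings_mwh = 100_000.0  # evaluated gross annual energy savings
free_ridership = 0.20          # share of savings that would have occurred anyway
spillover = 0.05               # program-induced savings beyond direct participants

net_to_gross = 1.0 - free_ridership + spillover
net_savings_mwh = gross_savings_mwh * net_to_gross
print(f"Program-attributable savings: {net_savings_mwh:,.0f} MWh/yr")
```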

(8) Impact Evaluation Framework for Technology Deployment Programs: An approach for quantifying retrospective energy savings, clean energy advances, and market effects. (An Overview and Example). 2007.

Prepared by: John H. Reed (Innovologie LLC), Gretchen Jordan (Sandia National Laboratories) and Edward Vine (Lawrence Berkeley National Laboratory)

This is a twelve-page overview of, and an application example for, the larger report "Impact Evaluation Framework for Technology Deployment Programs: An approach for quantifying retrospective energy savings, clean energy advances, and market effects."

(9) "Value of Program Evaluation" Case Study Series: Sponsored by DOE. 2009.

(10) A Proposed Methodology to Determine the Leverage Impacts of Technology Deployment Programs. 2008.

Prepared by: James L. Wolf for the U.S. Department of Energy, Washington, D.C.

This report describes a leverage-estimation methodology that is relevant and useful to technology deployment programs in EERE. The recommended methodology is intended to support the development of new standards for calculating leverage that would be deemed valid and defensible not only within EERE, but also across technology deployment programs throughout the public sector.
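As a toy illustration of the quantity at issue, a leverage ratio divides the non-federal funds attributable to a program by the program's own spending; deciding which dollars may defensibly be counted is precisely what the methodology addresses, so the inputs below are hypothetical.

```python
# Toy leverage calculation with hypothetical inputs.
program_funds = 5.0   # federal program spending ($M)
partner_funds = 20.0  # non-federal funds judged attributable to the program ($M)

print(f"Leverage ratio: {partner_funds / program_funds:.1f} to 1")
```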

(11) Stage Gate Review:

Note: The Stage Gate guidance documents were prepared by NREL for the sponsoring EERE Industrial Technologies Program (now the Advanced Manufacturing Office) and Biomass (now Bioenergy) Program.

(12) Quality Assurance for General Program Evaluation Studies. 2006.

The purpose of this document is to ensure that general program evaluation studies are performed well and have credibility with program managers, EERE Senior Management, the Office of Management and Budget (OMB), Congress, and other stakeholders. It is applicable to general program evaluation studies (including outcome, impact, and cost-benefit evaluations; market assessments; and any non-peer-review process evaluations) sponsored by EERE programs and offices.



Evaluation Journals


Evaluation Data Systems

The U.S. Department of Energy (DOE) has several database tools available to help programs analyze the knowledge creation and dissemination accomplishments associated with the efforts of Science and Technology (S&T) and Research and Development (R&D) programs. The Visual Patent Search tool was developed by DOE's Office of Energy Efficiency and Renewable Energy (EERE) and is now managed by the DOE Office of Technology Transitions. The DOE Patents database was developed by DOE's Office of Scientific and Technical Information (OSTI).

  • Visual Patent Search: In 2011, DOE launched the Visual Patent Search website, a tool that facilitates searching the patent content of published U.S. patent applications and issued U.S. patents created using Department of Energy funding.
  • DOE Patents: In 2007, DOE launched DOE Patents, a website hosted by OSTI that allows search and retrieval of information from a collection of more than 20,000 patent records.


Evaluation Training Opportunities


Related

Program Evaluation
Learn how evaluations are used to assess whether EERE programs met planned technical goals and achieved commercialization and market results.
Types of Evaluations
EERE uses several types of evaluation to quantify impacts, assess progress, and promote improvement.