Human Capital: Selected Agencies' Experiences and Lessons Learned
in Designing Training and Development Programs (30-JAN-04,	 
GAO-04-291).							 
                                                                 
Effective training and development programs are an integral part 
of a learning environment, helping improve federal workforce	 
performance in achieving agency results. Therefore, in this	 
report GAO was asked to identify examples of selected federal	 
agencies' experiences and some of the key lessons they have	 
learned in designing their training and development programs.	 
This work focused on ways that these agencies (1) assessed agency
skills gaps and identified training needs, (2) developed	 
strategies and solutions for these training and development	 
needs, and (3) determined methods to evaluate the effectiveness  
of training and development programs. GAO worked with five	 
agencies to identify their experiences and lessons learned: the  
U.S. Army Corps of Engineers (USACE), Department of Defense; Fish
and Wildlife Service (FWS), Department of the Interior		 
(Interior); Internal Revenue Service (IRS), Department of the	 
Treasury; the Office of Personnel Management (OPM); and Veterans 
Health Administration (VHA), Department of Veterans Affairs (VA).
Agency officials provided information during interviews and	 
furnished supporting documentation for analysis and review.	 
-------------------------Indexing Terms-------------------------
REPORTNUM:  GAO-04-291
    ACCNO:  A09182
    TITLE:  Human Capital: Selected Agencies' Experiences and Lessons
            Learned in Designing Training and Development Programs
     DATE:  01/30/2004
  SUBJECT:  Federal agencies
            Federal employees
            Personnel management
            Strategic planning
            Training utilization
            Best practices
            Best practices reviews
            Employment or training programs

United States General Accounting Office

GAO	Report to the Chairman, Subcommittee on Oversight of Government Management,
 the Federal Workforce, and the District of Columbia, Committee on Governmental
                              Affairs, U.S. Senate

January 2004

HUMAN CAPITAL

  Selected Agencies' Experiences and Lessons Learned in Designing Training and
                              Development Programs


GAO-04-291

Highlights of GAO-04-291, a report to the Chairman, Subcommittee on
Oversight of Government Management, the Federal Workforce, and the
District of Columbia, Committee on Governmental Affairs, U.S. Senate

Effective training and development programs are an integral part of a
learning environment, helping improve federal workforce performance in
achieving agency results. Therefore, in this report GAO was asked to
identify examples of selected federal agencies' experiences and some of
the key lessons they have learned in designing their training and
development programs. This work focused on ways that these agencies (1)
assessed agency skills gaps and identified training needs, (2) developed
strategies and solutions for these training and development needs, and (3)
determined methods to evaluate the effectiveness of training and
development programs.

GAO worked with five agencies to identify their experiences and lessons
learned: the U.S. Army Corps of Engineers (USACE), Department of Defense;
Fish and Wildlife Service (FWS), Department of the Interior (Interior);
Internal Revenue Service (IRS), Department of the Treasury; the Office of
Personnel Management (OPM); and Veterans Health Administration (VHA),
Department of Veterans Affairs (VA). Agency officials provided information
during interviews and furnished supporting documentation for analysis and
review.

www.gao.gov/cgi-bin/getrpt?GAO-04-291.

To view the full product, including the scope and methodology, click on
the link above. For more information, contact George Stalcup at (202)
512-6806 or [email protected].

January 2004

HUMAN CAPITAL

Selected Agencies' Experiences and Lessons Learned in Designing Training and
Development Programs

GAO identified important lessons learned from five federal agencies'
experiences in designing training and development programs for their
employees that could be useful to other agencies facing similar
challenges. These lessons relate to three areas: (1) assessing skill and
competency requirements and identifying related training needs, (2)
developing strategies and solutions for training and development needs,
and (3) determining methods to evaluate the effectiveness of training and
development programs.

Four of the five agencies provided comments on a draft of this report.
Interior and VA said that they generally agreed with the report's findings
regarding their respective agencies. IRS and OPM said that they
appreciated the opportunity to be included in the report and to share
information on training activities. USACE provided no comments on the
draft report.

Contents

Letter

Results in Brief
Background
Agencies Used Varied Approaches in Assessing Skills and Competencies and
  Identifying Related Training Needs
Agencies Developed Strategies and Solutions for Their Training Needs
Agencies Are Considering More Sophisticated Evaluation Approaches as Part
  of Designing Their Training and Development Programs
Conclusions and Observations
Agency Comments and Our Evaluation

Appendixes

Appendix I:    Objective, Scope and Methodology
Appendix II:   Background on Selected Agencies and Their Training and
               Development Functions
Appendix III:  Core Characteristics of a Strategic Training and
               Development Process
Appendix IV:   Comments from the Department of the Interior
               GAO Comment
Appendix V:    Comments from the Internal Revenue Service
Appendix VI:   Comments from the Office of Personnel Management
Appendix VII:  GAO Contacts and Staff Acknowledgments
               GAO Contacts
               Acknowledgments

Figures

Figure 1: Four Components of a Strategic Training and Development Process
Figure 2: Strategic Workforce Planning Process
Figure 3: OPM's Five-step Workforce Planning Process
Figure 4: IRS's Core Management Responsibilities and Leadership
          Competencies
Figure 5: Competencies in VHA's High Performance Development Model
Figure 6: Steps for Developing Strategies and Solutions for Training and
          Development Needs
Figure 7: Steps in Determining Methods for Evaluating Training Programs

This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed in
its entirety without further permission from GAO. However, because this
work may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this material
separately.


United States General Accounting Office Washington, D.C. 20548

January 30, 2004

The Honorable George V. Voinovich

Chairman

Subcommittee on Oversight of Government Management, the Federal
  Workforce, and the District of Columbia
Committee on Governmental Affairs
United States Senate

Dear Mr. Chairman:

To effectively address the nation's most urgent priorities and take
advantage of emerging opportunities, federal agencies need to continue to
build their fundamental capabilities to carry out their work in an
environment that is increasingly complex and rapidly changing. To build
their capacity, agencies should invest resources wisely to ensure that
their employees have the information, skills, and competencies they need
to succeed. As you are well aware, these investments must include training
and development efforts to continuously enhance the skills and
competencies of the federal workforce and improve the quality of agencies'
results.

As agreed with your office, this report provides information on selected
federal agencies' experiences and lessons learned in key aspects of
designing training and development programs for their employees.
Specifically, we focused on the agencies' experiences and lessons learned
related to

o 	assessing current and future agency skill and competency requirements
and identifying related training and development needs,

o 	developing strategies and solutions for training and development needs,
and

o 	determining methods to evaluate the effectiveness of training and
development programs.

For this review, lessons learned were defined as knowledge that could be
applied in the future that the agencies gained through either positive or
negative experiences. The experiences and lessons learned from the five
agencies we reviewed may well provide some valuable ideas and useful
approaches that could be adopted by other federal agencies as they attempt
to address ongoing training and development challenges-particularly those
related to the elements within the training process that relate to design
and development.

To address these issues and as agreed with your office, we focused our
review on five federal agencies: U.S. Army Corps of Engineers (USACE),
Department of Defense; Fish and Wildlife Service (FWS), Department of the
Interior (Interior); Internal Revenue Service (IRS), Department of the
Treasury; Office of Personnel Management (OPM); and Veterans Health
Administration (VHA), Department of Veterans Affairs (VA). At the
headquarters of the five agencies-and in some field locations-we collected
and reviewed documents on the agencies' training and development efforts
and interviewed officials from the agencies' human capital and training
organizations, as well as assorted program offices. We selected the five
agencies for various reasons, including your office's interests, the
diversity of employee occupations within the agency, and reported
innovative approaches for training and developing their employees. This
agency selection process was not designed to identify examples that could
be considered representative of all training and development efforts at
the five agencies or the federal government as a whole. Furthermore, in
citing examples that relate to the lessons learned on the design of
training, we did not assess the effectiveness of these training programs
and practices. Rather, we attempted to highlight some of the experiences
and lessons that the agencies found helped them move forward in improving
their training and development programs. Federal agencies' training and
development strategies, and how they are designed to operate in
conjunction with other strategies to improve individual and organizational
performance, continue to change and evolve.

We conducted our review from August 2002 through November 2003 in
accordance with generally accepted government auditing standards. See
appendix I for additional information on our objective, scope, and
methodology.

Results in Brief

Officials from the five agencies in our review recognized the importance
of assessing current and future agency skill and competency requirements
to identify related workforce training needs. The agencies used several
approaches to design training and development programs that focused on
the skills and competencies their assessments indicated needed enhanced
attention. The officials emphasized that agencies are transitioning to
more formal and comprehensive planning approaches to assess skill and
competency requirements and identify related training and development
needs-primarily as part of broader efforts to incorporate workforce
planning into ongoing strategic planning and budgeting processes focused
on achieving results. The following are lessons learned identified during
this review related to assessing skill and competency requirements and
identifying training needs.

o 	Involve key stakeholders and benchmark with other organizations when
identifying skills and competencies to help ensure that training and
development programs are aligned with current and emerging needs and
business practices.

o 	Analyze existing agency data on employees' skills and competencies and
information from performance appraisals to help identify skills and
competencies that need to be addressed throughout the agency as well as on
an individual basis.

o 	Link the agency's workforce planning efforts with training needs
assessments to ensure consistency and enhance strategic alignment.

o 	Consider the training needs of staff from other organizations that will
likely use the agency's training programs or facilities to effectively
leverage training investments and meet diverse needs.

These agencies developed a wide range of strategies and solutions to
improve performance through designing training and development programs
for their employees. Officials told us they considered a mixture of both
on-the-job and other developmental programs and contemplated an assortment
of mechanisms for delivering the training as well as potential sources
for meeting the learning needs. Agency officials have found projecting
costs and
benefits of proposed training and development programs to be very
challenging. Although they sometimes developed broad information on
anticipated benefits and expected costs, this often did not involve tying
anticipated benefits to specific performance improvements or considering
all related costs. Our review identified the following lessons learned in
developing strategies and solutions when designing agency training and
development programs.

o 	Incorporate information on employees' various competency levels and job
needs into the design of training and development programs to increase
their relevancy and timeliness.

o 	Assess options for using other organizations' course content, staff,
services, or facilities when designing a new training and development
program in order to develop efficient and cost-effective strategies.

o 	Establish mechanisms and controls to avoid unnecessary duplication or
inconsistency within and across agencies' training efforts.

o 	Develop and use criteria for determining the optimal mix of delivery
mechanisms to use in order to select the most effective approaches given
each learning situation.

o 	Ensure that employees have the needed equipment and technologies so
that they can take maximum advantage of learning opportunities.

o 	Plan early when developing integrated solutions that complement other
planned and ongoing strategies to improve performance so that when
implemented the strategies work effectively and are aligned to help
achieve agency goals.

o 	Plan for the direct participation of senior agency leaders and
experienced staff in the delivery of training and development programs to
increase buy-in and build support for organizational change.

Evaluating training programs is key to ensuring that training and
development programs are effective. Overall, the five agencies relied
primarily on standard end-of-course evaluations to obtain the
participants' reaction to, and satisfaction with, a specific training
course or learning opportunity. However, officials said that they have
begun or are planning to use more comprehensive and sophisticated
techniques for assessing the extent to which training and development
programs increased employees' knowledge and skills and enhanced individual
and organizational performance. These techniques included pre- and
post-testing, tracking changes in individual and program performance, and
some limited use of return-on-investment (ROI) analyses. The lessons
agencies learned in designing methods to evaluate training and development
programs included the following.

o 	Incorporate appropriate aspects of the evaluation approach when
designing training and development programs by specifying what results are
expected to better ensure the availability and use of quality performance
data.

o 	Consider new approaches for collecting and analyzing performance data
with the aim of increasing the quality and quantity of training evaluation
feedback.

o 	Plan for the use of multiple data types and sources to provide a
balanced approach in assessing the effectiveness of training and
development programs.

o  Take into account all relevant factors for determining the costs of a
training and development program to better ascertain whether it is
cost-effective in relation to benefits achieved.

Four of the five selected agencies provided comments on a draft of this
report. Interior and VA said that they generally agreed with the report's
findings relating to their respective agencies. IRS said that it was
honored to share some of its lessons learned with us for governmentwide
dissemination. OPM said that it appreciated the opportunity to be included
in the report and to share information on its training and development
activities and programs. USACE informed us that it had no comments on our
draft report.

Background

We recently issued an exposure draft of an assessment guide that
introduces a framework for evaluating a federal agency's training and
development efforts.1 This assessment guide consists of a set of
principles and key questions that federal agencies can use to ensure that
their training and development investments are targeted strategically and
are not wasted on efforts that are irrelevant, duplicative, or
ineffective. As detailed in our assessment guide, the training and
development process can loosely be segmented into four broad, interrelated
components: (1) planning/front-end analysis, (2) design/development, (3)
implementation, and (4) evaluation. Figure 1 depicts an overview of this
process along with the general relationships between the four components
that help to produce a strategic approach to federal agencies' training
and development efforts. Although these components can be discussed
separately, they are not mutually exclusive and encompass subcomponents
that may blend with one another. For instance, evaluation is an integral
part of the planning/front-end analysis as agencies strive to reach
agreement up front on how the success of various strategies to improve
performance, including training and development efforts, will be assessed.
As noted in the assessment guide, agencies can build on lessons learned
and performance data and feedback from previous experiences. This report
can provide a starting point for agencies to use to build on the
experiences and lessons learned by the five agencies we reviewed as part
of their efforts to design and develop training and development programs.
(See app. II for a description of the five agencies included in this
study.)

1 U.S. General Accounting Office, Human Capital: A Guide for Assessing
Strategic Training and Development Efforts in the Federal Government
-Exposure Draft, GAO-03-893G (Washington, D.C.: July 2003).

Figure 1: Four Components of a Strategic Training and Development Process

                                  Source: GAO.

Our assessment guide also summarizes our observations on the core
characteristics that make a training and development process effective and
strategically focused on achieving results. These eight core
characteristics are described in more detail in appendix III, and include

o  strategic alignment,

o  leadership commitment and communication,

o  stakeholder involvement,

o  accountability and recognition,

o  effective resource allocation,

o  partnerships and learning from others,

o  data quality assurance, and

o  continuous performance improvement.

A concerted effort to integrate these core characteristics can further an
agency's efforts to continually improve its training and development
process.

Agencies Used Varied Approaches in Assessing Skills and Competencies and
Identifying Related Training Needs

Federal agencies face diverse challenges in their efforts to identify and
measure the skills and competencies that their employees must possess to
support missions and goals. Officials from the five agencies in our review
recognized the importance of assessing the need for specific skills and
competencies now and in the future in order to identify related workforce
training needs. These agencies generally focused on the desired
performance of the agency and its employees, determined the difference
between the desired and actual skill levels, and attempted to identify the
key factors contributing to performance, including the need for enhanced
workforce competencies. Officials used a variety of approaches and tools
to assist in determining the human capital skills and competencies that
are critical to achieving their long-term goals.

An agency's ultimate goal in undertaking training and development efforts
is, of course, to optimize employee and organizational performance. To
help ensure that each training program is linked to improving individual
and agency performance, agencies first need to analyze their strategic and
performance goals so that they can determine where training and
development can most effectively enhance goal achievement.

Organizations can evaluate the extent to which human capital approaches
support their accomplishment of current, emerging, and future strategic
goals through the use of workforce planning. Workforce planning focuses on
determining the skills and competencies needed now and in the future to
meet the agency's goals; identifying the current and projected level of
the skills and competencies of the workforce; and crafting strategies for
acquiring, developing, and retaining people to address any identified
needs. These needs include the knowledge, skills, and abilities needed for
the agency to pursue its current and future mission as well as the size of
the workforce and its deployment across the organization. After
identifying the skills and competencies that employees need now or in the
future, agencies must tackle the challenge of determining what combination
of strategies to use, such as hiring new employees with needed skills and
competencies, relying on outsourcing, and/or enhancing employees' skills
and competencies through training and development. While agencies'
approaches to workforce planning will vary, we have identified the need
for a strategic workforce planning process to ensure that each agency's
human capital program capitalizes on its workforce's strengths and
addresses related challenges in a manner that is clearly linked to
achieving the agency's missions and goals.2 Figure 2 presents a model of
this strategic workforce planning process.

2 U.S. General Accounting Office, Human Capital: Key Principles for
Effective Strategic Workforce Planning, GAO-04-039 (Washington, D.C.: Dec.
11, 2003).

                 Figure 2: Strategic Workforce Planning Process

                                  Source: GAO.

The focus in this report is on the workforce strategies that involve the
design of training and development programs.

Agencies' Experiences in Assessing Skill and Competency Requirements and
Identifying Training Needs

The five agencies used several approaches to design training and
development programs that focused on skills and competencies their
assessments indicated needed enhanced attention. One common approach that
officials used to help identify training needs was interviewing or
surveying managers, supervisors, and employees. Agencies also established
councils and held conferences, made comparisons with leading organizations
through benchmarking, and analyzed workforce data and trends.

OPM's 2001 skills assessment, for example, relied on a survey of agency
managers and supervisors.3 Officials used the survey results to identify
the most important occupational competencies needed to achieve OPM's
mission, the level at which employees possessed those competencies, and
the level at which they would be needed in the future. Their analysis
identified minor gaps in the level of competencies needed for both current
and future work in mission critical occupations. In addition, the analysis
pointed out more serious developmental needs for OPM's retirement and
insurance benefits specialists. These needs were related to the changing
role of these specialists, who increasingly need to work more closely with
clients in responding to complex issues.

Also in 2001, IRS established a workforce planning council consisting of
senior management representatives from each of the agency's operating
divisions. IRS officials told us that this council has become the primary
vehicle for communicating workforce planning information among IRS's four
operating divisions. At FWS, the human resources office hosted a 3-day
workforce planning conference in 2002 to draw on the experience and
expertise of agency personnel in identifying critical workforce issues for
the agency for the next 3 to 5 years. Managers and program experts
representing all eight major FWS program offices, seven field regions, and
headquarters offices participated.

Agencies also compared their performance and needed skills and
competencies with leading organizations through benchmarking. In August
2000, VHA commissioned an internal task force charged with developing a
well-defined, comprehensive succession plan for the agency. The ideas
garnered from benchmarking led to VHA establishing an expectation for
agency leaders to help identify and train their successors. The task
force's December 2001 report presented a comprehensive succession plan for
VHA, and implementing a comprehensive leadership development program was
one of the six major components of this plan.

3 In providing technical comments on a draft of this report, OPM noted
that it conducted another skills assessment in late fiscal year 2003 to
reflect its recent restructuring and new strategic priorities.

Agency officials also analyzed workforce data to assess skills and help
identify training needs. Generally, they collected information on employee
demographics and retirement eligibility and used these data to project
attrition and retirement rates. OPM officials, for example, collected and
analyzed attrition and turnover data on the agency's senior executives
along with the distribution of current executives by unit and projected
retirements through 2010.4 They also collected and analyzed data on hires,
separations, and workforce diversity across the agency. As part of this
analysis, officials assessed the agency's use of contractors and
considered how sourcing alternatives could affect OPM's plans for hiring,
training, and development.

Officials from the five agencies told us that they used a wide range of
resources and tools to assess skills and competencies as part of
identifying and designing needed training and development programs. They
used workforce planning models; assessed the workforce in view of
organizational, occupational, and unit-based competency standards;
conducted knowledge and skills inventories; and evaluated job performance
appraisals and information from individual development plans (IDP).5 To
identify needed executive competencies, for example, OPM used a five-step
workforce planning model that it had developed in its role of providing
human capital tools for use by other federal agencies (see fig. 3). The
resulting analysis called for enhancing leadership development within OPM.

4 In previous work, we identified practices used by agencies in other
countries to manage the succession of senior executives and other
employees with critical skills. See Human Capital: Insights for U.S.
Agencies from Other Countries' Succession Planning and Management
Initiatives, GAO-03-914 (Washington, D.C.: Sept. 15, 2003).

5 An IDP is a written plan, cooperatively prepared by the employee and his
or her supervisor, that outlines the steps the employee will take to
develop knowledge, skills, and abilities in building on strengths and
addressing weaknesses as he or she seeks to improve job performance and
pursue career goals. These individual development plans are also known as
personal development plans, personal training plans, and individual
training plans.

Figure 3: OPM's Five-step Workforce Planning Process

1. 	Set strategic direction to drive agency operations and define how the
agency will know when and if it has succeeded.

2. Analyze the workforce, identify skill gaps, and conduct workforce
analysis.

3. 	Develop a workforce action plan that lays out specific tasks and
actions the agency needs to take in order to achieve the agency's human
resources goals and objectives.

4. 	Implement the workforce action plan by executing the schedule that
includes measurable workforce goals and milestones.

5. Monitor progress, evaluate success, and revise plan as needed.

Source: OPM.

Officials said that their agencies have transitioned, or are in the
process of transitioning, to more comprehensive, consistent planning
approaches. This transition is coming about as agencies attempt to
institutionalize their workforce planning efforts as part of their ongoing
strategic planning and budgeting processes. In 2002, FWS conducted its
first formal, agencywide workforce planning process. Although FWS
initially employed a contractor to help develop a permanent workforce
planning process, it plans to continue to manage it in-house. This
workforce planning process is to be implemented on a 2-year cycle that is
integrated with the agency's strategic planning and budgeting processes.

Agencies' Lessons Learned in Assessing Skill and Competency Requirements
and Identifying Training Needs

By considering the viewpoints of a range of stakeholders and candidly and
openly assessing progress toward meeting their goals, agencies can help
ensure that their strategic and annual performance planning processes
adequately reflect current ideas, policies, and practices in the field.
Agencies continue to integrate workforce planning into these other
planning processes. It is important to note that a wide variety of
strategies other than training and development are also available to
agency leaders as they attempt to transform their cultures and
operations.6 Training and development is not always the best
solution-reengineering processes or other actions may be needed to build
an environment that effectively supports performance. In addition,
training and development strategies frequently need to be implemented in
conjunction with other initiatives, given that the day-to-day environment
and organizational culture may also need to change to enable employees to
successfully use new skills or competencies on the job.

6 For more information on key practices and implementation steps that can
help agencies transform their cultures, see U.S. General Accounting
Office, Results-Oriented Cultures: Implementation Steps to Assist Mergers
and Organizational Transformations, GAO-03-669 (Washington, D.C.: July 2,
2003).

Our review identified four lessons learned by the agencies related to
assessing skills and competencies to identify, focus, and prioritize
training needs.

Lesson learned: Involve key stakeholders and benchmark with other
organizations when identifying skills and competencies to help ensure that
training and development programs are aligned with current and emerging
needs and business practices.

Organizations in the private and public sectors have increasingly turned
to developing competency models that outline behaviorally defined skills
and competencies employees should possess and that can be tied directly to
training and development plans and programs. We have found that an
effective performance management system uses competencies to provide a
fuller assessment of performance.7

7 U.S. General Accounting Office, Results-Oriented Cultures: Creating a
Clear Linkage between Individual Performance and Organizational Success,
GAO-03-488 (Washington, D.C.: Mar. 14, 2003).

IRS involved key stakeholders and benchmarked with other organizations in
developing its leadership competency model

In working to build its leadership development program in the wake of the
IRS Restructuring and Reform Act of 1998, IRS officials believed they
needed a leadership competency model that was based directly on the work
of IRS's business units. To identify the essential characteristics that
enable IRS employees to function as effective leaders in the newly
modernized agency, human capital specialists at IRS conducted behavioral
interviews with 35 top IRS leaders in 1999, asking them to identify major
successes and challenges during their careers. Using information gathered
from these interviews, officials identified core management
responsibilities and corresponding competencies required for leaders in
IRS. With the assistance of a contractor, IRS validated the leadership
competency model by comparing it against leading practices in the public
and private sector and linking it to the mission and goals of the agency.8
The resulting competency model now forms the basis for IRS's leadership
development efforts, as well as how IRS selects, evaluates, and recognizes
its leaders. Figure 4 shows a listing of IRS's five core management
responsibilities and the 21 corresponding leadership competencies.

  Figure 4: IRS's Core Management Responsibilities and Leadership Competencies

Leadership: Adaptability; Communication; Decisiveness; Integrity/honesty;
Service motivation; Strategic thinking

Employee satisfaction: Continual learning; Developing others; Diversity
awareness; Group leadership; Teamwork

Customer satisfaction: Customer focus; Entrepreneurship; External
awareness; Influencing/negotiating; Partnering

Business results: Achievement orientation; Business acumen; Political
savvy; Problem solving; Technical credibility

Equal employment opportunity (EEO) and diversity: Supporting
competencies(a)

Source: IRS.

(a) Italicized competencies support the "EEO and diversity" responsibility.

8 For more information on performance management of senior executives at
IRS, see U.S. General Accounting Office, Results-Oriented Cultures: Using
Balanced Expectations to Manage Senior Executive Performance, GAO-02-966
(Washington, D.C.: Sept. 27, 2002).

VHA benchmarked with other organizations and used a pilot test in
developing its high performance development model

VHA conducted an extensive literature search and benchmarked with several
leading private sector firms (including Bell South, Coca-Cola, and
Motorola) to provide a foundation for its effort to create a new
competency model for VHA employees. VHA used this information in
developing its high performance development model, which was implemented
throughout the department in 2002. This model consists of eight core
competencies and related performance tools that represent the major skills
and competencies that employees need to fulfill VHA's mission. The model
was designed to serve as a framework for identifying and developing future
leaders, as well as to enhance development of VHA's entire workforce. VHA
said that using this model helped more effectively align training and
development programs with agency priorities. According to VHA, the fact
that the core competencies apply to all levels and functions within the
agency helps ensure alignment within and between organizational units and
is a key component in motivating sustained and improved performance. VHA
also uses the model on an individual employee basis as a process for
identifying specific developmental needs. Figure 5 lists the eight core
competencies in VHA's high performance development model.

Figure 5: Competencies in VHA's High Performance Development Model

o  Personal mastery
o  Technical skills
o  Interpersonal effectiveness
o  Customer service
o  Flexibility and adaptability
o  Creative thinking
o  Systems thinking
o  Organizational stewardship

Source: VHA.

Lesson learned: Analyze existing agency data on individual employees'
skills and competencies and information from performance appraisals to
help identify skills and competencies that need to be addressed throughout
the agency as well as on an individual basis.

To obtain a unit- or agencywide perspective of skills and competencies,
some agencies such as USACE and IRS have explored new ways of aggregating
data from tools that are primarily focused on individual employees, such
as IDPs, performance assessments, and 360-degree feedback instruments.9
Officials from these agencies told us that this information helped them
discern a clearer picture of the overall strengths and weaknesses of their
employees and offered direction in planning and designing training and
development programs to help focus efforts to enhance skills and
competencies throughout the agency.

USACE's automated training management program provides a Web-enabled
integrated database

According to USACE officials, using an automated training management
program has allowed managers to identify divisionwide gaps in workforce
skills and competencies. Using this system (currently in four of USACE's
eight divisions) employees prepare an IDP assessing their knowledge,
skills, and abilities in relation to a series of mission essential tasks.
With supervisory guidance, each task is identified as critical, important,
or beneficial and employees indicate whether they have received adequate,
partial, or no training in that area. With this assessment as a guide, the
supervisor and employee can consult the system's built-in course catalog
to select internal or external training to enhance the employee's
development. In addition, the system has the capability of
aggregating data. USACE officials said that this capability provides a
simple method for division managers to obtain a picture of the level of
skills and competencies in their workforce. This information informs
decisions on training priorities and helps managers determine the most
efficient use of available resources.

IRS aggregated data from 360-degree feedback instruments to help identify
training needs

To assess the progress and developmental needs of leaders within IRS, the
agency's leadership development office recently aggregated and analyzed
multiyear data from the 360-degree performance assessments of IRS
managers. This analysis helped to show areas of strength and weakness in
skills and competencies across the agency's managerial ranks. The director
of leadership and organizational effectiveness at IRS said that, in the
past, the agency did not sufficiently assist managers in effectively using
the 360-degree feedback they received. However, he said that IRS now
emphasizes the importance of using 360-degree feedback data, on both an
organizational and individual basis, to focus on strengths in developing
key leadership competencies. IRS's Extraordinary Leader Program involves
designing unique developmental approaches to help managers become more
effective leaders. Using the results of the manager's 360-degree
assessment, IRS creates a customized leadership development program
focusing first on correcting any "fatal flaw" weaknesses and then building
on the manager's demonstrated strengths in areas that IRS has identified
as key to providing effective leadership within its organizational culture
and operating environment.

9 The 360-degree feedback process is designed to provide a manager direct
input from various sources-supervisor, peers, customers, and
subordinates-and to compare those results to a self-evaluation. With this
feedback, managers can identify action items and incorporate them into
their individual performance plans.

Lesson learned: Link the agency's workforce planning efforts with training
needs assessments to ensure consistency and enhance strategic alignment.

The agencies' training and development organizations had a range of
responsibilities, including designing training and development programs
based on strategic initiatives, soliciting input from stakeholders, and
prioritizing and scheduling training based on strategic initiatives and
stakeholder input. Generally, the training organization and agency
stakeholders can work together more effectively when they better
understand how each office or function within the agency contributes to
achieving business goals. In some cases, this included efforts to link
training needs assessments with the agency's overall workforce planning
efforts. Officials told us that this linkage helped ensure that workforce
plans developed by the agency's human capital office were consistent with
training needs assessments done by the agency's training and development
organization.

FWS involved key internal stakeholders in its planning processes

When assessing workforce skills and competencies, FWS officials worked to
ensure that the agency's workforce plan was linked with a training needs
assessment done by its training center. The workforce planning effort
identified broad competencies needed across the agency's workforce while
the training needs assessment identified the types of training courses to
develop skills and competencies within agency units and occupations.
Officials said that they viewed the training needs assessment as a tool
that was useful in refining the agency's workforce plan and in prioritizing and
budgeting for the development and delivery of training. Both efforts
involved key stakeholders from the human capital and training offices as
well as other FWS units. In the future, officials said they will rely on
the results of the agency's workforce planning efforts to directly serve
as the agency's training needs assessment.

USACE relies on its Learning Advisory Board and automated training
management program to effectively link planning efforts

USACE relies chiefly on the coordination activities of its Learning
Advisory Board to ensure its workforce planning efforts and training needs
assessments are effectively linked. USACE in 2001 formed the Learning
Advisory Board, composed of senior managers from across the agency, to
review the adequacy of USACE's training and development and ensure that
training is properly aligned with the agency's missions, goals, and plans.
In addition, the four divisions that use the automated training management
program can also rely on data from that system to assess training needs.
This system allows managers to compare information on individuals' skills
and competencies with workforce planning results from within the division
and across the agency. According to USACE officials, this systematic
comparison more closely links workforce planning and training needs
assessments to the essential mission-related operations.

Lesson learned: Consider the training needs of staff from other
organizations that will likely use the agency's training programs or
facilities to effectively leverage training investments and meet diverse
needs.

When planning and designing training programs for their employees, FWS and
USACE officials told us that they gained insight into the assessment of
potential training solutions by considering the possible involvement of
trainees from other organizations. In some instances, for example, it
would not have been cost-effective to design, develop, and deliver a
training effort for a small number of employees or occupations. However,
officials' determinations that other agencies or organizations also needed
similar training provided the critical mass needed to move forward.
Officials said they found that partnering with other organizations helped
make training efforts more cost effective to design, develop, and deliver.

FWS's training center assessed training needs of possible participants
from other Interior components

Officials from FWS's training center said that although they focus
primarily on meeting the needs of FWS employees when planning and
designing training, they also look at the training needs of other
organizations, particularly other agencies within Interior. To aid in
communication and coordination, agencies within the department designate
employees to serve as liaisons between each of the Interior agencies and
FWS's training center. These liaisons facilitate efforts to incorporate
their agency's needs into the training center's plans and the design and
delivery of training and development programs. According to agency
officials, these liaisons help ensure that the center's courses remain
current because they facilitate a dialogue between the various agencies.
For example, a recent FWS course on wetland plant identification involved
participants from FWS; U.S. Geological Survey, which is another Interior
component; Natural Resources Conservation Service, an agency of the U.S.
Department of Agriculture; USACE; and three private sector firms.

USACE's training center considers training needs of staff from other
federal and state organizations

Officials from USACE's training center told us that during the agency's
annual training needs assessment, they assess the possible training demand
from other organizations' employees in addition to identifying workforce
development needs of units and offices within USACE. According to USACE,
approximately 2,500 people each year-about 25 percent of the participants
in the agency's training center programs-are from other federal and state
organizations. Officials noted that the training center offers courses
needed to obtain certifications for certain professional requirements. It
offers courses accredited by several professional associations, including
the National Society for Professional Engineers, the American Institute of
Architects, and the International Association for Continuing Education and
Training. USACE officials said that they would like to offer training to a
greater number of employees from private sector firms; however, current
law requires receipts for services provided to private individuals and
organizations to be deposited into the general treasury as miscellaneous
receipts.10 USACE officials said that this requirement hampers the ability
of USACE's training center to keep funds it could generate to further
invest in its training programs. However, when agencies are required to
return receipts for services to the Treasury, the Congress is preserving
its oversight and control over the programs generating the fees.

10 31 U.S.C. 3302.

Agencies Developed Strategies and Solutions for Their Training Needs

Agencies carry out their training and development efforts on the basis of
estimated needs, priorities, and available resources and recognize that
adequate planning allows them to establish priorities and determine the
best ways to leverage investments to improve performance. The five
agencies we reviewed set priorities for training and development on the
basis of various factors, such as the results from skill and competency
assessments, the availability of resources, and the interests of agency
leaders. They usually relied on training officials, agency managers, and
subject matter experts to assist in developing strategies and approaches
for addressing training needs. Although the agencies designed and
delivered training using both centralized and decentralized approaches, we
found that leadership development programs were more highly centralized
and managed at headquarters. Agency officials acknowledged that they found
projecting costs and benefits of proposed training and development
programs to be very challenging. Although they sometimes developed broad
information on anticipated benefits and expected costs, this often did not
involve tying anticipated benefits to specific performance improvements or
considering all related costs.

As outlined in figure 6, agencies can plan and establish priorities by
developing an annual training plan to target developmental areas of
greatest need and outline the most cost-effective training approaches to
address those areas. Considerations involved in assessing investment
opportunities for the training plan include balancing the competing
demands confronting the agency and the amount of resources available in
order to determine how those demands can best be met with available
resources. It is also important to consider how to effectively integrate
all of the strategies the agency plans to use to improve performance and
meet emerging demands. When training is identified as a solution to
improve performance, agencies can compare various training strategies by
weighing their estimated costs and anticipated benefits to build a
convincing business case that supports the selected training strategy.
Developing a business case that sets forth the expected costs and benefits
of the performance improvement investment provides decision makers with
essential information for allocating necessary resources.

Figure 6: Steps for Developing Strategies and Solutions for Training and
Development Needs

The figure shows the following steps surrounding the central activity of
developing strategies and solutions for training and development needs:
align priorities with strategic direction; assess strategies; identify
alternatives (sources and methods); weigh costs and benefits; integrate
strategies and solutions; and consider evaluation feedback.

                                  Source: GAO.

Agencies' Experiences in Developing Strategies and Solutions for Training
Needs

The agencies we reviewed used a wide range of strategies and solutions to
improve performance through designing training and development programs
for their employees. Officials told us they considered a mixture of both
on-the-job and other developmental programs, contemplated an assortment of
mechanisms for delivering the training, and assessed potential sources to
meet their learning needs. For example, USACE's leadership development
program for midlevel engineers and scientists involved formal classroom
training, mentoring, and a 6-month developmental assignment. VHA employees
can access a wide variety of informational and educational content through
the VA Knowledge Network, a satellite-based system of live and on-demand
programming delivered directly to employees' desktops. IRS's training unit
developed an automated ROI workbook tool that the agency's business units
can use to assess whether proposed training programs should be delivered
in a classroom or by an e-learning approach. OPM partnered with an
employee union to offer a midcareer development program that provided an
opportunity for current OPM employees to enhance existing skills, explore
new career fields, and gain practical experience. Officials from FWS's
training center cited courses in negotiation, communication, and
interpersonal skills as examples of vendor-provided courses. They said the
center decided not to invest in designing these courses since they do not
require field experience and expertise, which are critical in designing
other FWS courses.

Projecting costs and benefits of proposed training and development
programs was a challenge for the five agencies. They usually developed
broad information on anticipated benefits and expected costs, often
without tying anticipated benefits to specific performance improvements or
considering all costs related to the training program. For example, VHA
officials told us that the agency's assessments of anticipated benefits
and expected costs of proposed training are generally unsystematic. One
VHA office or field location may not have analyzed the relative costs or
benefits of proposed training while another office or location may have
considered anticipated benefits and developed estimates of costs and
savings using different training approaches. At FWS, some proposed
training programs, such as its Advanced Leadership Development Program,
involved detailed estimates of costs, both for the training center and for
participating FWS field offices, as well as the identification of specific
competencies to be developed in the program. Other proposed training
programs at FWS did not have documented and detailed estimates of expected
costs and benefits. FWS officials said that they assess the anticipated
costs and benefits of all their proposed training and development programs
but that the extent of these assessments and the amount of documentation
supporting the assessments vary, depending on many factors, such as the
content, delivery mechanism, and uniqueness of the proposed training.

According to agency officials we interviewed, limited funding sometimes
affected agencies' abilities to design and deliver training and
development programs that officials believed were needed. USACE officials
said, for example, that in a recent survey, about two-thirds of agency
supervisors and one-half of agency executives believed that the agency had
less funding for training civilian employees than is needed. Funding and
resource limitations sometimes forced the agencies to think of new and
practical ways to ensure that their employees had the knowledge and skills
needed to carry out their work. For example, because of an unexpected
decrease in available travel funds, VHA officials canceled plans for a
large national conference on the use of an automated managerial cost
accounting system. This system was designed to provide VHA managers with data
important in making clinical decisions, managing workload, and controlling
medical care costs. Instead, to enable employees to obtain the information
that would have been presented at the conference, VHA officials provided
the content via a satellite broadcast along with a series of audio
conferences.

At FWS, the number of people that the training center can train onsite, of
course, is limited by the current capacity of the facility classrooms and
residences. According to FWS officials, incorporating a blended learning
approach into its curriculum has enabled trainees to perform part of the
course work outside class, thus allowing instructors to focus on those
topics that require special attention. Training center officials also said
that the agency has increased its use of e-learning and other mechanisms
to develop employees where they work instead of coming to the training
center. Officials told us that FWS has reduced training costs by offering
more training online and using CD-ROMs to provide field offices with
course material that previously had been offered only on location at the
training center.

Agencies' Lessons Learned in Developing Strategies and Solutions for Their
Training Needs

Agency officials have encountered a variety of challenges in their efforts
to design training programs to meet the developmental needs of their
employees. How agencies respond to these challenges can greatly affect
their success in aligning priorities with strategic direction, assessing
strategies, identifying alternative sources and methods, weighing
potential costs and anticipated benefits, and assessing how other
performance improvement initiatives might complement training efforts. It
is also important to consider evaluation feedback on an ongoing basis. We
identified eight lessons learned related to the five agencies' efforts to
develop strategies and solutions for their training and development needs.

Lesson learned: Incorporate information on employees' various competency
levels and job needs into the design of training and development programs
to increase their relevancy and timeliness.

When designing effective training and development programs, the way the
work is actually to be done on the job and the developmental needs of the
expected trainees are key considerations. Analyzing the tasks of specific
jobs and occupations can help ensure that training accurately reflects the
way employees are expected to perform on a day-to-day basis. To help
ensure that the training effectively addresses employees' developmental
needs, agencies can determine the workforce's level of proficiency in
mission-critical skills and competencies by conducting skills assessments
and using information obtained through interviews or surveys of employees
and their supervisors. The increased information and insight provided by
these approaches can allow agencies to incorporate information on
employees' various competency levels and job needs into the design of
training and development programs--increasing both the relevancy and
timeliness of the learning.

FWS targeted training to employees to leverage specific knowledge and
experiences

FWS's training center recognized employees' various competency levels and
job needs when developing courses on the use of geographic information
systems (GIS), which are becoming increasingly important mapping and
information analysis tools for natural resources agencies, according to
FWS officials.11 Rather than providing instruction focused solely on
software features and functions of GIS, the training center designed its
curriculum to teach the application of GIS to employees based on their
roles in natural resources management. With a focus on these varied roles
and related needs, the center developed separate GIS training courses for
"explorers" (natural resource managers or others just wanting to know
about GIS), "users" (biologists and other personnel using GIS in their
daily job) and "developers" (those individuals designing and developing a
natural resources GIS for use by others). Officials at FWS's training
center said that with the training targeted to the specific background and
needs of employees, FWS can minimize the time spent teaching participants
information that they already know or do not need to know in carrying out
their job responsibilities. The training center's curriculum consists of
15 separate GIS courses, with 3 additional courses under consideration or
development. According to FWS officials, about 1,200 FWS employees at over
400 offices use GIS software in their jobs.

11 GIS is a system of computer software, hardware, and data used to
manipulate, analyze, and graphically present a potentially wide array of
information associated with geographic locations.

IRS used skills assessments to focus on developmental needs

To assist in identifying employees' competency levels and incorporating
job needs into training, IRS developed and used technical assessment
batteries for the agency's field assistance personnel and customer service
representatives. These multiple-choice instruments were designed to assess
the key technical knowledge that the employee needs in order to carry out
his or her job. On the basis of each employee's test results, the agency
will recommend specific training and other appropriate interventions, such
as mentoring, to improve performance. IRS officials told us that in some
cases where the assessments showed that individuals were already
knowledgeable in a particular area, employees still wanted to take the
related training because they viewed training as a job benefit as well as
a way to improve knowledge and skills for their jobs. This example shows
how important it is for agencies to consider the organizational culture and
working environment when designing training and development programs so
that they are prepared to address issues that may arise during
implementation. As we point out in our training guide, employees need not
only to understand the goals of agencies' training and development
efforts, but also to accept responsibility for developing their
competencies and careers, as well as for improving their organization's
performance.12

Lesson learned: Assess options for using other organizations' course
content, staff, services, or facilities when designing a new training and
development program in order to develop efficient and cost-effective
strategies.

When thinking about strategies and sources for the design of a new
training and development program, officials can potentially discover more
efficient and cost-effective approaches through the use of other
organizations' course content, staff, services, and/or facilities.
Adequate planning can help an agency in meeting the developmental needs of
trainees without overburdening the agency's training capacity or creating
excess capacity. Obtaining reasonable estimates of likely costs and
identifying potential obstacles of using others' training resources can
help agencies develop more informed perspectives on ways to effectively
leverage resources.

12 GAO-03-893G, p. 7.

IRS considered several options in designing a course for senior managers

In evaluating options for designing and delivering a new training course
for its senior managers, IRS considered various sources for such training,
including internal resources, contractor support, and partnerships. IRS
officials said that they considered internal resources to design the
training but quickly realized that the agency did not have sufficient
expertise. The officials also considered using a contractor but concluded
that the costs would be too high. Instead, IRS decided to partner with the
Federal Executive Institute, an OPM-sponsored training facility in
Charlottesville, Virginia, that provides training to senior employees from
across the federal government. IRS officials found that partnering IRS
design and subject matter experts with leaders from the institute and other
renowned experts in the field was the most cost-effective approach and
yielded the best results. The design team produced a course called
"Learning Through Others," delivered on the Charlottesville campus.
According to IRS officials, this course exceeded agency needs and
expectations and was less expensive than a direct contracting arrangement
with an outside vendor. They added that participants in the training
program could learn public service values through the lessons and engage in
competency-based experiential learning, business-related challenges, and a
capstone simulation. IRS officials characterized the course as high
quality and said the prestige associated with studying at the Federal
Executive Institute provided an additional benefit for IRS participants.

Lesson learned: Establish mechanisms and controls to avoid unnecessary
duplication or inconsistency within and across agencies' training efforts.

The agencies used both centralized and decentralized approaches by, for
example, centrally managing reporting and record keeping while allowing
some localized management of training content. Whatever mix of centralized
and decentralized approaches is used, agencies recognize that it is
important to limit overlap and duplication and ensure the delivery of an
integrated message when appropriate. VHA and FWS officials found that
establishing mechanisms and controls is important to limit duplication or
inconsistencies within an agency, across component organizations within a
department, or across the federal government as a whole.

VHA's Employee Education System helped limit duplication of effort

VHA's Employee Education System, which serves as an internal training
consulting team within VHA, assists the agency's 21 regional networks in
designing and implementing programs to develop general and specific skills
for VHA employees. Within each of the 21 regional networks, an Education
Service Representative acts as a liaison in coordinating numerous
developmental programs with VHA headquarters, sharing information with
their counterparts about effective practices and identifying areas of
possible duplication or inconsistency across VHA. According to VHA
officials, the coordination and communication achieved through this
organizational structure has helped ensure consistency in implementing the
agency's national training priorities. For example, officials said that
the consulting team assisted in implementing changes to VHA's processes
for collecting third-party insurance reimbursements in the wake of
legislation that required VA to make greater efforts to collect unpaid
debts from veterans. They told us that these legislative changes enabled
local VHA facilities to receive these reimbursements, but also overwhelmed
the local billing and debt collection processes. To address the problem
and help ensure consistency across the agency, the training consulting
team participated in redesigning the processes for coding, billing, and
debt collection; trained the employees responsible for billing and debt
collection in the new processes; and created graphical representations of
the new processes and posted them throughout the agency's facilities to
aid employees in learning.

Interior's Training Directors Council facilitated communication across
departmental components

Interior used its Training Directors Council to facilitate communication
across the department's different bureaus, thus helping to minimize
duplicative training and development efforts. This council provides
opportunities, through formal meetings and informal communications, for
training managers from Interior's various bureaus to share curriculum and
related training ideas with their colleagues. The director of the training
center at FWS, who chairs the council, told us that on more than one
occasion he has discovered through council business that other Interior
components had developed strategies or solutions to address emerging or
existing needs that FWS's training center had also identified. For
instance, the training center at FWS was considering whether to add a new
course on grants management to its curriculum. Through its participation
in the council, FWS determined that another component agency, the Bureau
of Land Management, already offered grants management courses through its
National Training Center in Phoenix, Arizona. After reviewing the content
of these Bureau of Land Management courses, officials at FWS's training
center determined that they did not need to design and develop a separate
grants management course. Instead, FWS and the Bureau of Land Management
now jointly manage the delivery of this training course.

Lesson learned: Develop and use criteria for determining the optimal mix
of delivery mechanisms to use in order to select the most effective
approaches given each learning situation.

In response to emerging demands and the increasing availability of new
technologies, agencies are faced with the challenge of choosing the
optimal mix of training delivery mechanisms to design training that is as
effective and efficient as possible. Agency officials considered a wide
variety of instructional approaches to achieve learning: in the classroom,
through distance learning, or through structured on-the-job experiences.
Officials also took other factors into account, including whether to
provide individualized instruction or team-based training and when to use
blended learning that combines different teaching methods (e.g., Web-based
and instructor-led) within the same training program. USACE found that
identifying and systematically using criteria to help select effective
delivery mechanisms assisted in building well-supported justifications for
the design of training and development programs.

USACE used criteria to select media and method of instruction

To select the appropriate media and method of instruction for its training
programs, USACE uses criteria contained in the Corps of Engineers Systems
Approach to Training, the agency's documented process for developing
training programs. The criteria include issues such as the expected
frequency of changes to the training content, the size and diversity of
the target population, and the degree of student interaction required.
USACE officials told us that, using these criteria, course managers from
USACE's training center coordinate with relevant agency program offices
and subject matter experts to decide on the appropriate mode for training
delivery. While most of the training center's courses occur in a
conventional classroom setting, agency decision makers have focused on
trying to identify courses (or modules of courses) to convert from
classroom training to more economical modes of delivery, such as distance
learning, computer-assisted instruction, computer-based instruction, or a
combination of such approaches. USACE officials said that many of their
courses now incorporate CD ROM and Internet-based materials as prework
assignments before classroom training and for reference use during and
after the training events.

VHA used a profiling tool to help in selecting delivery approaches

VHA's internal training consulting team used a training delivery strategy
tool that consists of a series of questions structured to guide users
through the process of selecting an appropriate delivery approach for a
proposed training effort. This team designed the training delivery
strategy tool to help staff plan, analyze, develop, and deliver training
and development activities. To aid in decision making, the tool includes
factors such as audience composition, course goals and objectives, course
modules, any prerequisites, participant preparation, and course follow-up
and evaluation. VHA officials said that applying this tool and analyzing
the resultant profile helps maximize learner understanding, retention, and
application.

Lesson learned: Ensure that employees have the needed equipment and
technologies so that they can take maximum advantage of learning
opportunities.

Many organizations are taking advantage of more flexible design and
delivery methods made possible by technology to, for example, deliver
training to the user's desktop, thereby making training more accessible
and cost effective. As agencies move forward in using new approaches, it
is important to ensure that employees have the needed equipment and
technology to take maximum advantage of learning opportunities.

IRS converted mandatory training courses to an online format

IRS recently converted a series of mandatory training courses from
face-to-face group briefings to an online format in order to more
efficiently provide this training to its employees. Through these online
mandatory training programs, which include computer security awareness,
ethics issues, and prevention of sexual harassment, IRS wanted to (1)
reduce the burden on managers who previously had to prepare for and
deliver the training, (2) provide line employees with ready access to the
information when and where they need it, and (3) lower the costs associated
with the group briefings. IRS officials said that online delivery reduced
the time employees spent taking the training from approximately 6 hours to
2 and in some cases eliminated the need for travel. To ensure the
accessibility and usability of these online briefings, IRS worked to
resolve various challenges in the conversion, such as designing the online
product for the lowest computing
capabilities of the bulk of the trainee population and providing
alternative delivery mechanisms for individuals who were without a
computer or Intranet access. The officials said that they also learned it
is important to design the online briefings based on a common template and
style guide to standardize their look and feel; provide online text-only
versions of the training for persons who are visually impaired; test the
usability of the online briefings with end users on a range of equipment;
and allow sufficient time for needed revisions.

OPM initiated a pilot program for its employees to use online courseware

To explore opportunities for increased use of e-learning approaches, OPM
established a pilot program that allowed about 250 of its employees access
to approximately 1,800 online courses through the Department of
Transportation's Transportation Virtual University. OPM's training unit
worked with offices throughout OPM to identify employees to participate in
the pilot program. OPM officials said that all employees selected for the
pilot program had access to the equipment and technology needed to make
use of the University's online training, by using a computer either at the
employee's desktop or at some central location. As a result of this pilot,
OPM enhanced its offerings of online training for its employees by
becoming a partner in the GoLearn.com initiative, a governmentwide online
training center for federal employees. OPM officials said the pilot
program also showed that not all employees function effectively in an
online training environment and that some employees need a more structured
format in order to learn.

Lesson learned: Plan early when developing integrated solutions that
complement other planned and ongoing strategies to improve performance so
that, when implemented, the strategies work effectively and are aligned to
achieve agency goals.

When designing training and development programs, the agencies sometimes
considered how they could integrate them with other strategies to improve
performance and meet emerging demands. If the work environment is not
conducive to providing opportunities to use new skills or to work in
different ways, then no matter how good a training program is, it may not
succeed in changing on-the-job performance. In addition, training and
development programs represent a significant investment of resources
(including time and money) and may not always be part of an appropriate
solution. The agencies developed
integrated solutions that included developing and using job aids,
performance support tools, and other approaches to enhance knowledge
management13 and to aid employees on the job as a complement to training.

OPM planned for an electronic support tool to aid agency employees in
using a new computer system to process retirement claims

As part of its effort to reengineer and modernize its processing and
support of federal employees' retirement claims, OPM is developing plans
for an integrated Electronic Performance Support System to aid the
agency's benefits specialists in using a new computer system. Procedural
and information job aids are to be built directly into the software to
provide documentation and guidance, "just in time" assistance, and error
detection. This is intended to be an integrated system to permit
coordination between different modes of training and enhance the learning
and performance of the OPM employees working with reengineered business
processes and the new computer system. OPM officials said, for example,
that this system would assist employees in completing steps using actual
data and circumstances of a particular case they were working on rather
than consulting a manual or using data put together just for training.
According to OPM officials, as the focus under modernization shifts from
processing claims to providing customer service, this system will help
employees working in OPM's retirement program to interact more directly
with program participants to answer questions and solve problems about
retirement issues.

USACE identified online solutions to help enhance and integrate training
efforts

As a complement to the training and development programs it offers to its
employees, USACE recently entered into a joint project with the Department
of Labor to use an online knowledge management system called Workforce
Connections. This system, which resulted from a memorandum of
understanding promoting cooperative efforts between the departments of
Defense and Labor, will provide the USACE workforce with on-demand, online
access to job aids, performance support materials, and course content 24
hours a day, 7 days a week.

13 Knowledge management is an approach to capturing, understanding, and
using the collective body of information and intellect within an
organization to accomplish its mission.

The system will feature
development and maintenance of online communities of practice to support
knowledge management of USACE's Learning Network, which is USACE's overall
platform for delivering a wide variety of learning resources to agency
employees.14 Another part of the learning network is USACE's Virtual
Campus, a distance learning site that allows employees access to Web-based
courses and training events. Another component of the learning network
consists of electronic performance support tools, such as job aids and other
information resources. USACE officials said that they consider the systems
in the learning network to constitute a powerful solution that effectively
integrates the agency's training efforts.

Lesson learned: Plan for the direct participation of senior agency leaders
and experienced staff in the delivery of training and development programs
to increase buy-in and build support for organizational change.

Internal resources, such as subject matter experts and high performers,
can often provide valuable insight into training design because of their
familiarity with the agency's policies, programs, and corporate culture.
To increase buy-in, help establish greater credibility, and build support
for organizational change, the agencies have learned the value of planning
for the direct involvement of senior managers in the training program.

IRS and OPM involved executives and managers

IRS officials told us that a key feature of the agency's frontline
managers course is that it was designed to use senior managers and
experienced frontline managers drawn from the agency's business units to
teach the course. In addition, executives participate in course modules
that focus on emerging issues facing the agency. The deputy commissioner
of IRS's Wage and Investment business unit served as the executive sponsor
of the training program and participated in course modules featuring
executives. The officials also told us that IRS executives partner with
outside vendors to serve as an instructor team to deliver all courses
designed for senior managers. Using business unit executives and managers
as course instructors helped ensure that the course's content and emphasis
related to the mission, goals, and guiding principles of the agency.

14 Communities of practice provide an on-line resource for peers to ask
and respond to questions and share knowledge.

At OPM, agency managers have been the first to take special initiative
training, such as courses on prohibited personnel practices, whistleblower
procedures, and information technology security, before such courses are
offered agencywide. OPM officials said that teaching the material to agency
managers in advance of line employees enables the managers to model
desired behaviors and learning for their employees and convincingly convey
how they personally benefited from the training.

FWS and VHA relied on in-house experts

FWS's training center brings in FWS field office personnel when building a
cadre of senior, in-house instructors. Training center officials said that
involving trainers from the field helps to build trust with trainees and
provides an added level of credibility that neither academics nor other
subject matter experts who lack field experience can easily replicate.
According to these officials, many expert employees come from the field
and stay to teach at the center for 3 or 4 years. Some, however, teach
only one or two courses or get involved for a short duration before
returning to their positions in the field. The director of the training
center said that he views this passing-on of information from seasoned
veterans to less experienced employees as crucial for maintaining the
unique knowledge base of the agency.

VHA used "super users" to teach medical center personnel to use its
computerized patient record system, a computer interface that allows
hospital personnel to keep more comprehensive patient records and enables
clinicians, managers, and other staff to review and analyze data gathered
on any patient. The super users, VHA employees with other job-related duties
and responsibilities, were trained to be thoroughly knowledgeable about the
system so they could demonstrate its capabilities and directly relate the
training to employees' work. VHA initiated this strategy when the agency
began rolling out the application in 1997. VHA officials said that
planning to build on the direct involvement of these super users was
successful because they served as first-line resources for employees'
questions about the new system and helped the agency to build
organizational support for the system. According to VHA, the agency
developed a cadre of more than 2,500 super users, and about 180,000 VHA
employees use the patient record system.

Agencies Are Considering More Sophisticated Evaluation Approaches as Part
of Designing Their Training and Development Programs

Without evaluation of training programs, participants may take ineffective
courses that do not provide the necessary learning experience or that do
not translate to improved performance on the job. Overall, the five
agencies in our review relied primarily on standard end-of-course
evaluations to obtain the participants' reaction to, and satisfaction
with, a specific training course or learning opportunity. Although the
agencies encountered challenges given some of the difficulties associated
with measuring the impact of training on individual and organizational
performance, they have begun or are planning to use more comprehensive and
sophisticated evaluation techniques for assessing their training and
development efforts. Such techniques include the use of pre- and
post-testing to determine the extent of learning accomplished, tracking the
performance or advancement of individuals and work units before and after
training is completed to assess professional growth and improvements in
organizational performance, and limited use of ROI analyses to compare the
benefits (quantified in dollars) with the costs of a training and
development program.

To help determine whether the objectives of training and development are
achieved, agencies can begin by incorporating measures of effectiveness
into the design of training and development programs. Defining objectives
in a measurable way enables agency officials to offer a more convincing
business case and contributes to improving the quality of feedback.
Whenever possible, training goals should measure the individual and
organizational results achieved rather than the training inputs or outputs
(e.g., number of available courses or people trained). Figure 7 depicts
some of the steps involved in determining the evaluation methods to use in
designing training and development programs.

Figure 7: Steps in Determining Methods for Evaluating Training Programs
Source: GAO.

Agency officials recognized the importance of determining during design
how they planned to evaluate the effectiveness of their training and
development programs. To collect information on participants' reaction to
and satisfaction with the training program, for example, VHA uses a
standard evaluation form with questions related to program design,
delivery, outcomes, overall satisfaction, and logistics. In completing
this survey, training participants evaluate their success in completing
learning objectives and the performance of the faculty. Other agencies
also obtained participant feedback through interviews or focus groups. OPM
conducted exit interviews with individuals who participated in that
agency's Presidential Management Intern program, a 2-year developmental
program for individuals from a wide variety of academic disciplines who
wish to enter the federal service. These exit interviews were designed to
obtain feedback from the participants about their overall satisfaction and
reaction to the Presidential Management Intern program, including their
suggestions for enhancing the training provided.

Agencies' Experiences in Determining Methods for Evaluating Their Training
and Development Programs

The five agencies have begun to use, or are planning to use, more
comprehensive and sophisticated evaluation techniques to assess the extent
to which training and development programs increased employees' knowledge
and skills or enhanced individual and organizational performance. One of
these more sophisticated evaluation techniques is the use of pre- and
post-testing to determine the extent of learning during the training
program. USACE's training center conducts pre- and post-tests on over 90
percent of the courses it offers and is working toward the goal of using
such tests for all courses. The agencies also tracked job performance and
the advancement or movement of personnel to assess the potential
effectiveness of training. FWS officials told us they track participants'
career advancement to determine the extent to which participation in the
leadership development program for midlevel employees contributed to
increased mobility into more responsible leadership positions in the
agency. According to training center officials, about 37 percent of the
program graduates have taken either promotions or new lateral assignments
since the program's inception in January 2002.

In addition, some agencies attempted to conduct ROI analyses to compare
the benefits (quantified in dollars) to the costs of a particular training
and development program. VHA officials pointed to concerted efforts to
conduct ROI analyses on several training and development programs,
including customer service, leadership development, and computer-based
training. IRS officials, on the other hand, have decided that the
difficulties in conducting such analyses are not worth the effort for the
resultant information, given the challenge of isolating the performance
improvements that might result from a specific training activity and the
difficulty of monetizing identified benefits in order to calculate the ROI.
IRS instead uses the concept of "time to capability" to
determine whether and to what extent a training course, program, or other
training intervention has improved the organization's ability to perform
its mission successfully. IRS defines time to capability as the validated
accumulation over time of employees who have been trained in specific
competencies deemed critical to the success of an organizational unit.
Under this approach, when IRS has trained a predetermined number of
employees, officials consider that the agency has achieved the goal of
training a critical mass within its workforce and conclude that the agency
has an organizationwide capability in the specific competencies.

Agencies' Lessons Learned in Determining Methods for Evaluating Their
Training and Development Programs

Agencies' training and development efforts involve a continuous effort
throughout planning, design, implementation, and evaluation. Therefore, it
is important to recognize that evaluation is not a static requirement to
be carried out after the fact. When undertaking design and development of
training, agencies can rely on evaluations and benchmarking to determine
what approaches work best given all the related elements, such as the
proposed audience for the training program, the material to be covered,
and possible delivery mechanisms that could be employed. Determining
methods for evaluating training programs as part of their design can help
identify and remove obstacles to successful implementation. For example,
agency officials said that catching potential problems early saved valuable
time and resources that a major redesign of the training later likely would
have entailed. On the basis of our review at the five
agencies, we identified four lessons learned regarding the agencies'
efforts to determine methods for evaluating their training and development
programs.

Lesson learned: Incorporate appropriate aspects of the evaluation approach
when designing training and development programs by specifying what
results are expected, to better ensure the availability and use of quality
performance data.

In assessing how and to what degree performance could be improved with a
specific training program, agencies should try to establish a targeted
level of improved performance as well as assess the possible consequences
if the training were not to occur. Determining a target level for improved
performance can aid agencies in assessing whether the expected costs
associated with the proposed training are worth the anticipated benefits.
Agreeing upon the planned evaluation approach in the design clearly sets
forth the results the agency expects to achieve through the training. In
addition, planning ahead helps ensure the availability and use of sound
and relevant performance data.

VHA decided to evaluate a customer service training program after it was
implemented

In response to low scores on customer satisfaction surveys, a VHA regional
network office pursued various initiatives to improve customer service,
including the design, development, and implementation of a new training
program focused on creating a more customer-service-oriented culture and
improving employee morale and collaboration to better meet customer needs.
After delivery of this new training program, called "The Customer," VHA
selected a contractor to evaluate the effectiveness of the training

effort. In its report, the contractor stated that because the opportunity
to conduct the evaluation did not occur until after the training program
had been delivered, the use of preferred evaluation methods was not
possible. The contractor reported that this lack of preprogram planning
had also been experienced in some other VHA network offices. Nevertheless,
the contractor evaluated the customer service training program by
surveying participating employees and their supervisors. From these survey
results, the contractor concluded that the training program was viewed as
successful for those who needed it, but that the training did not receive
a sufficient level of support from supervisors. The contractor recommended
that VHA obtain additional feedback from supervisors as well as from an
individual hired to telephone or visit the VHA facilities in the network
office to observe customer service activities.

Lesson learned: Consider new approaches for collecting and analyzing
performance data with the aim of increasing the quality and quantity of
training evaluation feedback.

The agencies faced various challenges in obtaining a high quality and
quantity of feedback needed to evaluate their training and development
programs. We previously reported that low participation on the part of
employees and managers in surveys and focus groups may limit an agency's
access to the data needed to complete valid and useful evaluations of
training programs.15 With strong agency support and proper planning,
stakeholders, including training participants, supervisors, managers, and
trainers, are more likely to provide the information and feedback needed
to successfully and effectively evaluate agency training and development
programs. USACE, for example, recognized that it needed to ensure that it
incorporated a wider variety of stakeholder perspectives in assessing the
impact of training on employee and agency performance. Stakeholders'
perspectives can be obtained through surveys and questionnaires,
individual or group interviews, or communication with more formal
multidisciplinary bodies such as advisory or education councils. Valuable
sources of information include the training participants; training
designers, developers, and facilitators; agency leaders, managers,
supervisors, subordinates, and coworkers; employee organizations; internal
and external customers; and functional and subject matter experts.

15 U.S. General Accounting Office, Human Capital: Design, Implementation,
and Evaluation of Training at Selected Agencies, GAO/T-GGD-00-131
(Washington, D.C.: May 18, 2000).

USACE's training center altered its approach to obtaining supervisory
feedback on its training efforts

USACE's training center has been attempting to obtain more sophisticated
evaluation feedback to determine if its training courses affected
employees' behavior on the job. Training center officials told us that
they originally had planned to evaluate all of the center's training
courses by obtaining feedback from employees and supervisors 6 months
after the training course on the extent to which employee on-the-job
behavior had changed. In a test of this approach, the training center sent
out hard copies of two generic survey forms: one for the employee and one
for the supervisor. Because the training center had locator information
for trainees but not for their supervisors, both surveys were mailed to
the employee, who was then asked to forward one to his or her supervisor.
Training center officials said the response to this survey effort was
disappointing. For one 35-person class, for example, the center received 3
employee surveys and 1 supervisor survey. To increase the level of
feedback, officials told us that the center is transitioning to an
electronic process whereby the survey forms are sent via e-mail to the
employee, who is then requested to forward the survey via e-mail to his or
her supervisor. Training center officials said that in a recent test of
this new approach on one course, the center received a 67 percent response
rate from employees and a 36 percent response rate from supervisors. The
officials told us that they are working to develop unique evaluation forms
for each course in order to obtain feedback on specific learning
objectives rather than rely on a generic survey form for all courses.

Lesson learned: Plan for the use of multiple data types and sources to
provide a balanced approach in assessing the effectiveness of training and
development programs.

Successful organizations typically develop and implement human capital
approaches based on a thorough assessment of the organizations' specific
needs and capabilities. Valid and reliable data are the starting point for
such assessments. To assess the results achieved through training and
development, agencies can rely upon hard (quantitative) data, such as
productivity/output, quality, costs, and time, or soft (qualitative) data,
such as feedback on how well a training program satisfied employees'
expectations. By taking steps to agree on measures of success up front,
agency officials can decide on the objectives for each training and
development program. Using a balanced approach that reflects feedback from
customers and employees, as well as organizational results, is

particularly important as agencies transform their cultures and
operations. In addition, because the work of federal employees can be
complex and often cannot be reduced to a single task, a balanced approach
to both the types and sources of data helps to strengthen the linkages
between training and development programs and improved performance.

USACE conducts evaluations of some training courses

In addition to obtaining end-of-course participant feedback and
administering pre- and post-tests for many of its courses, USACE's
training center also conducts course evaluations of its offerings. To
conduct this evaluation, the designated course manager from USACE's
training center is responsible for observing the full course and assessing
various aspects of its design and delivery, including the training
content, materials, and instructors. Training center officials said that
various factors can trigger the decision to conduct an overall course
evaluation, including a significant decline in trainees' overall
satisfaction with the course, the introduction of new instructors, the use
of contractor assistance, or a specific recommendation from an agency
office or unit. According to training center officials, the results of
this course evaluation are assembled with the end-of-course participant
survey feedback and pre- and post-testing results to present a
comprehensive and balanced view of the effectiveness of the training
program.

Lesson learned: Take into account all relevant factors for determining the
costs of a training and development program to better ascertain whether it
is cost-effective in relation to benefits achieved.

Calculating the ROI for a training program involves identifying and
monetizing the program's benefits and then dividing this by a full
tabulation of the program's costs. These costs should usually include the
cost of program materials provided to each participant; the cost of the
facilities; the costs of the facilitator or instructor, including time for
both preparation and delivery; any travel-related expenses for
participants; salaries and benefits costs of the participants for the time
they attend the program; and an allocation of relevant administrative and
overhead costs.
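To make the arithmetic concrete, the brief sketch below (written in Python, with purely hypothetical dollar figures) totals the cost categories listed above and divides the monetized benefits by that total, following the description of the ROI calculation in this section. The category names and amounts are illustrative assumptions rather than data from any of the five agencies, and an agency's actual calculation would follow its own evaluation guidance.

    # Illustrative only: hypothetical figures for a single training program.
    # Cost categories mirror those listed above; an agency's actual ROI
    # definition and data would follow its own evaluation guidance.
    costs = {
        "participant_materials": 12_000,          # program materials for each participant
        "facilities": 8_000,                      # classroom or conference space
        "instructor_prep_and_delivery": 25_000,   # facilitator time for preparation and delivery
        "participant_travel": 18_000,             # travel-related expenses for participants
        "participant_salaries_benefits": 60_000,  # salaries and benefits while attending
        "admin_and_overhead": 7_000,              # allocated administrative and overhead costs
    }

    total_costs = sum(costs.values())
    monetized_benefits = 180_000  # hypothetical dollar value placed on the program's benefits

    # ROI expressed here as the ratio of monetized benefits to total program
    # costs, per the description above; agencies often report it as a percentage.
    roi_ratio = monetized_benefits / total_costs
    print(f"Total program costs: ${total_costs:,}")
    print(f"ROI ratio: {roi_ratio:.2f} ({roi_ratio:.0%} of costs returned as benefits)")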

VHA's evaluation of a leadership development program did not include all
costs

In cooperation with VHA's internal training consulting team, one of VHA's
regional network offices designed, developed, and implemented a
networkwide leadership development program called "Competency Development
for Leaders in the 21st Century." According to information we gathered
during our review, the costs incurred for the consulting team's efforts on
this training program were not included in the ROI calculation even though
the team contributed substantially toward developing and implementing the
program. VHA officials told us that these costs were not included in the
analysis because the course designers only wanted to determine the return
on the network's investment, not the agency's overall investment. Agency
officials said that one of the main goals of involving the consulting team
was to help the network develop the capability to use the ROI process to
evaluate training. Although these costs were not included in the ROI cost
tabulation, VHA did include the salaries (plus employee benefits) of the
participants for the time they attended the developmental program. As we
noted in our recently issued assessment guide, agencies might overlook the
costs of participant attendance when calculating the total costs of a
training program.16

Conclusions and Observations

Federal agencies are faced with the need to invest resources wisely to
ensure that their employees possess the information, skills, and
competencies required to carry out their work successfully. The examples
provided in this report may help to address this need by describing some
of the experiences and lessons learned that other agencies might find
applicable or adaptable to their unique situations. This information is
intended both to provide a realistic perspective on how agencies have
approached designing their training and development programs to date and
to take a more detailed look at some of the concepts explored in
our recently issued assessment guide focused on strategic training and
development efforts in the federal government.

Our work reviewing the selected agencies' efforts to design training and
development programs reinforces the significance of good planning and
design of these programs to ensure their successful implementation and
evaluation.

16 GAO-03-893G, p. 69.

The experiences and lessons learned we identified also
demonstrate how effective design efforts, as part of a strategic training
and development process, rely on the eight core characteristics that we
identified in our earlier work: (1) strategic alignment, (2) leadership
commitment and communication, (3) stakeholder involvement, (4)
accountability and recognition, (5) effective resource allocation, (6)
partnerships and learning from others, (7) data quality assurance, and (8)
continuous performance improvement. Indeed, by focusing on these eight
core characteristics, agencies can improve not only the design of their
training and development efforts but also the planning, implementation,
and evaluation of their programs to better ensure that their employees
have the information, skills, and competencies needed to carry out their
work successfully.

Agency Comments and Our Evaluation

We provided a draft of this report on December 12, 2003, to the Secretary
of Defense, the Secretary of the Interior, the Commissioner of Internal
Revenue, the Director of OPM, and the Secretary of VA. Interior, IRS, and
OPM provided written comments on the draft report. In his written comments
(see app. IV), Interior's Assistant Secretary for Fish and Wildlife and
Parks generally agreed with the report's findings regarding the Department
and the FWS. He said that the report provides important examples that can
help the Department continue to move forward with additional confidence in
its actions. In his written comments (see app. V), the Commissioner of
Internal Revenue said that IRS was honored to share some of its lessons
learned with us for governmentwide dissemination. He said that our review
also provides IRS with practices from other agencies to assist IRS in its
efforts to continually improve its programs. In her written comments (see
app. VI), the Director of OPM said that she appreciated the opportunity
for OPM to be included in the report and to share information on OPM's
training and development activities and programs. Interior, IRS, and OPM
also provided technical comments to clarify specific points regarding the
information presented in the draft report, which we have incorporated as
appropriate. In comments by E-mail through its GAO liaison, VA agreed with
the information presented regarding the Department and had no additional
comments on the draft report. USACE informed us that it had no comments on
the draft report.

As agreed with your office, we are sending copies of this report to the
Ranking Minority Member, Subcommittee on Oversight of Government
Management, the Federal Workforce, and the District of Columbia, Senate
Committee on Governmental Affairs; the Chairmen and Ranking Minority
Members of the Senate Committee on Governmental Affairs and the House
Committee on Government Reform; and other interested congressional
parties. We are also providing copies to the Secretary of the Army, the
Secretary of the Interior, the Secretary of the Treasury, the Director of
OPM, and the Secretary of Veterans Affairs. This report is available to
others upon request. In addition, the report is available at no charge on
the GAO Web site at http://www.gao.gov.

If you have any questions about this report, please contact me at (202)
512-6806. Key contributors to this report are listed in appendix VII.

Sincerely yours,

George H. Stalcup
Director, Strategic Issues

Appendix I

                        Objective, Scope and Methodology

The objective of this review was to provide information on selected
federal agencies' experiences and lessons learned in key aspects of
designing training and development programs for their employees.
Specifically, we focused on the agencies' experiences and lessons learned
related to

o 	assessing current and future agency skill and competency requirements
and identifying related training and development needs,

o 	developing strategies and solutions for training and development needs,
and

o 	determining methods to evaluate the effectiveness of training and
development programs.

For this review, lessons learned were defined as knowledge that could be
applied in the future that the agencies gained through either positive or
negative experiences.

To address this objective we focused on five agencies: the U.S. Army Corps
of Engineers (USACE), Department of Defense; the U.S. Fish & Wildlife
Service (FWS), Department of the Interior; the Internal Revenue Service
(IRS), Department of the Treasury; the U.S. Office of Personnel Management
(OPM); and the Veterans Health Administration (VHA), Department of
Veterans Affairs. We chose the five agencies for a variety of reasons,
including the diversity of employee occupations within the agencies,
reported innovative approaches for training and developing their
employees, and congressional requester interest. We selected USACE to
obtain a Department of Defense perspective in light of that department's
reputation as a leader in the area of training and developing military
personnel. We included FWS and VHA to obtain information related to a
broad mix of employee occupations. We selected IRS because of reported
innovative approaches to training and development and included OPM because
of its role as the federal government's human capital agency.

To obtain information and related documentation, we visited the following
locations:

o 	USACE's headquarters human resources directorate in Washington, D.C.,
and its Professional Development Support Center in Huntsville, Alabama.

o 	FWS's headquarters human resources division in Arlington, Virginia, and
its National Conservation Training Center in Shepherdstown, West Virginia.

o 	IRS's strategic human resources division in Arlington, Virginia, and
the small business and self-employed business line.

o 	OPM's headquarters human resources office and the career development
branch of the center for retirement and insurance service.

o 	VHA's Employee Education System headquarters in Washington, D.C., and
the medical center and local network offices in Durham, North Carolina.

It is important to note that our methodology was not designed to identify
examples that would be representative of all training and development
efforts at the five agencies in our review or of the government as a
whole. We did not verify the accuracy and reliability of the data provided
to us or the systems used to produce the information. Further, in citing
training and development programs as examples in connection with lessons
learned, we did not assess the effectiveness of the training programs and
practices. Rather, our intent was to highlight and briefly describe some
experiences and lessons learned that agency officials believed helped each
agency improve or enhance its training and development programs.

To obtain information about the five agencies' experiences and lessons
learned related to designing training and development programs, we

o 	Interviewed agency human capital and training officials and subject
matter experts responsible for agency training, performance, and other
initiatives; and

o 	Reviewed and analyzed agency documents such as workforce plans,
analyses, and reports; strategic, performance, and succession plans and
reports; organizational, occupational, and unit-based competency
standards; knowledge and skills inventories; skills gaps assessments;
competency and skill assessments; surveys of agency employees; training
plans and proposals; workforce demographic data; budget data; evaluation
plans and reports; and performance measures.

The lessons learned we identified for inclusion in this report were based
on (1) their linkages with one or more of the eight core characteristics
of a strategic training and development process, which we had identified in our
previous work (see app. III) and (2) sufficient evidence from the agencies
to support the experiences that they relayed to us. At the exit
conferences for the five agencies, we presented agency officials with the
list of lessons learned that we had identified and wished to attribute to
their experiences. At that time, we also informed each agency of the
specific examples from their experiences that we would likely attribute to
these lessons learned. In these meetings, agency officials expressed no
objections to the lessons learned we had identified and in some cases
provided additional information to support specific examples from their
experiences that we proposed to use for this report.

We conducted our audit work between August 2002 and November 2003 in
accordance with generally accepted government auditing standards.

Appendix II

Background on Selected Agencies and Their Training and Development
Functions

The following summarizes key information on the five agencies included in
this review. These summaries include information on the agencies'
missions, organizational structures, and training and development
functions.

U.S. Army Corps of Engineers

The U.S. Army Corps of Engineers (USACE), part of the Department of the
Army within the Department of Defense, comprises approximately
34,600 civilian and 650 military men and women. USACE has a diverse
workforce consisting of military and civilian engineers, biologists,
geologists, hydrologists, natural resource managers, and other specialists
who work in engineering and environmental matters. USACE's mission is to
provide engineering services to the nation: (1) planning, designing,
building and operating water resources and other civil works projects, (2)
designing and managing the construction of military facilities for the
Army and Air Force, and (3) providing design and construction management
support for other Department of Defense and federal agencies. USACE's
headquarters office is located in Washington, D.C.

USACE's Professional Development Support Center, located in Huntsville,
Alabama, serves as the center of learning and training for the agency. The
training center manages and implements the Proponent-Sponsored Engineer
Corps Training program, which provides job-related training through
technical, professional, managerial, and leadership courses for USACE and
other government agencies. USACE's training center offers more than 200
courses covering topics that support the agency's mission.

U.S. Fish and Wildlife Service

The mission of the U.S. Fish and Wildlife Service (FWS), a component of
the Department of the Interior, is working with others to conserve,
protect, and enhance fish, wildlife, and plants and their habitats for the
continuing benefit of the American people. FWS's headquarters is located
in Washington, D.C., while its field units are located throughout the
United States. FWS employs more than 9,600 people and is supported by a
volunteer force of 29,000. Nearly 90 percent of FWS employees work in
field locations.

The National Conservation Training Center, located in Shepherdstown, West
Virginia, is FWS's training center and is responsible for training a wide
range of employees in the conservation community and serves as a gathering
place where conservation professionals from government, nonprofit
organizations, and corporations work toward common goals.

Training for FWS's law enforcement personnel is primarily conducted
through the Federal Law Enforcement Training Center, an interagency law
enforcement training organization headquartered in Glynco, Georgia.

Internal Revenue Service

The Internal Revenue Service (IRS) is a branch
of the U.S. Department of the Treasury. IRS's mission is to provide
America's taxpayers top quality service by helping them understand and
meet their tax responsibilities and by applying the tax law with integrity
and fairness to all. IRS's organizational structure includes the following
business units: four operating divisions organized around four major
customer segments (Wage and Investment, Small Business/Self-Employed,
Large and Mid-Size Business, and Tax Exempt and Government Entities); four
functional divisions (National Taxpayer Advocate, Appeals, Criminal
Investigation, and Communications and Liaison); and two shared
services/support divisions (Agency-Wide Shared Services and Modernization
and Information Technology Services). As of March 2003, IRS had about
116,300 employees.

IRS takes a decentralized approach to training and developing its
workforce. Each business unit has an embedded human resources component
that provides advice and analysis on related policies and issues and
formulates strategies, procedures, and practices to address the unit's
human capital needs. Learning and Education, one of eight major divisions
that make up IRS's Office of Strategic Human Resources, provides guidance
and sets policy and standards on training and development for the agency's
business units and headquarters offices.

U.S. Office of Personnel Management

The U.S. Office of Personnel Management (OPM), the federal government's
human capital agency, provides human resources policy leadership,
technical advice and assistance, and products and services to federal
agencies, employees, annuitants, and job seekers. It also oversees
governmentwide compensation and performance management systems, and
provides retirement, health benefit, and other insurance services to
federal employees, annuitants, other beneficiaries, and agencies. In March
2003, OPM completed a major restructuring process through which it
consolidated various agency functions. As of March 2003, OPM employed
approximately 3,500 people, many of them stationed in agency headquarters
in Washington, D.C. OPM has a field presence in 16 major U.S. cities as
well as operating centers in Pennsylvania and Georgia.

OPM's training and development efforts are largely decentralized to the
agency's various program and staff offices. The employee training and
development unit within the agency's human capital management office is
responsible for setting overall strategy and for planning and implementing
agencywide training such as leadership development programs and various
mandatory training programs. According to OPM, the agency's newly
established Chief Human Capital Officer plays a significant role in
advising the OPM Director on overall employee training and development
initiatives and programs, as well as on the establishment of the agency's
training budget.

                         Veterans Health Administration

The Veterans Health Administration (VHA), one of three major
administrations within the Department of Veterans Affairs (VA), is
responsible for providing primary care, specialized care, and related
medical and social support services to veterans through an integrated
health care system. VHA administers its functions through a group of 21
regional network offices located around the United States. As of March
2003, VHA employed about 203,500 people out of a total VA workforce of
about 225,000 employees.

VA takes a decentralized approach to training and development operations.
VA's human resources office provides advice and guidance on training to
VHA and the other departmental components but delegates training and
development operations to each component. VHA's organization includes the
Employee Education System, which is an internal training consulting group
that provides educational services that support the workforce development
and continuing education needs of VHA employees. This internal consulting
group of about 300 individuals primarily helps to assess agency training
needs at the national level, while VHA network offices and medical centers
take lead responsibility for assessing their own local needs. These
internal training consultants are available to assist VHA network offices
and medical centers in designing, developing, implementing, and evaluating
training and development programs to meet these local needs.

Appendix III

Core Characteristics of a Strategic Training and Development Process

The following summarizes the eight core characteristics that make a
training and development process effective and strategically focused on
achieving results. We identified these core characteristics as part of our
recent work in developing an assessment guide to assist federal agencies
in evaluating their training and development efforts.1

o 	Strategic alignment. Clear linkages exist between the agency's mission,
goals, and culture and its training and development efforts. The agency's
mission and goals drive a strategic training and development approach and
help ensure that the agency takes full advantage of an optimal mix of
strategies to improve performance and enhance capacity to meet new and
emerging challenges.

o 	Leadership commitment and communication. Agency leaders and managers
consistently demonstrate that they support and value continuous learning,
are receptive to and use feedback from employees on developmental needs
and training results, and set the expectation that fair and effective
training and development practices will improve individual and
organizational performance.

o 	Stakeholder involvement. Agency stakeholders are involved throughout
the training and development process to help ensure that different
perspectives are taken into account and contribute to effective training
and development programs. Stakeholders' views are incorporated in
identifying needed performance enhancements, developing and effectively
implementing well-thought-out strategies, and helping to conceptualize and
use balanced measures that accurately reflect the extent to which training
and development efforts contribute toward achieving results.

o 	Accountability and recognition. Appropriate accountability mechanisms,
such as performance management systems, are in place to hold managers and
employees responsible for learning and working in new ways. Appropriate
rewards and incentives exist and are used fairly and equitably to
encourage innovation, reinforce changed behaviors, and enhance
performance.

o 	Effective resource allocation. The agency provides an appropriate level
of funding and other tools and resources, along with external expertise
and assistance when needed, to ensure that its training and development
programs reflect the importance of its investment in human capital to
achieving its mission and goals.

1 GAO-03-893G, p. 75.

o 	Partnerships and learning from others. Coordination within and among
agencies achieves economies of scale and limits duplication of effort.
Together with benchmarking against high-performing organizations, such
coordination allows an agency to keep abreast of current practices,
enhance efficiency, and increase the effectiveness of its training and
development programs.

o 	Data quality assurance. The agency has established policies and
procedures that recognize and support the importance of quality data and
of evaluating the quality and effectiveness of training and development
efforts. It establishes valid measures and validated systems to provide
reliable and relevant information that is useful in improving the agency's
training and development efforts.

o 	Continuous performance improvement. Agency practices and policies
foster a culture of continuous improvement and optimal organizational
performance regarding training and other activities. Stakeholders rely on
and use program performance information and other data to assess and
refine ongoing training and development efforts; target new initiatives to
improve performance; and design, develop, and implement new approaches to
train and develop employees.

Appendix IV

Comments from the Department of the Interior

Note: GAO comments supplementing those in the report text appear at the
end of this appendix.

The following is GAO's comment on the Department of the Interior's letter
dated January 23, 2004.

GAO Comment	We have clarified the mission statement of the Fish and
Wildlife Service to note its collaboration with others to accomplish its
mission.

                                   Appendix V

                   Comments from the Internal Revenue Service

Appendix VI

Comments from the Office of Personnel Management

Appendix VII

                     GAO Contacts and Staff Acknowledgments

GAO Contacts George H. Stalcup or Susan Ragland, (202) 512-6806

Acknowledgments	In addition to the persons named above, K. Scott Derrick,
Gerard Burke, T.J. Thomson, and Thomas Davies, Jr. made key contributions
to this report.

GAO's Mission	The General Accounting Office, the audit, evaluation and
investigative arm of Congress, exists to support Congress in meeting its
constitutional responsibilities and to help improve the performance and
accountability of the federal government for the American people. GAO
examines the use of public funds; evaluates federal programs and policies;
and provides analyses, recommendations, and other assistance to help
Congress make informed oversight, policy, and funding decisions. GAO's
commitment to good government is reflected in its core values of
accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony

The fastest and easiest way to obtain copies of GAO documents at no cost
is through the Internet. GAO's Web site (www.gao.gov) contains abstracts
and full-text files of current reports and testimony and an expanding
archive of older products. The Web site features a search engine to help
you locate documents using key words and phrases. You can print these
documents in their entirety, including charts and other graphics.

Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site daily. The list contains links to the full-text document files.
To have GAO e-mail this list to you every afternoon, go to www.gao.gov and
select "Subscribe to e-mail alerts" under the "Order GAO Products"
heading.

Order by Mail or Phone	The first copy of each printed report is free.
Additional copies are $2 each. A check or money order should be made out
to the Superintendent of Documents. GAO also accepts VISA and Mastercard.
Orders for 100 or more copies mailed to a single address are discounted 25
percent. Orders should be sent to:

U.S. General Accounting Office
441 G Street NW, Room LM
Washington, D.C. 20548

To order by phone:
Voice: (202) 512-6000
TDD: (202) 512-2537
Fax: (202) 512-6061

To Report Fraud, Waste, and Abuse in Federal Programs

Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: [email protected]
Automated answering system: (800) 424-5454 or (202) 512-7470

Public Affairs	Jeff Nelligan, Managing Director, [email protected], (202) 512-4800
U.S. General Accounting Office, 441 G Street NW, Room 7149,
Washington, D.C. 20548

*** End of document. ***