
United States Department of Transportation
Intelligent Transportation Systems
Integration Program
Self-Evaluation Guidelines

Prepared for:

Joseph I. Peters, Ph.D.
Manager, ITS Program Assessment
U.S. DOT ITS Joint Program Office
Federal Highway Administration (HVH-1)
400 7th Street, SW
Washington, DC 20590

Prepared by:

Science Applications International Corporation

February 2001

Foreword

The ITS Integration Program is being conducted to accelerate the integration and interoperability of Intelligent Transportation Systems in metropolitan and rural areas. Projects funded under this program are generally intended to improve transportation efficiency, promote safety, improve traffic flow, reduce emissions, improve traveler information, and promote tourism. An important element of this program is to assess how well the projects perform at meeting these goals and to share their experiences with others. Therefore, all projects under this program are required to perform a self-evaluation funded from project resources and are required to annually submit cost information. This document presents guidelines for the conduct of the required self-evaluation study.

The material presented herein is designed to assist local agencies and project partnerships in preparing the products and deliverables required through self-evaluations. The document offers guidance to local agencies in developing a self-evaluation process and offers examples and guidelines to consider when preparing for a self-evaluation. Finally, this document describes the process to be used in submitting final self-evaluation products and reports. Each project will be able to submit its self-evaluation products electronically through an internet-based system.

I. Introduction

A. Background and Purpose

The ITS Integration Program is being conducted to accelerate the integration and interoperability of intelligent transportation systems in metropolitan and rural areas. An important element of the program is to assess how well the selected projects perform at meeting program goals and to share their experiences with others. Consequently, each recipient of ITS Integration Program funds is required to perform a self-evaluation. Each recipient of funding must submit a Local Evaluation Report documenting the lessons learned in meeting project goals and objectives. This report shall address key aspects of the project and, to the extent possible, assess impacts on the relevant outcome measures (i.e., mobility, safety, efficiency, productivity, and energy and emissions).

In addition to the preparation of a Local Evaluation Report, recipients of ITS Integration Program Funds are required to collect and document cost data on an annual basis. Cost data collection guidelines and other resources for supporting evaluations are available at the ITS Joint Program Office Web site.

The purpose of this document is to guide the conduct of a self-evaluation leading to the preparation of a Local Evaluation Report. This document offers guidance to local agencies and project partnerships in developing a self-evaluation process, and offers examples and guidelines to consider when preparing for self-evaluation. As project implementation proceeds, the self-evaluation is expected to address how well the project meets its goals and objectives.
Additionally, based on the project's evaluation strategy, two or more of the following evaluation activities shall be undertaken as part of the self-evaluation process and documented in the Local Evaluation Report:

  • Evaluation of institutional issues associated with achieving cooperation among public sector agencies and documenting how they were overcome.
  • Development of a brief lessons learned report on the technical and institutional issues encountered in integrating ITS components.
  • Development of an evaluation report on the lessons learned in employing innovative financing or procurement and/or public-private partnering techniques.
  • Preparation of a lessons learned report on the experiences, challenges, and approaches used in achieving consistency with the National ITS Architecture and/or implementation of ITS standards.
  • Production of a case study on the planning process used to achieve integration into an approved plan and program developed under an area-wide (statewide and/or metropolitan) planning process which also complies with applicable State air quality implementation plans.
  • Description of how the metropolitan planning process was provided with data generated by ITS technologies and services and a report outlining plans or intentions for archiving the data and using it.

B. Self-Evaluation Process

The following steps comprise the self-evaluation process:

  • Form the Evaluation Team. Each of the project partners and stakeholders designates one member to participate on the evaluation team. The program manager should designate an evaluation team leader. Experience has demonstrated that formation of this team early in the project is essential to facilitating evaluation planning along a "no surprises" path. Participation by every project stakeholder is particularly crucial during the development of the "Evaluation Strategy."
  • Develop the Evaluation Strategy. A major purpose of the self-evaluation strategy is to focus partner attention on identifying the goal areas that have priority for their project. ITS goal areas include traveler safety, traveler mobility, transportation system efficiency, productivity of transportation providers, and conservation of energy and protection of the environment. Partners may assign ratings of importance to the goal areas and then establish evaluation priorities consistent with those ratings.

    The additional evaluation activities to be performed also need to be revisited during the evaluation strategy phase. These activities were chosen by project management and listed in the project description package submitted as part of the application for ITS Integration Program funds.
  • Develop the Self-Evaluation Plan. The evaluation plan builds upon the goals and priorities set during the evaluation strategy phase. The evaluation plan should identify hypotheses to be tested during the evaluation. Hypotheses are "if-then" statements about the expected outcomes after the project is deployed. For example, a possible goal of coordinating traffic signals across jurisdictions is improving safety by reducing rear-end collisions. If the evaluation strategy included this goal, the evaluation plan would formulate a hypothesis that can be tested. The hypothesis might be, "If jurisdictions coordinate traffic signals, then rear-end collisions will be significantly reduced at intersections near jurisdictional boundaries." The evaluation plan identifies all such hypotheses and then outlines the different tests that might be needed to test the hypotheses.

    The evaluation plan should also list the additional evaluation activities that will be included in the Local Evaluation Report. More information on performing these activities is listed under Section III.
  • Collect and analyze data and information. This step is the implementation of the evaluation strategy. It is in this phase that careful cooperation between partners and evaluators can save money. For example, through early planning, it is possible to build automatic data collection capabilities into the ITS project. Partners can continue data collection after the evaluation is completed to provide valuable feedback on the performance of the system. Such feedback can help in detecting system failures and in improving system performance.
  • Document strategy, plans, results, conclusions, and recommendations in a Local Evaluation Report. The final product of the self-evaluation effort is a self-standing Local Evaluation Report that describes the project and documents the findings from the evaluation and the selected additional evaluation activities.
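The hypothesis in the evaluation-plan step above ("rear-end collisions will be significantly reduced") can be illustrated with a simple before/after comparison of crash rates. The sketch below is not part of the guidelines: the crash counts, exposure figures, and significance threshold are hypothetical, and it applies a standard two-sample Poisson rate z-test.

```python
from math import sqrt, erf

def normal_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def crash_rate_test(before_crashes, before_exposure, after_crashes, after_exposure):
    """One-sided z-test that the crash rate decreased after deployment.

    Exposure is in million vehicle-miles (MVM). Crash counts are assumed
    Poisson-distributed, so the variance of each rate is count / exposure^2.
    """
    rate_before = before_crashes / before_exposure
    rate_after = after_crashes / after_exposure
    std_err = sqrt(before_crashes / before_exposure ** 2 +
                   after_crashes / after_exposure ** 2)
    z = (rate_before - rate_after) / std_err
    p_value = 1.0 - normal_cdf(z)  # small p supports "rate decreased"
    return rate_before, rate_after, z, p_value

# Hypothetical data: 120 rear-end crashes over 10 MVM before signal
# coordination across jurisdictions, 90 crashes over 10 MVM after.
r_before, r_after, z, p = crash_rate_test(120, 10.0, 90, 10.0)
print(f"before: {r_before:.1f}/MVM, after: {r_after:.1f}/MVM, "
      f"z = {z:.2f}, p = {p:.3f}")
```

At a conventional 0.05 level this hypothetical result would support the hypothesis; a real evaluation would also need to control for traffic growth, seasonality, and regression to the mean.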

C. Submitting the Deliverables

An ITS Integration Program self-evaluation progress system has been developed to assist the ITS Joint Program Office in tracking deliverables. The system is accessible via the World Wide Web at http://www.itsevaluation.net. Each Earmark project should access this Web site to upload project deliverables, including the Local Evaluation Report, cost information, and project point-of-contact information.

II. Local Evaluation Report

The Local Evaluation Report should describe the project, define the goals of the project, and document how the goals were (or were not) achieved. The report should address not only the technical issues involved in the project but also the institutional issues encountered.

Projects deploying intelligent metropolitan or rural infrastructure are expected to allocate resources to evaluate the impact (or impacts) their projects have in certain major goal areas.
ITS goal areas have traditionally included:

  • Traveler safety
  • Traveler mobility
  • Transportation system efficiency
  • Productivity of transportation providers
  • Conservation of energy and protection of the environment
  • Others as may be appropriate to unique features of a project

Each of these goal areas can be associated with outcomes of deployment that lend themselves to measurement. These outcomes resulting from project deployment are identified as measures. The association of goal areas and measures is depicted as follows:

Table 1. Measures of Effectiveness within Each Goal Area

Safety
  • Reduction in the Overall Rate of Crashes
  • Reduction in the Rate of Crashes Resulting in Fatalities
  • Reduction in the Rate of Crashes Resulting in Injuries
Mobility
  • Reduction in Delay
  • Reduction in Transit Time Variability
  • Improvement in Customer Satisfaction
Efficiency
  • Increases in Freeway and Arterial Throughput or Effective Capacity
Productivity
  • Cost Savings
Energy and Environment
  • Decrease in Emissions Levels
  • Decrease in Energy Consumption

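Two of the mobility measures in Table 1, reduction in delay and reduction in transit time variability, can be computed directly from before/after travel-time samples. The sketch below is illustrative only and uses hypothetical travel times:

```python
import statistics

def percent_reduction(before, after):
    """Percent reduction from a before value to an after value."""
    return 100.0 * (before - after) / before

# Hypothetical corridor travel times in minutes, before and after deployment.
before_times = [10, 12, 14, 16, 18]
after_times = [9, 10, 11, 12, 13]

# Reduction in Delay: compare mean travel times.
delay_reduction = percent_reduction(statistics.mean(before_times),
                                    statistics.mean(after_times))

# Reduction in Transit Time Variability: compare standard deviations.
variability_reduction = percent_reduction(statistics.stdev(before_times),
                                          statistics.stdev(after_times))

print(f"delay reduced {delay_reduction:.1f}%, "
      f"variability reduced {variability_reduction:.1f}%")
```

In practice the samples would come from probe vehicles, transit AVL logs, or detector data collected in the "collect and analyze" step of the self-evaluation process.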
The "few good measures" in the preceding table constitute the framework of benefits expected to result from deploying and integrating ITS technologies. Other projects may have goals that fall outside of the traditional "few good measures", and may include the following:

  • Deployment of infrastructure required to support ITS
  • Creation of a regional architecture
  • Creation of a system to archive data

Goals need to be identified for each individual project based on the type of project being deployed. In cases where the traditional "few good measures" are not applicable, the evaluation should document how well the project met the goals of the deployment set forth in the project's application for ITS Integration Program funds. Potential areas for evaluation include the following:

  • Implications of achieving consistency with the National ITS Architecture
  • Standards implementation
  • Consumer acceptance
  • Institutional issues
  • Others as appropriate to local considerations

An area of special emphasis should be the non-technical factors influencing project performance. ITS projects have been profoundly influenced by considerations such as procurement practices, contracting policy, organizational structure, and relationships among major participants such as prime contractors and their subcontractors. The transportation community stands to reap significant benefit from understanding how this varied range of non-technical factors directly affects traditional project performance parameters such as cost, schedule, and final functionality.

Table 2. Suggested Draft Outline for a Local Evaluation Report

Executive Summary
Project Description
- Project background
- Level and types of integration
- Institutional involvement
Evaluation Plan
- Goals, objectives, and measures of effectiveness
- Hypotheses
- Additional elective activities being performed (two of a possible six are required)
Evaluation Findings
- Project outcome based on measures of effectiveness
- Lessons learned report
- Institutional issues
- Findings from the additional evaluation activities

The additional evaluation activities are described in Section III of this document. These activities can be included as part of the Local Evaluation Report or can be separate documents. Previous evaluations can serve as examples for preparing the Local Evaluation Report. Evaluation reports can be found in the National Transportation Library.

III. Additional Evaluation Activities

Two or more of the following evaluation activities are required in addition to the Local Evaluation Report.

  • Evaluating institutional issues associated with achieving cooperation among public sector agencies and documenting how they were overcome.
  • Providing a brief lessons learned report on the technical and institutional issues encountered in integrating ITS components.
  • Providing an evaluation report on the lessons learned in employing innovative financing or procurement and/or public-private partnering techniques.
  • Producing a lessons learned report on the experiences, challenges, and approaches used in achieving consistency with the National ITS Architecture and/or implementation of ITS standards.
  • Producing a case study on the planning process used to achieve integration into an approved plan and program developed under an area-wide (statewide and/or metropolitan) planning process which also complies with applicable State air quality implementation plans.
  • Providing the appropriate metropolitan planning process with data generated by ITS technologies and services, and provide a report on plans or intentions for archiving the data and using it.

These activities may be included as part of the Local Evaluation Report, or may be submitted individually. If submitted as individual, stand-alone documents, each report should also include a brief description of the project. The following are guidelines for performing each of the six evaluation activities.

Activity 1. Evaluating institutional issues associated with achieving cooperation among public sector agencies and documenting how they were overcome.

Evaluating the institutional issues of achieving cooperation among public sector agencies involves investigating organizational, jurisdictional, financial, and/or regulatory/legal issues. This evaluation typically focuses on the non-technical impediments and challenges that were encountered among the public sector agencies and how those impediments were overcome. On a more positive note, ideas and processes that were conducive to the project should also be documented. The evaluation of institutional issues should result in lessons learned that will assist other agencies looking to deploy ITS.

Examples of organizational/jurisdictional challenges may include clarification of participant responsibilities, role expectations, staffing levels, and other inter-agency partnership issues. Each partnering agency brings to the project its own set of goals and values. Reconciling these differences may present a barrier to the project. Financial issues may also affect the project. These issues include procurement processes, cost-sharing issues, or other financial-related complexities. Various agencies may have different regulatory/legal responsibilities such as dissimilar contracting and auditing responsibilities that may pose challenges to cooperation between agencies.

The following institutional issues have been seen in other deployments of intelligent transportation systems and should be addressed in this evaluation to the extent that they affected each project. The issues listed are not comprehensive. Different institutional issues may arise in each individual project and should be addressed accordingly. The issues include:

  • Organizational Issues
  • Human Resource Issues
  • Public Acceptance Issues
  • Regulatory/Legal Issues
  • Financial Issues
  • Other Issues

There are common questions that should be asked when investigating institutional issues, and they are listed below. These questions are not exhaustive, but are meant to guide and to provide examples of the types of questions to ask project partners.

  • What institutional impediments did the project participants encounter while establishing the partnership?
  • Where in the life-cycle of the project did the impediments occur?
  • What were the causes of the impediments and how were they overcome?
  • How were the different missions of each of the partnering organizations merged?
  • How was ITS facilitated within each of the partnering organizations?
  • How did you get buy-in from all the partners on the project's goals and objectives?
  • How do each of the partners assess the risks and benefits of the project?
  • What are each of the partners' roles and responsibilities?
  • How were the roles and responsibilities defined? How were they made clear to each of the partners?
  • Is there a regional steering committee or working group that oversees the project's activities? Please describe how the group works, and if any benefits were realized from having the group. Did the partnering agencies dedicate staff or other resources as necessary?

The data gathering process for institutional issues studies tends to focus on identifying impediments and problems encountered during the different phases of the project. The results of these studies not only document key elements of the study (e.g., history of the project, system description, partnership agreement) and the issues and problems encountered but, more importantly, present lessons learned that can be applied to other deployments.

A variety of reports addressing institutional issues are available from the ITS Electronic Document Library. One such report that identified and evaluated the institutional structures and working relationships in the deployment and integration of ITS products and services is "Successful Approaches to Deploying a Metropolitan Intelligent Transportation System".

Activity 2. Providing a brief lessons learned report on the technical and institutional issues encountered in integrating ITS components.

This lessons learned report requires the identification of the technical and institutional issues encountered in integrating ITS components. The nine basic ITS components are freeway management, incident management, traffic signal control, transit management, electronic toll collection, electronic fare payment, highway-rail intersections, emergency management services, and regional multi-modal traveler information. Deploying integrated systems is inherently more complex and requires a higher level of coordination between different organizations than independent systems.

The focus of this activity is on technical and institutional challenges that impeded or were conducive to the progress of integrating ITS components. Examples of technical issues encountered may include standards and protocol compliance, hardware or software development issues, cost and budget constraints, integration of existing legacy systems with newly implemented systems, etc. Institutional issues may include organizational, jurisdictional, financial, and/or regulatory/legal issues similar to those described in Activity 1.

The following technical issues have been experienced in other ITS component integration projects and should be addressed in this evaluation, if applicable. The issues include:

  • Integration of legacy systems with new systems
  • Technology ahead of staff's training level
  • Use of standards and protocols
  • Software and/or hardware development
  • Insufficient or incompatible infrastructure

Integration between ITS components requires working through institutional issues to develop common goals and to meet the needs of participating organizations. Traditionally, the agencies in charge of each component area have not worked together and may come to the project with different goals and objectives. How these obstacles were overcome may be of benefit to others trying to integrate ITS components. The following institutional issues have been encountered in ITS component integration projects.

  • Adopting common goals and objectives
  • Gaining trust of each component's agency
  • Sharing of control
  • Sharing of infrastructure
  • Organization and management structure

There are common questions that should be asked when investigating integration issues, and they are listed below. These questions are not exhaustive, but are meant to guide and to provide examples of the types of questions to ask project partners.

  • What institutional impediments did the project participants encounter while working with different agencies to achieve integration?
  • Where in the life cycle of the project did the impediments occur?
  • What were the causes of the impediments and how were they overcome?
  • How were the different missions of each of the partnering organizations merged?
  • How was ITS facilitated within each of the partnering organizations?
  • How did you get buy-in from all the partners on the project's goals and objectives?
  • How do each of the partners assess the risks and benefits of the project?
  • What technical issues were encountered while integrating different components?
  • Were you able to apply lessons learned from similar deployments to your project's deployment?

The lessons learned should be succinctly described along with relevant conclusions and recommendations. A variety of lessons learned reports are available from the ITS JPO Web site at: http://www.its.dot.gov/library.htm. For example, the Seattle Wide-Area Information for Travelers (SWIFT) Institutional Issues Study investigated the institutional issues (e.g., policies, jurisdictional issues, internal and external factors) that affected the design, development, testing, deployment, and conduct of the SWIFT Field Operational Test (FOT). The study documented how these issues were overcome and what lessons could be learned. A good source on integration is the document titled Measuring ITS Deployment and Integration.

Activity 3. Providing an evaluation report on the lessons learned in employing innovative financing or procurement and/or public-private partnering techniques.

This activity requires identification of the lessons learned resulting from use of innovative financing, procurement and/or public-private partnering techniques and describing what was (or could be) done to improve the process. Both positive and negative lessons learned may be documented.

When focusing on innovative financing, lessons learned may be derived from a variety of areas. Examples include the use of financial leveraging tools, credit enhancement mechanisms, State Infrastructure Banks (SIBs), or other related areas. For innovative procurement, lessons learned may come from areas such as procurement processes, valuation of private resources, regulatory issues, etc. Finally, lessons learned for public-private partnering techniques may come from areas such as project management structure, organizational coordination, handling differing organizational priorities, structure of public-private partnerships, handling intellectual property rights, etc.

When documenting the lessons learned, the report should provide a brief background description of the project, evaluation strategy, and plans. The lessons learned should be succinctly described along with relevant conclusions and recommendations. To examine other lessons learned reports visit the ITS JPO Web site at: http://www.its.dot.gov/library.htm. One example, discussing issues related to public-private sharing of telecommunication resources can be reviewed in Shared Resources: Sharing Right-Of-Way for Telecommunications Guidance on Legal and Institutional Issues.

The following questions are recommended for use in preparing a report on lessons learned in employing innovative financing and procurement techniques:

  • What type of financing was used for the project?
  • What types of innovative contracting mechanisms were used (federal competitive process, state catalog, sole-source, design/build, or other)?
  • What funding sources (e.g., CMAQ) were used?
  • Will any federal aid funds be used for operations and maintenance?
  • Were there any software development issues concerning who retains intellectual property rights?
  • How were the procurement capabilities of the participants in the ITS project identified?
  • How were representatives from the participating agencies with the required procurement skills identified?
  • How was the agency or agencies with the capability to lead the procurement process for the ITS project selected?
  • Was a single point of contact used for the lead procurement agencies?
  • Were public safety and other non-traditional organizations included in the procurement process?
  • Were any lessons learned from other similar deployments applied to your deployment? If so, what were they?

Activity 4. Producing a lessons learned report on the experiences, challenges, and approaches used in achieving consistency with the National ITS Architecture and/or implementation of ITS standards.

This lessons learned report should identify the experiences, challenges, and approaches used to successfully implement ITS that is consistent with the National ITS Architecture and/or conforms to ITS standards.

The lessons learned may be derived from the experiences, challenges, and approaches used in conforming to standards, issues related to interoperability, standards training, policy issues, operations and maintenance issues, benefits of standards compliance, and/or other architecture/standards issues.

The lessons learned report should provide a brief background description of the project, evaluation strategy, and plans. The lessons learned should be succinctly described along with relevant conclusions and recommendations. For a brief discussion of experiences and challenges related to technical issues (e.g., standards, technology selection), see Section 3.5 of ITS Institutional and Legal Issues Program, Analysis of ITS Operational Tests, Findings and Recommendations. The report titled Seattle Wide-Area Information for Travelers (SWIFT) Institutional Issues Study is also an example of how to achieve consistency with the National ITS Architecture. Other lessons learned reports are available from the ITS JPO Web site at http://www.its.dot.gov/library.htm.

The following questions are recommended for use in preparing a report on lessons learned on the National ITS Architecture and/or standards implementation:

  • How was a regional/national architecture referenced and/or followed throughout the design of the project?
  • Was a preliminary deployment plan developed?
  • What strategies were developed to ensure consistency with the National Architecture?
  • Was interim guidance used to set a path for the project?
  • How did the use of standards affect the costs of the project over the project life cycle?
  • Were existing working relationships used to facilitate the deployment of ITS?
  • Were forums for transportation officials and public safety officials identified?
  • Were members of these groups approached to solicit their involvement in planning and developing ITS strategies?
  • What nontraditional public and private organizations were asked to participate in the project?
  • Were previous ITS and other transportation studies identified and referenced?
  • What types of plans were developed to ensure integration with existing systems?

Activity 5. Producing a case study on the planning process used to achieve integration into an approved plan and program developed under an area-wide (statewide and/or metropolitan) planning process which also complies with applicable State air quality implementation plans.

The case study should identify and describe ways that ITS was integrated into the transportation planning process and resulting transportation plans and/or transportation improvement programs. The case study provides a comprehensive view of the process and focuses on what actually occurred, key advantages/disadvantages, and current/future impacts on the planning process.

In addition to describing the case study methodology (evaluation team, strategy, plans, etc.), the report should provide background information about the planning process, observations or recommendations, and any conclusions. A variety of case studies are available from the ITS JPO Web site at: http://www.its.dot.gov. One example is Integrating ITS and Traditional Planning - Lessons Learned, I-64 Corridor Major Investment Study, which describes ideas and suggestions on how to integrate ITS and traditional planning practices.

Some questions to consider when developing a case study for this activity are as follows:

  • Were ITS projects given a reasonable weight compared with traditional construction projects when developing long range plans?
  • What ITS long range planning policies were used to guide TIP development?
  • What are the goals and objectives of ITS in the long range plan?
  • Describe relationships with partnering agencies as related to the planning process.
  • How were benefits/costs analyzed in selecting project priorities?
  • What performance measures can be used to evaluate projects and build knowledge from lessons learned?
  • Describe any business plan that may have been written for the ITS project.
  • Describe any incentives used to encourage private sector involvement in the project.

Activity 6. Providing the appropriate metropolitan planning process with data generated by ITS technologies and services, and provide a report on plans or intentions for archiving the data and using it.

This activity requires a description of the metropolitan planning process, the types of ITS data being generated, and how the data will be analyzed and stored. A useful description of the data may include the sources of ITS data, how and what type of data is being recorded, and data flow diagrams showing the flow from the source to data utilization to archive. A description of the analysis should include the models and/or analysis techniques to be used.
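As one possible illustration of an archiving approach (not prescribed by these guidelines), detector records could be stored in a relational database for later use by the planning process. The schema, station identifiers, and readings below are all hypothetical; the sketch uses Python's standard sqlite3 module.

```python
import sqlite3

# In-memory database for the sketch; a real archive would use a file or server.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE detector_readings (
        station_id TEXT NOT NULL,   -- detector station identifier
        observed_at TEXT NOT NULL,  -- ISO 8601 timestamp
        volume INTEGER,             -- vehicles counted in the interval
        avg_speed_mph REAL          -- mean speed over the interval
    )
""")

# Hypothetical five-minute readings from two stations.
readings = [
    ("S-101", "2001-02-01T07:00:00", 420, 54.2),
    ("S-101", "2001-02-01T07:05:00", 455, 51.8),
    ("S-102", "2001-02-01T07:00:00", 390, 58.0),
]
conn.executemany("INSERT INTO detector_readings VALUES (?, ?, ?, ?)", readings)

# Planners can later aggregate the archive, e.g., average speed per station.
rows = conn.execute("""
    SELECT station_id, AVG(avg_speed_mph)
    FROM detector_readings
    GROUP BY station_id
    ORDER BY station_id
""").fetchall()
print(rows)
```

A data flow diagram for the report would then show detectors feeding this archive, with the planning process querying aggregates such as the one above.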

Some questions to consider when developing a report for this activity are as follows:

  • How are data on existing systems currently archived?
  • Was a policy for the archiving of data in place or was one developed as a result of the decision to archive data?
  • What benefits data were used to determine the feasibility of the project?

For examples of reports on archiving data and analysis of ITS data check the ITS JPO Web site at http://www.its.dot.gov/library.htm.

IV. Additional Information

Self-evaluations are critical to the progress of ITS deployments. It is intended that these guidelines assist the evaluation team in planning and performing an evaluation that will produce results beneficial to the ITS community.

Additional information on evaluations can be found on the ITS Web site at http://www.its.dot.gov. The site has links to additional evaluation guidelines, including the TEA-21 ITS Evaluation guidelines and the Unit Cost Collection Guidelines.

 
