
4.0    Evaluation Findings

The evaluation findings are organized into three sections.  Section 4.1 reports the acceptance test results of the FOT voice routing system.  Section 4.2 presents an assessment of the performance of the data routing function, which shares OnStar crash data with MnDOT for further dissemination on CARS.  While the first two sections focus on quantitative evaluation of FOT performance, Section 4.3 provides findings and discussion based on interviews with the key stakeholders of the FOT, including the telematics service provider, PSAP operators, call takers, first responders, traffic operators, the Minnesota 9-1-1 program manager, and FOT team members.

4.1       TSP Voice Routing Field Test

The TSP voice routing field test was undertaken to determine whether the MnDOT MAYDAY/9-1-1 FOT solution demonstrates adequate reliability in routing TSP-initiated emergency calls to an appropriate PSAP with the vehicle location (in latitude and longitude) and a callback number.

 

Section 4.1.1 presents and discusses the test results.  Section 4.1.2 provides a summary of the findings of the testing.

4.1.1   Test Results

The field test consisted of 190 test calls made over eight days in mid-August 2005.  Analysis of the test results began by determining whether each call should be counted as an acceptance test record or as a non-counted record.  The official acceptance test calls were then further examined to determine whether they passed or failed acceptance.  Table 4-1 below summarizes the results of the acceptance test.

 

Table 4-1 contains a separate line for each PSAP in the evaluation, identifying its name, the date acceptance sampling was performed, and the planned number of calls for that PSAP.  It then shows the actual number of sample calls made with separate columns for the counts of calls that passed acceptance, failed acceptance, or were not counted toward the acceptance test.  The reasons for non-counted calls are discussed later in this section.  The table contains AACN transfer and Secondary PSAP Routing (Mayo) statistics where these capabilities were available.

 

 

Table 4-1.  Summary Results of FOT Acceptance Testing of Voice Routing Functions

| PSAP (County) | Date Sampled | Planned Sample Calls | Acceptance: Pass | Acceptance: Fail | Not Counted | AACN Data¹: Correct | AACN Data¹: Incorrect | AACN Data¹: Not Counted | Mayo Routing¹: Success | Mayo Routing¹: Failure |
|---|---|---|---|---|---|---|---|---|---|---|
| Kandiyohi | 8/10/2005 | 6 | 6 | 0 | 4 | 4 | 1³ | 1⁴ | | |
| Renville | 8/10/2005 | 7 | 7 | 0 | 3 | 6 | 0 | 1⁴ | | |
| McLeod | 8/11/2005 | 7 | 7 | 0 | 4 | 6 | 0 | 1⁴ | | |
| Meeker | 8/11/2005 | 7 | 7 | 0 | 8 | | | | | |
| Carver | 8/11/2005 | 7 | 7 | 0 | 3 | | | | | |
| Anoka | 8/12/2005 | 7 | 5 | 0 | 3 | | | | | |
| Anoka | 8/19/2005 | | 2 | 0 | 0 | | | | | |
| Hennepin | 8/12/2005 | 7 | 7 | 0 | 0 | | | | | |
| Minnetonka | 8/12/2005 | 6 | 6 | 0 | 1 | | | | | |
| Eden Prairie | 8/12/2005 | 7 | 7 | 0 | 1 | | | | | |
| Olmsted | 8/15/2005 | 7 | 7 | 0 | 3 | | | | 7 | 0 |
| Mower | 8/15/2005 | 7 | 6² | 0 | 2 | | | | 6 | 0 |
| Steele | 8/15/2005 | 7 | 7 | 0 | 0 | | | | 7 | 0 |
| Dakota | 8/15/2005 | 6 | 6 | 0 | 1 | | | | | |
| Minneapolis | 8/16/2005 | 6 | 6 | 0 | 3 | | | | | |
| Edina | 8/16/2005 | 7 | 7 | 0 | 0 | | | | | |
| Scott | 8/17/2005 | 7 | 7 | 0 | 1 | | | | | |
| Lakeville | 8/17/2005 | 7 | 7 | 0 | 1 | | | | | |
| Burnsville | 8/17/2005 | 6 | 6 | 0 | 0 | | | | | |
| Apple Valley | 8/17/2005 | 7 | 7 | 0 | 0 | | | | | |
| Eagan | 8/17/2005 | 6 | 6 | 0 | 1 | | | | | |
| St. Paul | 8/18/2005 | 7 | 7 | 0 | 3 | | | | | |
| Washington | 8/19/2005 | 6 | 7² | 0 | 1 | | | | | |
| Total | | 147 | 147 | 0 | 43 | 16 | 1 | 3 | 20 | 0 |

¹ Where applicable, from the acceptance calls.

² Missed last Mower call made up with one extra call in Washington.

³ The recorded AACN data does not match the reported call type.

⁴ PSAP operator did not have the telematics application open.

 

 

Table 4-1 shows the following:

 

 

The FOT passed acceptance testing with 94% confidence that the true system failure rate is no more than 4%.
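
As a cross-check, the confidence statement above can be illustrated with a simple binomial model.  The sketch below is not the FOT's actual acceptance-sampling calculation (which was defined in the acceptance test plan); it assumes each call is an independent pass/fail trial and computes the one-sided Clopper-Pearson upper bound on the failure rate given zero observed failures.

```python
# Illustrative only: an assumed binomial model, not the acceptance test plan's math.

def upper_bound_zero_failures(n_calls: int, confidence: float) -> float:
    """One-sided Clopper-Pearson upper bound on the failure rate when
    0 failures are observed in n_calls trials: solve
    (1 - p) ** n_calls = 1 - confidence for p."""
    return 1.0 - (1.0 - confidence) ** (1.0 / n_calls)

n = 147      # acceptance calls, all of which passed (Table 4-1)
conf = 0.94  # confidence level quoted in the report
print(f"94% upper bound on failure rate: {upper_bound_zero_failures(n, conf):.2%}")
# Prints about 1.90%, comfortably below 4%, so under this assumed model the
# report's statement is conservative.
```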

 

 

 

Data Quality Issues

Several calls were made where the reported latitude and longitude coordinates did not match their planned locations within the 0.01% criterion identified above.  These included:

 

 

For the first call, review of the data records revealed that the reported coordinates matched those of an attempted sample call made immediately prior at another location.  For the last three calls, the data forms contain notes specifying why the call was not made exactly at the planned location.  It is not known why the second call’s coordinates did not match the plan.  However, in all five cases the reported coordinates matched those in the Telcordia data logs.  This means that the reported latitude and longitude, while not matching the planned location, were consistent with the data actually sent by the OnStar test unit.  Since the acceptance test only applies to FOT functions, these calls were all judged to be successful.  In several other cases where the planned and reported coordinates did not match, the calls were not counted toward the acceptance test and were replaced by additional sample calls.  These points are discussed in greater detail below.

 

In seven instances, the longitude recorded on the data form was not a negative number.  (Locations with latitudes similar to the Minnesota test points but positive longitudes correspond to northern China near the border with Mongolia.)  This was considered a tolerated error made either by the data collector or by the PSAP operator, and these test calls were not counted as errors for the acceptance test.  In three other instances, a minor omission or transcription error on the data collection forms was not considered to constitute a test failure.  These issues support the value of automatic data transmission, rather than verbal communication, with regard to accuracy.
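
The two checks described above lend themselves to simple automation.  The following sketch is illustrative only and was not part of the FOT software; the 0.01% relative tolerance comes from the criterion above, while the Minnesota bounding box used for the sign check is an assumption made for this example.

```python
def within_criterion(reported: float, planned: float, rel_tol: float = 0.0001) -> bool:
    """True if the reported coordinate matches the planned value within 0.01%."""
    return abs(reported - planned) <= abs(planned) * rel_tol

def plausible_minnesota_fix(lat: float, lon: float) -> bool:
    """Rough sanity check: Minnesota spans roughly 43-49.5 N and 89-97.5 W,
    so a positive recorded longitude indicates a dropped minus sign."""
    return 43.0 <= lat <= 49.5 and -97.5 <= lon <= -89.0

# A form entry with the minus sign dropped passes the tolerance check against
# itself but fails the sign check (hypothetical coordinates).
assert within_criterion(44.9780, 44.9778)
assert not plausible_minnesota_fix(44.9780, 93.2650)  # should be -93.2650
```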

 

 

Spatial and Temporal Distribution of Sample Calls

Although not required, it was considered desirable to have calls made in diverse geographic regions throughout the FOT area.  Figure 4-1 below shows the sample locations of the calls as relayed by the PSAPs and recorded on the data collection forms.  Note that the desired diversity of locations appears to have been achieved.  It should be pointed out that the PSAPs included both county areas and towns/cities (e.g., Minneapolis, Apple Valley).  In the towns and cities, the smaller geographic area necessarily forced the sample locations to be more closely clustered.

 

[Map showing the locations of the FOT voice routing test calls in the 22 participating PSAPs.]

Figure 4-1.  Spatial Distribution of FOT Voice Routing Acceptance Test Calls

As with geographic diversity, it was desired, but not required, that calls be placed at different times of day.  It was not operationally feasible to make calls outside normal working hours; however, the calls made were distributed over many hours of the day.  Figure 4-2 illustrates this through a histogram of the 147 acceptance test calls.

 

[Bar chart showing that the FOT voice routing test calls were distributed during the day between 9 AM and 4 PM.]

Figure 4-2.  Temporal Distribution of FOT Acceptance Testing of Voice Routing Functions

Non-Counted Calls

In addition to the acceptance sample calls, a number of additional calls were placed as part of the acceptance test in which the full acceptance criteria were not satisfied.  Upon investigation, it was determined in each of these cases that the reason for the failure was outside the control of the FOT.  By agreement in the original acceptance test plan, these calls were categorized as neither successes nor failures with regard to the acceptance test.  Though not included in the evaluation of acceptance, the issues encountered on these calls are nevertheless informative to the FOT.  There were 43 such calls throughout the acceptance test period, and they fall into a small number of categories.  Each category and its frequency of occurrence is provided below.

 

1)   An incorrect latitude/longitude of (0,0) transmitted by the portable OnStar unit (13 occurrences) – This issue is related to turning on the external, portable OnStar unit for the first time after a shutdown.  The issue was identified before the acceptance test, and provisions were made so that OnStar would not try to forward calls that have a (0,0) value for latitude and longitude (a sketch of such a guard appears after this list).

2)   Call was not picked up by OnStar Emergency Advisors and eventually timed out (20 occurrences) – The reason for each occurrence of this problem is not known, but it may have been due to routing problems internal to OnStar and/or congestion in the OnStar call center, where the test calls (by design) had lower priority than true OnStar emergency calls.  Regardless, it was agreed that this issue was not under the control of the FOT.

3)   OnStar Emergency Advisor’s call encountered a busy signal (1 occurrence) – It was discovered that the Telcordia primary rate interface (T1 line) to Verizon was not working, which prevented OnStar from calling Telcordia.  This line is outside the FOT’s control.

4)   GPS coordinates passed through OnStar were from a previous call (4 occurrences) – This issue is related to a warm start (i.e., like cycling the ignition) of the OnStar unit.  It occurs when a GPS lock has not been established and the OnStar unit transfers the last known latitude and longitude coordinates.  In the acceptance test, this even resulted in calls being routed to the wrong PSAP when the MnDOT staff person had moved from one PSAP area to another and the first call in the new area transmitted GPS coordinates for the last location in the previous area.  By verifying that the coordinates reported by the PSAP matched those in the Telcordia data transmission logs, it was determined that the FOT process was operating properly.  Therefore, these sample calls could legitimately be considered to pass acceptance, which did occur on a few occasions (see Data Quality Issues above).  The non-counted occurrences of this phenomenon were cases in which the evaluation team elected to repeat or replace the sample calls with others that did not have this issue.

5)   ALI record (with latitude and longitude) was not received at the PSAP (2 occurrences – Anoka County PSAP, August 12, 2005) – Investigation of this issue provided no specific reason for the failed transmissions, except that they were shown not to be attributable to the FOT.  Telcordia’s logs show the LAT/LON data were received from OnStar and were subsequently sent to the ALIs.  It is therefore assumed that the failure occurred at the PSAPs or ALIs, which is outside the control of the FOT.

6)   OnStar incorrectly routed the call (3 occurrences) – This appears to have been an OnStar training issue and is not attributable to the FOT.
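
The following is a minimal sketch of the forwarding guard referenced in item 1, extended to flag the stale-coordinate condition described in item 4.  The function name and return values are hypothetical; they illustrate the logic described above rather than OnStar’s actual implementation.

```python
from typing import Optional, Tuple

Coords = Tuple[float, float]  # (latitude, longitude)

def forwarding_decision(current: Coords, previous: Optional[Coords]) -> str:
    lat, lon = current
    if lat == 0.0 and lon == 0.0:
        # Item 1: no GPS lock after a cold start; do not forward the call.
        return "suppress"
    if previous is not None and current == previous:
        # Item 4: coordinates identical to the prior call suggest a stale fix
        # after a warm start; forward, but flag the location for verification.
        return "forward-with-flag"
    return "forward"

print(forwarding_decision((0.0, 0.0), None))                      # suppress
print(forwarding_decision((44.978, -93.265), (44.978, -93.265)))  # forward-with-flag
print(forwarding_decision((44.978, -93.265), (45.100, -93.300)))  # forward
```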

4.1.2   Summary of Findings

4.2       ACN/AACN Data Routing Evaluation

The data routing portion of the FOT sought to develop and test a system for a telematics service provider (e.g., OnStar) to push ACN and AACN data to a MnDOT SOAP server for further distribution to CARS.
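
To make the data path concrete, the sketch below shows what such a SOAP push might look like.  The endpoint URL, XML schema, and field names are hypothetical; the FOT’s actual message format is not reproduced in this report.

```python
import urllib.request

ENDPOINT = "https://example.mndot.state.mn.us/soap/acn"  # hypothetical URL

envelope = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <CrashNotification xmlns="urn:example:acn">  <!-- hypothetical schema -->
      <CallType>AACN</CallType>
      <Latitude>44.9780</Latitude>
      <Longitude>-93.2650</Longitude>
      <AirbagDeployed>true</AirbagDeployed>
    </CrashNotification>
  </soap:Body>
</soap:Envelope>"""

request = urllib.request.Request(
    ENDPOINT,
    data=envelope.encode("utf-8"),
    headers={
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": "urn:example:acn/submit",  # hypothetical action
    },
)
# response = urllib.request.urlopen(request)  # enable against a live endpoint
```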

 

Due to data availability, this evaluation examined only the performance of the first link of the data distribution, between OnStar and the SOAP server, as shown in Figure 4-3.  System performance is defined as the reliability and latency of the data transfer.  Unlike the voice routing evaluation, the data routing evaluation involved the transmission of actual OnStar crash data from throughout the state of Minnesota.

 

[Diagram of the data routing portion of the FOT: OnStar provides crash data to the MnDOT SOAP server, which further distributes them on CARS.  The evaluation focuses on the link between OnStar and the SOAP server.]

Figure 4-3.  Focus of Data Routing Performance Evaluation

Section 4.2.1 contains the results of the characterization of the data routing system and its estimated reliability and latency.  Section 4.2.2 summarizes the findings of the evaluation.

4.2.1   Analysis Results

The data routing function from OnStar to the SOAP server was characterized through basic summary statistics and was then analyzed for both reliability and latency.  Unlike the voice routing test, the data routing portion of the FOT transmitted data for actual OnStar crashes in Minnesota over the period of the FOT.

 

 

Characterizing the System Usage

Over the 41 weeks for which data were available, OnStar transmitted 1,297 crash records to the SOAP server.  Fifty of these records failed to be received and are discussed further below.  Of the 1,247 records that were successfully received, there were:

 

 

The total number of calls received varied from week to week.  After August 7, 2005, the number dropped considerably as OnStar stopped sending SOS calls to the SOAP server.  During the rest of the period (and ignoring the weeks for which no data are available), OnStar sent an average of 34 calls per week to the SOAP server.  The week with the most calls was August 1 – August 7, 2005, when 50 calls were delivered.  This is shown graphically in Figure 4-4.  The figure also shows the relative number of calls of each successful type (AACN, ACN, and SOS) as well as any failures.  As noted above, the sharp dropoff after week 63 corresponds to OnStar discontinuing transmission of SOS calls.

Reliability of Data Transmission

In two weeks during the period, there were instances of failed transmission of crash data.  These consisted of:

 

April 18 – April 24, 2005:  14 failures

April 25 – May 1, 2005:  36 failures

 

The failures were due to a computer security key not being updated.  This prevented any files from being exchanged.

 

While these failures were explained and resolved, they illustrate the potential for failures in this or any data network system.  Over the 41 weeks, a total of 50 failures were observed out of 1,297 total OnStar calls (successes and failures).  This represents a failure rate of 3.9%.

 

[Bar chart showing the distribution and number of crash records transmitted from OnStar between October 11, 2004 and September 4, 2005.  Each bar represents one week of data and is broken into color-coded segments showing the number of each type of OnStar crash data call sent (SOS, ACN, and AACN), as well as crash records that failed to be received by the SOAP server.]

Figure 4-4.  FOT Data Routing Calls from OnStar to SOAP Server,
October 11, 2004 – September 4, 2005

Latency

In evaluating the latency of the transmission system, the objective is to determine an average delivery time per call.  Even though the delivery times for individual call records are not available, the overall average delivery time per call can be determined as:

 

$$\bar{X} = \frac{\sum_{i=1}^{n} c_i \, \bar{x}_i}{\sum_{i=1}^{n} c_i}$$

where

$i$ is the week number,

$n$ is the total number of weeks (= 41),

$c_i$ is the number of successful calls (AACN + ACN + SOS) received in week $i$, and

$\bar{x}_i$ is the average delivery time for the calls in week $i$.

 

This calculation may not perfectly match the value obtained with individual call records due to rounding issues.  However, it is expected to be close.
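
A short worked example of this weighted average follows, using hypothetical weekly figures (the raw weekly data behind the report’s result are not reproduced here).

```python
# Hypothetical weekly data for illustration only.
weekly_calls = [50, 34, 28]           # c_i: successful calls in each week
weekly_avg_seconds = [0.8, 0.9, 1.1]  # x-bar_i: average delivery time per week

total_time = sum(c * x for c, x in zip(weekly_calls, weekly_avg_seconds))
mean_latency = total_time / sum(weekly_calls)  # weighted by calls, not by weeks
print(f"overall average delivery time per call: {mean_latency:.2f} s")  # 0.91 s
```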

 

The mean latency is 0.9 seconds.

 

It is concluded that the first stage of the data routing functionality within the FOT demonstrated high reliability (96.1%) in delivering OnStar records to the SOAP server.  Aside from a few anomalous cases, these records were delivered in an average of less than one second.

 

Many other factors may be considered in evaluating the data routing function.  This evaluation did not have a means for independently verifying that the tabulated calls were the only ones sent by OnStar or for independently validating the accuracy of the data elements transferred.  It is assumed that these summaries were correctly prepared by OnStar.  This evaluation also did not have the raw transmission time for each call, so the variability in transmission time could not be analyzed.  However, the very low average transfer times likely make any issues of transmission time variance insignificant on the scale observed (i.e., less than one second).  More importantly, this evaluation did not have a means to evaluate the full time interval of relaying the OnStar data to the CARS database, a true latency measure from an end user’s standpoint.

4.2.2   Summary of Findings

4.3       User Acceptance and Deployment Issues Evaluation Results

This section documents the findings from the interviews with the key stakeholders of the FOT, including, for voice routing, PSAP operators, call takers, and OnStar; and, for data routing, emergency responders and MnDOT traffic operations managers who share the OnStar ACN data via CARS.  In addition, deployment issues and broader implications were gathered through interviews with the FOT team, the Minnesota 9-1-1 program manager, and the NENA technical issues director.

4.3.1   PSAP, Medical Responder, 9-1-1 Program Manager, Traffic Operator Feedback

            4.3.1.1            Interview Results

The PSAP interviews were structured around a set of standard questions, consistent with the intent, scope, and goals of the FOT, and addressed the following subject areas:

 

 

The Mayo Clinic and Mayo Medical Transport, as emergency medical response and treatment service entities, currently have access to real-time ACN data shared through MnDOT’s CARS.  Interviews with selected Mayo Clinic representatives were conducted in conjunction with the process outlined above and addressed the following subject areas:

 

 

Incident data also benefit MnDOT and traffic management operations.  To that end, interviews were conducted with individuals involved in processing ACN/AACN-related CARS data for the purpose of coordinated traffic incident management.  Topics included the following:

 

 

Five primary PSAPs were selected for post-test interviews: the counties of Kandiyohi, Olmsted, and Anoka, and the cities of Burnsville and Minneapolis.  These PSAPs provided a good mix of PSAP environments in terms of 9-1-1 service provider (i.e., Qwest and IES), service environment (i.e., urban, suburban, rural), and PSAP size.  The Mayo Clinic was involved both as a secondary PSAP (emergency response) and as a CARS data user, as was the Roseville Regional Transportation Management Center (RTMC).  A majority of the above interviews took place during the week of September 19, 2005 and involved five coordinated meetings.

 

In many locations around the country, 9-1-1 services are funded, coordinated, and/or facilitated by cognizant 9-1-1 authorities or agencies specifically established for that purpose.  Such authorities range from regional, multi-jurisdictional coordinating bodies to state agencies or functions.  That is true in Minnesota as well; two such organizations were specifically involved in this FOT and interviewed: the Minnesota Statewide 9-1-1 Program (Department of Public Safety) and the Minneapolis/St. Paul Metropolitan Emergency Services Board.

 

Interview questions for the above organizations covered the following subject areas:

 

            4.3.1.2            Summary of Findings

The results of the interviews were fairly consistent among all the PSAPs visited and interviewed.  Following is a summary of those interviews and the agencies’ observations and experiences with the FOT:

 

 

[Photographs of computer equipment and workstations taken at selected PSAPs during the evaluation interviews.]

Figure 4-5.  Selected PSAPs where Interviews Were Conducted

 

[Photographs taken at the Minnesota Department of Transportation’s Regional Transportation Management Center in Roseville, Minnesota during the evaluation interviews.]

Figure 4-6.  Use of ACN Data in CARS by MnDOT RTMC

 

4.3.2   Telematics Service Provider Feedback

            4.3.2.1            Interview Results

OnStar customer service and corporate representatives were interviewed on October 19, 2005 at OnStar headquarters in Detroit, Michigan.  That interview was structured around the following areas:

 

Emergency Advisors

 

 

Corporate Representatives (representing corporate planning, industry affairs and service development)

 

            4.3.2.2            Summary of Findings

OnStar, as the telematics service provider involved in the FOT, has a long history of servicing customer emergency calls and interacting with the 9-1-1 community.  Emergency Advisors receive special training in processing such calls and, specifically, in interfacing with 9-1-1 call takers.  The following is a summary of the interview meeting and the observations and FOT experiences shared:

 

4.3.3   Implications for NG9-1-1

            4.3.3.1            Third Party Technical Review

In accordance with the evaluation plan, the FOT concept, technical design, and results of the field test were reviewed with Mr. Roger Hixson, NENA’s Technical Issues Director.  The interview specifically related to NENA’s work in the area of Next Generation E9-1-1 systems, recognizing that the latter will ultimately generate or facilitate enhanced solutions for the objective of this FOT. 

 

Mr. Hixson observed that, while the design involved would require new TSPECRS-specific functionality in the wireless network, a potentially challenging matter in light of evolving wireless networks, the concept and solution being tested by the FOT represented a “viable interim method” to automatically route emergency calls generated by customers through telematics call centers to “native Enhanced 9-1-1 systems in the USA.”  Ultimately, Mr. Hixson indicated, IP-based NG9-1-1 systems are likely to offer “more effective and seamless solutions for ACN calls and basic data delivery, and subsequent PSAP access to supportive data from Telematics provider databases.”

            4.3.3.2            Summary of Findings

The following summarizes the NENA observations provided above:

 

            4.3.3.3            Project Team Observations

In accordance with the FOT Evaluation Plan, a post-trial project team interview was conducted on November 10, 2005.  The intent of the interview was to give all team members an opportunity to comment on the results of the FOT and to solicit lessons learned and observations about the potential for applying the FOT concept in real-world environments.  Much of that discussion is also reflected in the FOT project reports [Ref 2, 3].

            4.3.3.4            Summary of Findings


28 ANI is the technical term for callback number or caller ID.

29 In this case, call location information in LAT/LON.

30 Indication of a wireless 9-1-1 call made from a TSP.

31 Full deployment requires all wireless service providers to implement the E9-1-1 Phase 2 solutions.
