ITS - Intelligent Transportation Systems Report

3.0    Evaluation Approach

3.1       Evaluation Objectives and Scope

The objective of the FOT was to demonstrate an innovative solution for automated routing of Telematics-originated emergency calls to an appropriate PSAP as a native 9-1-1 call, and the real-time sharing of the Telematics crash data with various emergency response and management entities via MnDOT’s CARS system. 

 

The basic objectives of this independent evaluation were:

 

 

It is important to note that the evaluation study takes into consideration the known constraints of the FOT.  Such constraints include:

 

 

In assessing the attainment of the evaluation objectives, one must consider the limitations imposed by the above constraints.

 

The scope of this evaluation, based on discussions with the FOT team and NHTSA, was determined to be:

 

 

The evaluation approach discussed in this section is organized with respect to the quantitative analysis of system performance (Section 3.2) and qualitative evaluation of user acceptance and deployment issues (Section 3.3).

3.2       System Performance Evaluation Approach

Compared to the existing TSP emergency call procedure, the FOT voice routing and data sharing procedure incorporates two important differences:

 

 

The approaches for evaluating the voice and data routing are described in Sections 3.2.1 and 3.2.2, respectively.

3.2.1   Voice Routing System Acceptance Testing

The voice routing is a core functional element of the proposed FOT technologies; it allows emergency calls initiated by OnStar customers to be routed via an OnStar Emergency Advisor and delivered to the appropriate PSAP as a native 9-1-1 call.  With OnStar's inherent ability to locate a calling customer, the FOT voice routing technology was designed to take advantage of the existing wireless 9-1-1 call delivery infrastructure and to improve or enhance the processing of telematics-based emergency events.

 

The purpose of acceptance testing was to establish that the system for routing a voice call from an OnStar Emergency Advisor through to the appropriate PSAP operates reliably and accurately.  The approach for conducting the acceptance test was to perform a field evaluation in which wireless OnStar calls were placed in the areas served by 22 PSAPs22 throughout the state of Minnesota.  For each call, data were collected regarding the success or failure of the voice routing system in establishing a connection with the appropriate PSAP through the 9-1-1 trunk line and in accurately transmitting data (e.g., latitude/longitude, call back number, and available ACN/AACN data) related to the original call.  Each sample call was ultimately determined to be a success or a failure relative to the system requirements.

            3.2.1.1            Description of Field Acceptance Test

The field test applied a statistical acceptance sampling approach to the evaluation of the FOT voice routing functionality so that one of two ultimate conclusions could be reached:

 

1. The system passes final validation testing.  The statistical interpretation is that there is insufficient evidence to reject an initial assertion that the overall system failure rate is less than one percent.  The sample size and acceptance criteria are such that this conclusion is protected from error with 94 percent confidence that the true system failure rate does not exceed four percent.

 

2.  The system fails the final validation test.  The statistical interpretation is that there is sufficient evidence (with 94 percent confidence) to conclude that the true system failure rate is greater than one percent.

 

The selection of a four percent failure rate against which to conclude system passing and a one percent failure rate against which to conclude system failure were decisions of the client.  These are reasonable levels of performance for this type of system.  Note that the 94 percent confidence levels are slightly lower than the general statistical standard of 95 percent.23  This is a result of the discrete (versus continuous) nature of counting numbers of successes and failures.

 

Secondary objectives were to assess whether ACN/AACN data were correctly transferred in those PSAP areas where this functionality is enabled and whether call data would be successfully routed to the Mayo Clinic in those areas where this capability is routinely required.

 

The selected acceptance sampling design utilized a sequential approach.  Under this approach, a total planned sample size of 294 calls is split into two sampling stages of 147 calls.  At the conclusion of the first stage, the total system errors are tabulated and one of three conclusions is reached:

 

1.  If zero or one errors are found, the system is concluded to pass the acceptance test.

 

2.  If two to four errors are found, a conclusion of acceptance or failure is postponed until another stage of 147 calls is completed.  After completion of the second stage, the total errors (first stage plus second stage) are tabulated and if the number is six or less, the system passes acceptance.  If the errors total seven or more, the system fails acceptance.

 

3.  If five or more errors are found, the system is concluded to fail the acceptance test.

 

This sequential acceptance sampling plan is beneficial because, for a given total sample size, it produces close to the same statistical properties (i.e., confidence levels for correctly concluding the system passes or fails acceptance given true population error rates) of a single stage acceptance sampling plan but with a possibility of reaching an early conclusion.  In this case, a conclusion of passing acceptance can be reached in one half the samples compared to a single stage acceptance sampling plan.  The statistical derivation related to this sampling plan is provided in Appendix A.
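The statistical properties of this two-stage plan can be checked numerically.  The following sketch is purely illustrative (the formal derivation is in Appendix A, not reproduced here): it computes, from exact binomial probabilities, the chance that the sequential plan described above concludes "pass" at a given true failure rate.

```python
from math import comb

def binom_pmf(k: int, n: int, p: float) -> float:
    """Exact binomial probability of observing k failures in n calls."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def p_accept(p: float, n: int = 147) -> float:
    """Probability the sequential plan concludes 'pass' at true failure rate p.

    Stage 1 (147 calls): pass on 0-1 errors, fail on 5 or more, else continue.
    Stage 2 (147 more calls): pass if total errors across both stages are <= 6.
    """
    # Immediate pass after stage 1 (0 or 1 errors).
    prob = sum(binom_pmf(k, n, p) for k in range(2))
    # Stage-1 counts of 2-4 errors trigger a second stage; the plan then
    # passes only if stage-2 errors do not exceed 6 - k1.
    for k1 in range(2, 5):
        stage2 = sum(binom_pmf(k2, n, p) for k2 in range(6 - k1 + 1))
        prob += binom_pmf(k1, n, p) * stage2
    return prob

# A low true failure rate should almost always pass; a high one rarely should.
print(f"P(pass | p = 1%) = {p_accept(0.01):.3f}")
print(f"P(pass | p = 4%) = {p_accept(0.04):.3f}")
```

Evaluating the acceptance probability at the two design points (one percent and four percent true failure rates) shows the asymmetry the text describes: near-certain passage at the low rate, and a small residual acceptance chance at the high rate consistent with the approximately 94 percent confidence statement.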

 

The overall acceptance test design for this evaluation also included provisions for retesting if the system failed its initial acceptance test.  However, these options did not prove necessary as the system passed acceptance on its initial sample.  More detailed discussion of the acceptance testing design is provided in the Detailed Evaluation Test Plan [Ref 5]. 

 

 

Field Testing

The acceptance test calls were carried out by MnDOT staff in the 22 FOT PSAPs over a ten-day period in August of 2005.  In advance of the test, Battelle provided MnDOT with a sampling schedule that listed each of the PSAPs and a corresponding required number of sample calls.  This initial sample selection assigned as nearly equal a sample size24 as possible to each of the 22 PSAPs.  Upon receiving the required numbers of samples, MnDOT staff identified sample locations, which were selected with some effort to provide geographic spread within the respective PSAP areas. 

 

On each sampling day, MnDOT staff proceeded to each planned location with a laptop computer (to simulate ACN/AACN data) and a portable OnStar test unit.  These are shown in Figure 3-1.

 

Figure 3-1. Portable OnStar Test Unit in Support of Field Test. Graphic with two photographs showing the voice routing field test equipment on a test vehicle.  The interior equipment includes a portable OnStar unit and a laptop computer for simulating OnStar calls from an instrumented vehicle. A cellular antenna is mounted on the exterior of the test vehicle.


The MnDOT staff placed a call to the OnStar call center.  OnStar established a three-way call with the PSAP of the originating call location as a wireless 9-1-1 call.  This was done in the FOT through a laboratory environment (Telcordia) simulating a call in a wireless service provider network.  The PSAPs were alerted that this was a test call so that they could defer it if there were any true 9-1-1 calls coming through at the same time.  The PSAP was then asked to identify its location and what data it had received on the incoming test call.  Figure 3-2 below provides a visual example of the data coming in to the PSAPs.  This specific image is unique to PSAPs with IES as the service provider.

 

For each sample call, the MnDOT staff recorded the following information on a data collection form:

 

 

Figure 3-2. Sample PSAP Incoming Call Screen (IES). Screen captures showing two examples of the computer interfaces in PSAP which display the call back number and call location information transmitted along a FOT test call and the additional crash data transmitted along with the FOT call, a feature available in IES supported PSAPs.


After completing each call, the MnDOT staff recorded the required information on a data collection form and then moved to the next sample location.  A sample data collection form is shown in Figure 3-3 below.

 

While there were no requirements in the acceptance test plan related to the sample time of day, MnDOT staff attempted to perform the test calls over a range of daytime hours.  Calling after normal working hours was not operationally possible.  Also, calling during morning or evening rush hours was deemed inappropriate since it had higher potential to interfere with true 911 calls due to the higher expected load.

 

Some issues unrelated to the FOT solution prevented successful calls from being made.  In these cases, either additional calls were made from the same location or alternative locations were chosen.  A discussion of those non-counted calls is provided in Section 4.1.1-Test Results. 

 

Figure 3-3. Example Data Collection Form from FOT Acceptance Sampling. The example shows the data collection form used in conducting the voice routing field test.


Two special evaluations were conducted as part of the acceptance sampling.

 

Three PSAPs with IES as the 9-1-1 service provider (Kandiyohi, Renville, and McLeod) had the capability to receive ACN and AACN data with each call.  For these areas, the PSAP was also asked to read back the simulated AACN data that was transmitted to them when OnStar relayed the call to them.  These data included:

 

 

For this evaluation, a set of ten different simulated AACN data sets was produced.  Each of the ten was used at least once in the testing to simulate an emergency call from an OnStar-equipped vehicle.  These data sets were stored on the laptop computer as part of the portable test unit shown in Figure 3-1.  The ten AACN data sets are shown in Table 3-1 below.

Table 3-1.  Simulated AACN Data Sent as Part of Acceptance Sampling

Data Category                         Simulated AACN Data Sets (1 through 10, in order)

Delta V                               17, 32, 10, 32, 22, 22, 32, 20, 20, 20
Principal Direction of Force (PDOF)   -94, -91, 176, -91, 88, -88, -91, 180, 149, 0
Multiple Impacts                      yes, yes, yes, yes, yes, yes, yes, yes, yes, yes
Rollover                              no, no, no, no, no, no, no, no, no, no
Driver Side Airbag Deployed           no, no, no, yes, no, no, no, no, no, no
Driver Front Airbag Deployed          near deployed, blank, blank, blank, blank, near deployed, near deployed, blank, blank, blank
Passenger Side Airbag Deployed        no, no, no, yes, no, no, no, no, no, no
Passenger Front Airbag Deployed       near deployed, blank, blank, blank, blank, near deployed, near deployed, blank, blank, blank

The PSAPs with capability to receive the AACN data had a separate application on their ALI.  This is shown in the bottom right hand section of Figure 3-2 above.  Note that Position, VIN, and Case ID are fields that were not evaluated.25

 

The Olmsted, Mower, and Steele PSAPs use the Mayo Clinic as a secondary PSAP and emergency responding entity (EMS).  Therefore, in these three areas, sample calls were routed on to the Mayo Clinic after completing the sample call to the PSAP.  Whether the Mayo Clinic received the correct call back number and latitude and longitude for the original call were separately recorded for these sample calls.

 

Although they were examined, successful ACN/AACN data transfer and Mayo Clinic call forwarding were not criteria in the formal pass/fail determination for acceptance testing of the FOT solution.

 

At the conclusion of each test day, the data collection forms were scanned into PDF files and sent to Battelle.  In parallel to these sampling activities, Telcordia (who provided simulated wireless service provider network functions) archived the recorded date and time of each test call along with the transmitted latitude and longitude data sent from the OnStar unit.  These call records were also electronically sent to Battelle.

 

Battelle conducted a detailed audit by comparing the information on the data collection forms with the electronic logs provided by Telcordia.  Discrepancies between the two records were discussed and resolved with the FOT team.  After this QA review, all sample call records were entered into an Access database.  The acceptance data are included as Appendix B.
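The kind of record matching involved in such an audit can be sketched as follows.  This is an illustration only: the field names, call identifiers, and coordinate tolerance are assumptions, not the actual FOT record layout.

```python
# Hypothetical form entries and network log, keyed by an assumed call ID.
form_records = [
    {"call_id": 1, "lat": 44.95, "lon": -93.10},
    {"call_id": 2, "lat": 45.12, "lon": -94.02},
]
telcordia_log = {
    1: {"lat": 44.95, "lon": -93.10},
    2: {"lat": 45.13, "lon": -94.02},  # disagrees with the form entry
}

def find_discrepancies(forms, log, tol=0.001):
    """Flag calls whose form coordinates disagree with the electronic log."""
    issues = []
    for rec in forms:
        logged = log.get(rec["call_id"])
        if logged is None:
            issues.append((rec["call_id"], "missing from log"))
        elif (abs(logged["lat"] - rec["lat"]) > tol
              or abs(logged["lon"] - rec["lon"]) > tol):
            issues.append((rec["call_id"], "position mismatch"))
    return issues

print(find_discrepancies(form_records, telcordia_log))
```

Each flagged pair would then be resolved manually with the FOT team, as the text describes, before the record enters the database.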

            3.2.1.2            Evaluation Hypotheses and Measures of Performance

Table 3-2 provides a summary of hypotheses, measures of performance (MOP), and data sources to be examined in the voice routing system testing.

Table 3-2.  Hypotheses, Measures of Performance, Data Sources for Voice Routing Test

Area of Interest:  Voice Routing System Reliability

Hypotheses:  In response to an OnStar call,
  • The OnStar Customer Service Representative can establish a 3-way call to the 911 system which goes to the correct PSAP considering the call location.
  • The 911 system at the responding PSAP will receive accurate transmission of LAT/LON, OnStar unit call back number, and additional ACN/AACN data.

MOPs:
  • (Acceptance achieved) Estimated maximum system failure rate and confidence level
  • (Acceptance not achieved) Estimated minimum system failure rate and confidence level

Data Sources:
  • Acceptance sampling results from simulated test calls

            3.2.1.3            Data Collection

Because of liability concerns and the logistical difficulties of sampling live OnStar 9-1-1 calls, the sample design utilized simulated OnStar calls placed in the field to evaluate the system.  The following test protocols were followed:

 

 

Upon completion, the tester moved to the next specified location and repeated the above procedures.  At the end of the day, after properly recording and checking data, the field tester transmitted the data entry forms to Battelle.

            3.2.1.4            Analysis

The acceptance test plan was structured to incorporate the following concepts:

 

 

Determination of the tolerable rates for false acceptance and false rejection was made in consultation with the MnDOT FOT team and NHTSA.  Those rates were partially determined by the maximum number of sample calls that could be made due to budget or time constraints.  A consensus was reached with the FOT team and NHTSA that no more than 500 calls could ultimately be placed during the acceptance sampling test.

 

From these broad requirements, an acceptance sampling plan26 was developed that consisted of two phases:

 

 

Once the sample calls had been placed and data were collected for each one, the sample data collection forms were returned to Battelle.  Battelle then categorized each call as a success or failure with regard to system reliability and data integrity. 

 

Factors outside the control of the FOT were not considered in determining success or failure.  For instance, a call that was not completed due to lack of cell service was not evaluated as either a success or a failure and was omitted from the results.  These non-counted calls were discussed in the evaluation results to provide potentially useful information about the FOT.
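The bookkeeping just described reduces to a simple three-way classification.  The sketch below is purely illustrative; the category names and the boolean inputs are simplifications assumed for this example, not the actual data collection fields.

```python
def classify_call(connected_to_correct_psap: bool,
                  data_accurate: bool,
                  external_failure: bool) -> str:
    """Classify a sample call per the rules described above.

    Calls that failed for reasons outside the FOT's control (e.g., no cell
    coverage at the sample location) are excluded from the acceptance tally
    entirely rather than counted as failures.
    """
    if external_failure:
        return "non-counted"
    if connected_to_correct_psap and data_accurate:
        return "success"
    return "failure"
```

Only "success" and "failure" outcomes enter the acceptance sampling totals; "non-counted" calls are reported separately, as Section 4.1.1 discusses.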

3.2.2   Data Routing System Performance Analyses

As an ancillary objective of the FOT, live OnStar crash data in the state of Minnesota were automatically transmitted to the MnDOT SOAP server.  From there, they were available to CARS and other data servers.  The evaluation of the data routing system focused on reliability and latency of the data transfer between OnStar and the SOAP server.

            3.2.2.1            Evaluation Hypotheses and Measures of Performance

Table 3-3 describes hypotheses, measures of performance (MOP), and data sources to be examined in the data routing system performance testing.

Table 3-3.  Hypotheses, Measures of Performance, and Data Sources for Data Routing Test

Area of Interest:  Data Routing System Reliability

Hypothesis:  All OnStar incident data designed to be sent to the MnDOT SOAP server are sent reliably and accurately, and in a timely fashion.

MOPs:
  • Estimated proportion of OnStar data records accurately received by SOAP server
  • Estimated time to transfer data

Data Sources:
  • Weekly summary report produced by OnStar of calls sent by type (SOS, ACN, AACN), number of failures, and average transmission time

            3.2.2.2            Data Collection

During the FOT period, OnStar produced a weekly report tabulating the number of calls routed to the MnDOT SOAP server by type (e.g., SOS, ACN, and AACN) with counts of failed transmission records and the average transmission time.  These weekly reports were transmitted to MnDOT and were aggregated into a single spreadsheet covering the weeks of the FOT.  These data were provided to Battelle in support of the evaluation.

            3.2.2.3        Analysis of Data Transmission Logs

The data routing system was first activated on May 19, 2004.  The Field Operational Test period for the functionality began October 15, 2004 and concluded on September 1, 2005.  The basis for the evaluation of the data routing system performance is an Excel spreadsheet provided by MnDOT that summarized weekly data transmission performance over the period from May 25, 2004 to September 18, 2005.  To match the operational test period, this evaluation only covers the 47 weeks starting with October 11, 2004 and concluding with September 4, 2005.  Note that the reporting of data on a weekly basis prevented an exact match with the operational test period.  Within this 47 week period, six of the weeks are omitted because MnDOT had no record of receiving the automated e-mail containing weekly summary from OnStar.  This evaluation uses all available data for the remaining 41 weeks of the operational test period.  The data are shown for reference in Appendix C.

 

For each week in the period, the total numbers of Minnesota calls sent by OnStar and received by the SOAP server of types AACN, ACN, and SOS were tabulated, followed by the number of failures.  SOS calls are those for which an OnStar user pushed the emergency button and the OnStar Emergency Advisor established a three-way call with a PSAP.  ACN and AACN calls are those made automatically by the OnStar device when it detects a collision.  These calls transmit vehicle identification and location data as well as airbag deployment information.  AACN calls include additional fields such as Delta Velocity, Principal Direction of Force, and Rollover status.  Latency was recorded as the average delivery time in seconds for the transmitted data.
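Aggregating the weekly summaries into overall figures can be sketched as below.  The field names and sample values are assumptions for illustration; the actual OnStar report layout is not reproduced here.

```python
# Illustrative weekly summaries in the shape described above.
weekly_reports = [
    {"week": "2004-10-11", "sos": 120, "acn": 8, "aacn": 3,
     "failures": 1, "avg_latency_s": 4.2},
    {"week": "2004-10-18", "sos": 95, "acn": 5, "aacn": 2,
     "failures": 0, "avg_latency_s": 3.8},
]

def summarize(reports):
    """Overall failure rate and call-weighted average latency across weeks."""
    total_calls = sum(r["sos"] + r["acn"] + r["aacn"] for r in reports)
    total_failures = sum(r["failures"] for r in reports)
    # Weight each week's average latency by that week's call volume.
    weighted_latency = sum(
        (r["sos"] + r["acn"] + r["aacn"]) * r["avg_latency_s"]
        for r in reports
    ) / total_calls
    return total_failures / total_calls, weighted_latency

rate, latency = summarize(weekly_reports)
print(f"failure rate = {rate:.4f}, weighted avg latency = {latency:.2f} s")
```

Weighting latency by weekly call volume avoids letting a low-traffic week distort the period-wide average.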

 

The following analyses were conducted:

 

3.3       User Acceptance and Deployment Issues Evaluation Approach

3.3.1   Objectives

The objective of this portion of the evaluation is to document the institutional and technical challenges of the FOT, with an emphasis on the applicability of the data and voice routing solutions to broader, more ubiquitous applications, including, but not limited to, public safety, emergency response, transportation management, and related activities.  Complementary to the quantitative assessment of system performance discussed in Section 3.2, the perceptions of the FOT users provided valuable insights for gauging the success of the FOT.

 

The users assessed in this evaluation include PSAP operators, call takers and OnStar Emergency Advisors who potentially benefit from the voice routing feature of the FOT system; and emergency responders and traffic operation managers who might benefit from the data routing system that shares the OnStar ACN data via the CARS.  Acceptance and satisfaction with the FOT by these users were examined through personal interviews or small group discussions.

 

Equally important, this evaluation identifies and documents technical and operational issues that must be addressed in support of future deployment of a similar system.  Such issues are examined in light of the following basic FOT objectives.

 

3.3.2   Evaluation Hypotheses and Measures of Performance

Hypotheses as specified in Table 3-4 guided the analysis and were tested with the data and observations obtained in the user acceptance and the deployment issues evaluations.

 

3.3.3   Data Collection

            3.3.3.1            User Acceptance

Acceptance of the FOT application by the user group community was assessed through structured interviews with selected representatives of each user group, as described below.

9-1-1 Call Delivery and Processing

 

OnStar

Selected OnStar Customer Service and corporate representatives were interviewed.  Access to and the selection of those interviewees were closely coordinated with appropriate OnStar corporate representatives, as were the timing and location of the interviews involved.  The interviews were structured around a set of standard questions (as presented in Evaluation Findings, Section 4.0), consistent with the intent, scope and goals of the FOT.

 


Table 3-4.  Hypotheses, Measures of Performance, Data Sources for User Acceptance and Deployment Issues Evaluation

Area of Interest:  User Acceptance (OnStar)

Hypotheses:
  • OnStar Emergency Advisors perceive the process and application to benefit TSP call processing.
  • OnStar Management Representatives perceive the process and application to generally benefit corporate service goals.
  • OnStar prefers this solution to current practice.
  • OnStar believes this solution facilitates interaction with PSAPs.
  • Automatic communication of location (and, thus, routing of the call) is inherently less error-prone.
  • The total time of call processing is perceptibly reduced.

MOPs:
  • Perceptions of operational benefits in facilitation of call processing and timeliness, accuracy of call routing, and PSAP interaction.
  • Reduced system maintenance by eliminating the need to update PSAP administrative numbers.
  • Reported technical and operational issues associated with the FOT test.

Data Sources:
  • Structured interviews with OnStar Emergency Advisors.
  • Structured interviews with OnStar Corporate representatives.

Area of Interest:  User Acceptance (PSAP)

Hypotheses:
  • PSAP call-takers perceive the FOT application to be beneficial.
  • PSAP call-takers prefer this solution to current practice.
  • PSAP call-takers believe this solution facilitates interaction with the OnStar Call Center.
  • Automatic communication of location (and, thus, routing of the call) is inherently less error-prone.
  • The total time of call processing is perceptibly reduced.
  • PSAP Authority (cognizant 9-1-1 entity) perceives the process and application to be beneficial.
  • State 9-1-1 Point-of-Contact perceives the process and application to be beneficial.

MOPs:
  • Perceived benefits by PSAP and other 9-1-1 representatives, including the facilitation of call processing and timeliness, reliability of call routing, and OnStar interaction.
  • Reported technical and operational issues associated with the FOT test.
  • Depending upon available PSAP Management and Information System (MIS) capabilities and records, comparative test call duration with historical precedent (i.e., dialing into the PSAP administrative line).

Data Sources:
  • Structured interviews with PSAP call-takers and other 9-1-1 representatives.
  • PSAP MIS data, as available and appropriate.

Area of Interest:  User Acceptance (Medical Responders)

Hypotheses:
  • Medical responders (Mayo Clinic) perceive the ACN data to be beneficial.
  • Medical responders (Mayo Clinic) are more informed with ACN data in response to traffic-related incidents.
  • The ACN data are provided in a timely manner via CARS.
  • The vehicle location representation on CARS is accurate and useful.

MOPs:
  • Medical responders' (Mayo Clinic) perception of benefits, including the facilitation of incident information sharing and access.
  • Perceived better decision making due to crash information provided by ACN.
  • Medical responders' (Mayo Clinic) reported technical and operational issues associated with the use of ACN data.
  • Medical responders' (Mayo Clinic) stated willingness to promote and accommodate nationwide deployment of the FOT solution.

Data Sources:
  • Selected structured interviews with appropriate Mayo Clinic representatives.

Area of Interest:  User Acceptance (State Traffic Operations)

Hypotheses:
  • State traffic operation users perceive the ACN data to be beneficial.
  • Such users desire permanent deployment of the application.
  • The state traffic operation is more informed with ACN data in response to and management of traffic-related incidents.
  • The ACN data are provided in a timely manner via CARS.
  • The vehicle location representation on CARS is accurate and useful.

MOPs:
  • State traffic operation users' perception of benefits, including the facilitation of incident information sharing and access.
  • Reported technical and operational issues associated with the use of ACN data.
  • Willingness to promote and accommodate nationwide deployment of this or a similar solution.

Data Sources:
  • Selected structured interviews with appropriate State Traffic Operation users' representatives.

Area of Interest:  Expandability beyond MN

Hypothesis:
  • OnStar believes that the benefits of the FOT solution warrant its application beyond Minnesota.

MOPs:
  • Stated willingness to promote nationwide deployment of the FOT solution.
  • Technical and operational issues involving national deployment.

Data Sources:
  • Selected structured interviews with user community representatives.

Area of Interest:  Expandability (Multiple TSPs)

Hypothesis:
  • FOT public safety users believe that the benefits of the FOT solution warrant its application to other Telematics Service Providers.

MOPs:
  • Perceptions of benefit ubiquity and application to other service providers.

Data Sources:
  • Selected structured interviews with appropriate OnStar representatives.

Area of Interest:  NG9-1-1

Hypothesis:
  • The FOT solution is consistent with standards and related work currently being conducted on NG9-1-1 migration by NENA.

MOPs:
  • Consistency with NENA Future Path Plan, guidelines, and standards.

Data Sources:
  • Interviews with appropriate NENA representatives.

PSAPs and 9-1-1 Authorities

Selected PSAP 9-1-1 call-takers were interviewed.  These interviews were coordinated with appropriate PSAP management and 9-1-1 authority points-of-contact (POCs).  Again, the interviews were structured around a set of standard questions, consistent with the intent, scope and goals of the FOT, and were facilitated by MnDOT, as well as the POCs involved (state and local).  Interviews included PSAPs and 9-1-1 authorities reflecting an appropriate cross-section of the larger PSAP community involved in the FOT. 

 

Five PSAPs were selected:  Meeker County (IES), Olmsted County (Qwest), Anoka County (Qwest), City of Burnsville (Qwest), and the City of Minneapolis (Qwest).  The Mayo Clinic was involved both as a secondary PSAP and as an ACN data user (emergency response).  IES and Qwest27 are the two companies that provide 9-1-1 trunk lines in the state of Minnesota.  These PSAPs provide a good mix of PSAP environments, spanning both 9-1-1 service providers (i.e., Qwest, IES), a range of service environments (i.e., urban, suburban, rural), and a range of PSAP sizes.

 

In addition, representatives of local and state 9-1-1 authorities were interviewed regarding their operational responsibilities and involvement with the FOT.

Information Sharing and Data Users

Mayo Clinic as a Responder

The Mayo Clinic and the Mayo Medical Transport, as emergency medical responding and treatment service entities, currently have access to real-time ACN data shared through MnDOT’s CARS.  Interviews with selected Mayo Clinic representatives were conducted and structured around both the nature of the data access, and the utility of the information.  Questions of utility emphasized both transport and treatment.  The scheduling of those interviews was facilitated by MnDOT.

 

Figure 3-4. Twenty-Two PSAPs Participated in FOT.   County-level map of the state of Minnesota showing the names and locations of the 22 PSAPs that participated in the FOT.


 

MnDOT Traffic Operation Center

Any additional incident data are important to MnDOT in support of traffic management operations.  ACN or AACN data provide additional information for gauging the nature and severity of an accident, with which MnDOT could make informed decisions in diverting traffic or advising drivers of the delay using available traffic management resources (e.g., Dynamic Message Signs, web site, or 511).  To that end, interviews were conducted with MnDOT Traffic Operation Center personnel involved in consuming ACN/AACN-related CARS data for the purpose of traffic and incident management.  Questions were structured around the history of that use (to the extent that such data have been available), the nature of its access, and its utility to incident management.  Suggestions for enhancement were solicited as well.  The interview was conducted at the Roseville Regional Transportation Management Center (RTMC), a MnDOT facility specifically designed to support traffic and incident management, which combines maintenance dispatch, traffic security and operations, and is co-located with state public safety.

            3.3.3.2            Deployment Issues

The evaluation team reviewed the results of the test with key members of the FOT implementing team, and explored the opportunity and appropriateness of applying the FOT solution to other states and provider environments.  Interviews were conducted with technical contributors to the FOT solution, including both data and voice routing system designers and integrators.  Questions were structured around these results, and emphasized universal applicability in light of ongoing institutional and technical changes in the telecommunications industry.  The evaluation also reviewed the "simulation" features of the voice routing test and identified impacts that simulation may have had on test results.

 

The evaluation examined the consistency of the FOT approach with NG9-1-1 standards and operational requirements currently being developed by the National Emergency Number Association (NENA) and related organizations.  Future path planning for Next Generation 9-1-1 (NG9-1-1) infrastructure generally involves trusted IP-based networks, interconnected to provide interoperability, robustness, universality, and flexibility.  The delivery of 9-1-1 calls, along with the essential information and data both to support that delivery and to facilitate emergency response, will most likely occur over the same network infrastructure.  While the routing and delivery of any kind of 9-1-1 call will still occur over such a network, the way this is brought about will change.  The nature of NG9-1-1 network infrastructure may ultimately offer more effective pieces of the solution being examined, and this was explored in light of the "universality" question.

 

Specifically, reviews of the FOT solution were conducted jointly with NENA’s Technical Issues Director, appropriate NENA Technical Committee members and FOT implementing team members regarding this ancillary objective.

3.3.4   Analysis

            3.3.4.1            Assessment of User Acceptance

Interview data were analyzed to assess user perceptions about the effectiveness and utility of the FOT approach to accessing and processing telematics-related incident data and emergency requests.  Ultimately, user acceptance will depend upon user perception of three things:  whether the approach is believed to work, how well it is believed to work, and how universal users think it might be.  The analysis reconciled both the qualitative and quantitative results of the evaluation with these three questions.

 

For telematics service providers, successful implementation of the FOT voice routing or a similar solution will provide a quicker and more accurate way to forward an emergency 9-1-1-type call to the "correct" PSAP as a native 9-1-1 call.  Such an approach can save time, is more efficient and less costly (it obviates the need to maintain a separate PSAP access directory, for example), and ultimately provides better service to customers.  For PSAPs, such calls are delivered in a more timely way (a critical safety factor), are processed the same way other 9-1-1 calls are processed (i.e., over 9-1-1 trunks, so they receive the same priority), are more likely to be delivered to the correct PSAP, and arrive with a call back number and location information (i.e., LAT/LON).  Questions to both user groups helped validate these assumptions.

 

It is important to point out that the user acceptance analysis for the voice routing is likely to be limited by the “trial” nature of the FOT (i.e., OnStar 9-1-1 calls were simulated).  While that should not impact the basic question of the FOT (e.g., “does it work”), it will be a factor to consider in assessing “how well it works” in terms of operational improvement. 

 

Unlike the voice routing test, actual telematics incident data were made available in near real-time to various authorized users of MnDOT CARS.  The Mayo Clinic, for example, serving both secondary PSAP and first responder functions, is interested in telematics incident data for the sake of facilitating emergency response and treatment.  Questions regarding usefulness for positive patient outcomes and support of early response are most relevant.  MnDOT, on the other hand, might be more interested in better information in support of traffic management during incidents, and in faster incident clearance resulting from better coordination with response agencies (e.g., public safety, first responders).  Therefore, questions dealing with the utility of reported incident data (including, but not limited to, data format, timeliness, user interface, and accuracy) are important.

            3.3.4.2            Assessment of Deployment Issues

Feedback from the deployment team and third party technical experts was compiled and documented in support of decisions regarding the future deployment of potential FOT solutions.  The assessment of deployment issues addresses the same factors associated with deployment issues as are addressed for user acceptance:  Does the approach work?  How well does it work?  What is its potential for wider applicability?

 


22 County PSAPs include: Meeker, Renville, Kandiyohi, McLeod, Carver, Hennepin, Anoka, Washington, Dakota, Scott, Olmsted, Mower, and Steele Counties.  City PSAPs include: Minnetonka, Eden Prairie, Edina, Minneapolis, St. Paul, Eagan, Apple Valley, Burnsville, and Lakeville.

23 When conducting a statistical test with a binomial response (yes or no) versus a continuous response, the confidence bounds do not normally exactly meet the 95% that is “standard”.

24 Approximately 6 to 7 calls were allocated for each PSAP.

25 IES, one of the Minnesota 9-1-1 service providers, developed additional software to transmit ACN data to their serviced PSAPs via a separate network connection.  This feature is not part of the FOT scope. 

26 The sampling plan was approved as part of the FOT voice routing detailed test plan [Ref 5].

27 IES provides the 9-1-1 trunk line and maintains the selective router (SR).  Qwest provides the 9-1-1 trunk but selective routers are maintained by Intrado. 
