ITS Transit Standards Professional Capacity Building Program

Module 17: Accessible Transportation Technologies Research Initiative (ATTRI)

HTML of the Course Transcript

(Note: This document has been converted from the transcript to 508-compliant HTML. The formatting has been adjusted for 508 compliance, but all the original text content is included.)

Vincent Valdes: ITS Standards can facilitate the deployment of interoperable ITS systems, and make it easier to develop and deploy regionally integrated transportation systems. Transit standards have been developed by transit professionals like you at a national level to encourage competition and limit costs within our industry. However, these benefits can only be realized if you know how to write them into your specifications and test them. There are now a series of modules for public transportation providers that cover practical applications for promoting multi-modalism and interoperability in acquiring and testing standards-based ITS Transit systems.

Bruce Eisenhart: Welcome to Module 17. The Accessible Transportation Technologies Research Initiative, also known as ATTRI, is what I'm going to be talking about today.

Bruce Eisenhart: My name is Bruce Eisenhart. I'm the vice president for operations at Consensus Systems Technologies. I'm a systems engineer by background. I've been doing that for 40-plus years, and I've been working in ITS for 22 years. I started with the National ITS Architecture, which I've been working on since 1993, and I have worked on many, many regional architectures throughout the United States, which include transit. And I have worked a lot on ITS standards for traffic, transit, and connected vehicles. For the last 10 or 15 years, I've been working on transit-related projects and systems engineering, and also in the area of Enterprise Architecture.

Bruce Eisenhart: Today, we’re going to be talking about ATTRI. And there are three learning objectives to this module: Understanding the background, vision, objectives, of ATTRI; then discussing ATTRI’s focus technology areas; and finally, finishing with describing some foundational considerations of the effort. The key part is the application areas they’re coming up with and the applicable standards of those application areas.

Bruce Eisenhart: So the first learning objective is to look at the background, vision, and objectives of the ATTRI program.

Bruce Eisenhart: So what is ATTRI? Well, ATTRI is a U.S. DOT multimodal research and development effort. It's co-led by FHWA and FTA with support from the ITS Joint Program Office. And the key is that it's conducting research to improve the mobility of travelers with disabilities through the use of ITS and other advanced technologies. The goal is to solve door-to-door accessible transportation issues for persons with disabilities. It's hoping to maximize the benefits by having a coordinated federal investment looking at recent technology innovations and traveler-focused solutions.

Bruce Eisenhart: Who is ATTRI meant for? Well, persons with disabilities, as I've said once or twice, and this is actually a fairly large group of people in the United States. In the most recent census in 2010, 19 percent of the U.S. population was listed as having one of the four types of disabilities you see down in the chart: vision, mobility, hearing, or cognitive issues. And in terms of the number of people that fall into this category, older adults are a growing proportion of that total. As the baby boomers enter retirement, there's going to be an ever-increasing number of older folks in the country who are going to have issues with vision, mobility, and certainly cognitive and hearing—all four of those things. There are three specific targeted populations for ATTRI. The first is persons with disabilities in general, but more specifically, ATTRI is looking at issues related to veterans with disabilities and older adults. Within those four different types of disabilities—vision, mobility, hearing, and cognitive—there are really gradations of challenges. Someone who has been blind since birth, for example, has a very different level of mobility and independence than, say, a veteran returning from combat with a visual disability. So the idea is not to consider just one type of disability, but a whole wide range of disabilities that relate to veterans, to older adults, and obviously to people in wheelchairs, as you see in the diagram there.

Bruce Eisenhart: ATTRI is a research initiative. It has three phases. Phase 1 is called "Exploratory Research and Partnership Development," and that was concluded in 2015. A lot of what I'm talking about today is going to be the results of that Phase 1. Phase 2 is ongoing right now, and it is called "Application Selection and Prototyping," along with collaborations and partnerships. The idea is selecting a set of application areas to consider for ATTRI and then prototyping those efforts; that is ongoing at the present time. It's scheduled to run through 2017, so roughly 2015 to 2017 is Phase 2. Phase 3 will be integrated demonstrations and pilots, roughly 2017 to 2019 for that part of the ATTRI effort.

Bruce Eisenhart: The DOT, when they put together the ATTRI program, created this vision statement which says that “ATTRI seeks to remove barriers to transportation by leveraging advanced technology to enable people to travel more easily, affordably, and effectively, regardless of their individual abilities.” And you see a picture of some kind of a mobility module there and something showing the symbol for people in wheelchairs with some mobility aspects to it. So that’s the vision.

Bruce Eisenhart: As a part of this effort, they identified three objectives for the five-year effort. The first was to explore state-of-the-art technology solutions, looking at efforts both in the U.S. and in Europe; that part has been completed. The second was gathering stakeholder input on needs and solutions from users to incorporate into future efforts; that was largely done in 2015, and it produced a set of user needs. And out of that, they identified four application areas for prototyping, and I'll be talking more about those later in the module.

Bruce Eisenhart: So an activity here is to look at the populations for which ATTRI is meant.

Bruce Eisenhart: The question is, "Which one of these is not a key population meant to be served by ATTRI?" And the answer choices were: A) older adults; B) children; C) persons with disabilities; and D) veterans with disabilities.

Bruce Eisenhart: And reviewing the answers: the one that is not a part of ATTRI is children. Considering the needs of children is not a part of ATTRI. Older adults are one of the three groups they're looking at, as are persons with disabilities—that's a key population group—and also veterans with disabilities. We've had a lot of disabled veterans in the last decade from all of the wars that have been going on, and that's a growing group of people whose mobility issues we want to consider as they access the transportation system.

Bruce Eisenhart: The second part of this module is going to look at the ATTRI focus technology areas. This was the work done back in Phase 1 to identify some of the key technology areas that should be investigated as we look at creating transportation solutions for people with disabilities.

Bruce Eisenhart: There are five technology areas that I'm going to cover in turn in the next few slides: Wayfinding and Navigation Solutions, ITS and Assistive Technologies, Automation and Robotics, Data Integration, and Enhanced Human Services Transportation.

Bruce Eisenhart: So I'll go through these one by one. Wayfinding and Navigation Solutions: a lot of smartphone-based navigation solutions are out there, and to some degree we use those today. The issue is designing them for people with blindness, low vision, cognitive issues, or mobility issues. The idea is to be able, with those devices, to provide pre-trip and en route traveler information—and not just as an app, but also including crowdsourced information that might feed into it. Beacons or electronic tags could let travelers interact with the built environment; think of people who have vision challenges and how they might interact with intersections through beacons or tags—things like that. Multiple communications formats: visual, audible, haptic—which relates to touch—including in multiple languages. These are the kinds of things that navigation technology can support. Another one is wearable technology; wearable, but it needs to be discreet. The wearable technology can connect with devices already in use—for example, a haptic device that might interact with a white cane that people who are blind are using. And the third aspect of this is something called community navigators. Community navigators are a concept in the medical care field about providing support for patients with mental health or drug issues, basically helping them to secure services; for ATTRI, the concept could extend to community volunteers who provide more detailed information about neighborhoods so that people with disabilities can access it.

Bruce Eisenhart: As I mentioned with community navigators, the idea is people who could help provide additional layers of information around the route-finding that travelers might do through a neighborhood. Under this wayfinding and navigation solution, some of the technology examples go beyond the devices we're used to that give us routes. There's the idea of indoor wayfinding devices—devices that might allow us to move inside of buildings; the example there shows something inside a hospital. Wearable devices that could provide guidance are another type of device you can see there.
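
(Note: The sketch below is a minimal, hypothetical illustration of how a wayfinding app might map beacons or electronic tags detected near a traveler to spoken descriptions of the built environment. The beacon IDs, descriptions, and threshold values are illustrative assumptions, not part of any ATTRI specification.)

```python
# Hypothetical sketch: mapping detected beacons to accessible wayfinding cues.
# Beacon IDs, descriptions, and thresholds are illustrative assumptions only.

from dataclasses import dataclass

# A small, made-up registry of beacons installed in the built environment.
BEACON_REGISTRY = {
    "beacon-001": "Crosswalk entrance, northeast corner of Main St and 1st Ave",
    "beacon-002": "Elevator to subway mezzanine, doors open on the left",
    "beacon-003": "Temporary construction: sidewalk closed, detour to the east side",
}

@dataclass
class BeaconReading:
    beacon_id: str
    rssi_dbm: int  # received signal strength; higher (less negative) = closer

def nearest_announcement(readings: list[BeaconReading],
                         min_rssi_dbm: int = -70) -> str | None:
    """Return a description of the closest known beacon, if any is nearby."""
    nearby = [r for r in readings
              if r.rssi_dbm >= min_rssi_dbm and r.beacon_id in BEACON_REGISTRY]
    if not nearby:
        return None
    closest = max(nearby, key=lambda r: r.rssi_dbm)
    return BEACON_REGISTRY[closest.beacon_id]

if __name__ == "__main__":
    scan = [BeaconReading("beacon-003", -58), BeaconReading("beacon-001", -82)]
    message = nearest_announcement(scan)
    if message:
        print(f"Announce: {message}")  # delivered as speech, text, or a haptic pattern
```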

Bruce Eisenhart: The second area is ITS and Assistive Technologies. ITS provides a broad range of wireless- and sensor-based communications. Real-time situational awareness: think of real-time situational awareness in transit systems. Information about the systems: next bus, route status, availability of accessible capacity. Data from users to the system: being able to make reservations and to request the specific kinds of accessibility equipment that are part of your access through the transit system. Accessible, assistive, and adaptive devices: some of these could be in the connected vehicle area—where vehicles can assist in the driving—all the way up to autonomous vehicles. Information in accessible communications formats: audible, visual, touch, haptic—different types of communications support. There are also connected vehicle technologies that provide support for pedestrians: adaptive pedestrian signal timing, or emergency vehicle and safety alerts—basically the idea of a safety alert coming from the emergency vehicle to your handheld device or smartphone, which can then give you an indication that there's an emergency vehicle in the area. So there are a lot of different technologies from ITS and the assistive technologies.

Bruce Eisenhart: A few examples. I mentioned the connected vehicles. Smartphones, watches, or glasses could interface with vehicles, with the infrastructure, with pedestrians, or with each other. Think vehicle-to-vehicle if someone is in a vehicle, or vehicle-to-infrastructure, or vehicle-to-pedestrian, or pedestrian-to-infrastructure. All of those things are possibilities when we look at ITS and assistive technologies going through the network.

Bruce Eisenhart: The next area of technology is called Automation and Robotics. The goal here is to improve mobility for those who are unable or unwilling to drive, and to enhance independent, spontaneous travel capabilities. Vehicle automation technology could be used to solve some of those first mile/last mile mobility issues and allow people to access the regular transit system directly from their homes. In the area of robotics, think of collaborative robots providing concierge services—robotic help to get you completely through the transportation network, to help you with activities of daily life such as walking, but also to connect you up to human transportation services. Machine vision, artificial intelligence, assistive robotics—even things like facial recognition software—all to ease people's ability to go end-to-end through the transportation network.

Bruce Eisenhart: Some of the technology examples along this line: shared autonomous vehicles. The one on the right there is a shared autonomous vehicle from Japan that moves people around a campus setting. Assistive robotics shows the idea of a robotic device—almost like a fancy wheelchair there—that will allow people with limited mobility to get through some part of the transportation network.

Bruce Eisenhart: The next area of technology is called Data Integration—enabling the integration of data and information systems to give more in-depth accessibility information. We all have the ability to plan a route from Point A to Point B, but imagine if you're blind and you need to know not just how to get from A to B, but what infrastructure you have to go past—what the points of interest are. Are there amenities? Where are the potential obstacles, either in real time or static obstacles that exist as you're trying to go from A to B? Think about maps with layers beyond what we're used to, layers that will provide the additional level of information that a mobility-challenged person might need to successfully navigate. One of the ways this could happen is through expanded user profiles for persons with accessibility needs that are usable by service providers. Profiles would allow providers to customize the service so that they could help people get from their Point A to Point B. Maybe they add additional alerts. Maybe they add that additional layer of map information based upon a user profile that says what kind of information you need to support your trip through the network.

Bruce Eisenhart: One of the common examples of data integration is mobile apps that can integrate user mobility with accessibility needs across different modes, and some of these already exist for regular travelers. The trick is to be able to expand these so that they work for people with disabilities as well.

Bruce Eisenhart: And finally, Enhanced Human Service Transportation: real-time multimodal trip and service planning and traveler decision support applications for accessible transportation solutions. Integrating multimodal options and giving you both pre-trip and en route traveler information. Making that connection from paratransit to fixed route—linking paratransit demand-response and fixed-route service to increase flexibility and options for travelers with disabilities. And having integrated payment systems where your data is not dependent upon the trip purpose; it's just origin/destination-related, which provides privacy, but you get real-time capabilities to enhance that trip with additional capabilities that might be needed by the person to get from one place to another. Integrated payment—think of a single smart card that would allow you to use multiple different modes and multiple different systems to go end-to-end. You don't have to carry a variety of different cards; and maybe it's not only a smart card, but a smart card that's integrated with your wheelchair, for example, so you don't have to have an extra device that you carry with you.

Bruce Eisenhart: A technology example for this enhanced human service transportation: smart cards or mobile apps that allow you to pay for transit services, and applications that can link various transit services together—and some of those do exist currently in the United States. So those are the five technology areas that ATTRI looked at. What are some of the applications and solutions people might develop out of those areas?

Bruce Eisenhart: And the following question is, “Which area was not identified as one of the five ATTRI technology areas?” And the answers were: wayfinding and navigating systems; integrated payment; automation and robotics; or data integration.

Bruce Eisenhart: And the one that was not among the five I talked about was integrated payment. It's actually considered a foundational consideration for ATTRI—which I'll talk about in the next learning objective—but it wasn't one of the technology areas. The five technology areas included wayfinding and navigation systems, automation and robotics, as well as data integration. Those were three of the five technology areas. So that's a little bit about the technology areas that ATTRI was considering when it started the program back in 2014-2015.

Bruce Eisenhart: I’m going to now talk about the final learning objective: the ATTRI foundational considerations, application areas that were chosen, and then applicable standards that would apply to those application areas.

Bruce Eisenhart: Here’s a picture that ATTRI created, looking at the foundational considerations. Basically, what this means is all of the ATTRI applications that they’re going to look into should include these four cross-cutting considerations. They are: standard accessible data platform, universal design standards, integrated payment, and leveraging existing technologies. I’ll walk through these one by one and talk about what ATTRI means when they’re talking about these particular activities.

Bruce Eisenhart: The first is a standard accessible data platform. The idea here is to have access to real-time situational data sources. You'd like to provide people almost ubiquitous access to a wealth of real-time situational data, including data specific to transportation systems, municipalities, points of interest, crowdsourced information, and accessibility data. Data standardization and interoperability are critical for developing these applications so that they can be used across multiple platforms and multiple systems. Now, when I use the word "interoperability," let me define it. It's the ability of systems to provide services to, and accept services from, other systems, and to use the services exchanged to operate effectively together. Interoperability is quite important because it simplifies developing ITS systems and procedures and allows ITS tasks to be performed consistently. An example of interoperability, to take a non-transit example, is being able to use the same toll tag on multiple toll roads. But interoperability is a key, and the idea is that whatever applications come out of this should have elements of interoperability. Data must work across service providers: if you're going to utilize real-time data, you have to be able to communicate it across multiple applications. And to do this, as part of the standardization, you need some standardized data to create user profiles, so that you can get smoother access and transfer between accessible transportation services. So, for example, a personal mobility vehicle summoned to pick someone up from the curbside would know about the individual's disability type and tailor the services accordingly. One of the key areas where a standard accessible data platform can help is the idea of extending services out of the cities—where there are so many services—to nearby or rural populations, which don't tend to have nearly as many transportation services but contain many travelers with disabilities. Providing an interface between the urban transit agencies and the smaller private demand-response services requires standardization of information, or standardization of a data platform.
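
(Note: Below is a minimal sketch of what a standardized, shareable traveler accessibility profile might look like, so that different service providers could tailor a trip to the individual. The field names, values, and the helper function are illustrative assumptions for this module; ATTRI has not defined such a schema.)

```python
# Hypothetical sketch of a standardized accessibility profile that multiple
# service providers could consume to tailor a trip. Field names are assumptions.

import json

traveler_profile = {
    "profile_version": "0.1-draft",        # made-up version tag
    "disability_types": ["mobility"],      # vision | mobility | hearing | cognitive
    "mobility_aids": ["power_wheelchair"],
    "needs": {
        "wheelchair_securement": True,
        "ramp_or_lift_required": True,
        "max_walk_distance_m": 50,
        "preferred_formats": ["audio", "large_text"],
    },
}

def vehicle_can_serve(profile: dict, vehicle_features: set[str]) -> bool:
    """A demand-response dispatcher might run a check like this before
    assigning a vehicle summoned to the curbside."""
    required = set()
    if profile["needs"].get("ramp_or_lift_required"):
        required.add("wheelchair_ramp")
    if profile["needs"].get("wheelchair_securement"):
        required.add("securement_positions")
    return required.issubset(vehicle_features)

if __name__ == "__main__":
    print(json.dumps(traveler_profile, indent=2))
    print("Vehicle OK:",
          vehicle_can_serve(traveler_profile,
                            {"wheelchair_ramp", "securement_positions"}))
```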

Bruce Eisenhart: Universal design standards—new applications or leveraging of existing applications. These should apply to all user groups: not just people without disabilities, but people with normal mobility and people who have mobility challenges as well. Universal design standards incorporate a philosophy that promotes the applicability of technical solutions to the needs of all user groups. The idea is that solutions have to address multiple communications formats and multiple user interfaces—interfaces that can support people with various disabilities. Likewise, this also incorporates the idea of user profiles and documenting the needs of different stakeholder groups and ability groups to get through the transportation systems. Further, developers could focus on creating the same experience to enable transportation sharing using smartphones or wearable devices like smartwatches and fitness trackers, because travelers are increasingly using these kinds of devices in pedestrian environments.

Bruce Eisenhart: The next foundational consideration is integrated mobile payment—payment for transportation usable by travelers of any ability, and interoperable across all different modes. Interoperable electronic fare payment can be utilized across various modes by all travelers—including those with disabilities—at all times, and for multiple different purposes. Integrated payment solutions should accommodate all users, including those with mobility, vision, hearing, and cognitive disabilities. Where possible, payment solutions should integrate with an application or device—for example, the idea of having the technology embedded in a power wheelchair or maybe in a robotic device. The figure there shows the Clipper card, an example of an integrated payment card used on multiple different systems within the San Francisco Bay Area.

Bruce Eisenhart: And the final foundational consideration is leveraging existing technology: apply existing technology to user needs—existing technology from ITS, from on-demand technologies, from mobile technologies such as smartphones, from assistive technologies, and from wearables. The key here is to use what exists or what is currently under development—we're not out to start from scratch and develop things that have never been developed before—basically taking things that are already developed or currently in development. Those are the four foundational considerations that ATTRI looked at before it chose a set of application areas to prototype. So next, I'm going to talk about these application areas, and when I get to those, I'm going to start talking about the standards aspects as well.

Bruce Eisenhart: How did ATTRI choose the application areas that would be considered? Well, they obtained a variety of inputs in three different ways. In 2015, they listened to inputs through user needs webinars—there were three webinars with 700 participants—plus a user needs workshop in March of 2015 with 70 attendees that looked in detail at the user needs. And actually, as part of the workshop, they looked at different application areas and voted on them, and the application areas they picked to go further with came partially out of that voting. Early on, they did a technology scan to identify current practices in accessible transportation, assistive technologies, applications, and systems for travelers with disabilities. And finally, they put out an RFI in 2015 to obtain informed views on the opportunities and challenges in the development, deployment, operation, and use of accessible transportation applications.

Bruce Eisenhart: So using all of those, they identified the four top application areas that we’re going to move forward with in the ATTRI project. They are: pre-trip concierge and visualization; smart wayfinding and navigation; shared use, automation, and robotics; and safe intersection crossing. I’m going to go through each one of these areas, talk about the application area, and then talk about relevant standards within each of those application areas. So first let’s talk about pre-trip concierge and visualization.

Bruce Eisenhart: The idea with the pre-trip concierge, obviously, is to provide pre-trip traveler information, but interestingly they considered en route traveler information as well when they were actually describing these application areas. The key thing is that it's designed for people who are blind, have low vision, cognitive issues, or mobility issues. The second part of this is visualization: allowing passengers to see their entire route with an app that includes landmarks and any barriers they would have along the way. The idea of visualization is to remove the fear and facilitate independent mobility, giving you contextual details that could be augmented by voice overlays—the idea of a virtual caregiver to help plan routes and track traveler movements, and possibly providing connectivity directly to a caregiver or a family member. Think of the virtual caregiver as voice-assisted movement through the network, where the voice is a family member's voice, providing someone the comfort of moving through the network with a family member directing them to go here, there, and beyond. And not only directing them offline, but also actually having connectivity to a caregiver or a family member as they're moving through the network.

Bruce Eisenhart: Some of the application examples that ATTRI looked at when they picked this application area were things that could assist in everyday activities—basically the pre-trip concierge and visualization helping you walk or get to work. Something that would have the ability to learn and remember routes. Something that could integrate different modes with accessibility options, distances, and travel times, looking at first mile/last mile options. I mentioned virtual exploration devices to help the visually impaired—think of somebody who is visually impaired being able to get a detailed plan for their trip that would include not only which streets to turn on, but which stairs to take if you have to go down into an underground system. What barriers might you have to go past? Is there construction in the street where the sidewalk is temporarily closed, so you have to move around that—say, cross the street—to get by? The idea is to allow people to get a much more detailed view of their path of travel before they actually make the trip. I mentioned the voice overlays that can include family members. We're sort of familiar with voice-assisted applications—think Apple's Siri—but here the idea is having a family member as a voice overlay helping to direct a person as they go through the network. And finally, they even have an idea of using emojis for accessible transportation. The figure on the right there shows an application that was developed to help people get around Penn Station in New York City, which, if you've ever been there, is an incredibly large and complex structure. Now, this original application wasn't focused just on people with disabilities; it was focused on anybody who needs to get around there. But think of enhancing that, so that people with disabilities would have the ability to get themselves around these large, complex transit hubs that exist in cities.
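
(Note: The following is a minimal sketch of how a pre-trip "visualization" plan might be represented as data: an ordered list of segments annotated with landmarks, known barriers, and an optional voice overlay that a family member could record. The structure, field names, and sample text are illustrative assumptions, not an ATTRI design.)

```python
# Hypothetical sketch of a pre-trip plan with landmarks, barriers, and a
# caregiver/family voice overlay. All names and values are illustrative.

from dataclasses import dataclass, field

@dataclass
class Segment:
    instruction: str                  # e.g., "Walk north on 7th Ave for 120 m"
    landmarks: list[str] = field(default_factory=list)
    barriers: list[str] = field(default_factory=list)   # static or real-time
    voice_overlay: str | None = None  # optional family-member narration

def narrate(plan: list[Segment]) -> list[str]:
    """Flatten the plan into the messages a concierge app might speak,
    preferring the personal voice overlay when one exists."""
    lines = []
    for i, seg in enumerate(plan, start=1):
        text = seg.voice_overlay or seg.instruction
        if seg.barriers:
            text += " Caution: " + "; ".join(seg.barriers) + "."
        lines.append(f"Step {i}: {text}")
    return lines

if __name__ == "__main__":
    plan = [
        Segment("Walk north on 7th Ave for 120 m",
                landmarks=["bakery on the left"],
                barriers=["sidewalk narrowed by scaffolding"],
                voice_overlay="It's Mom. Head toward the bakery we like, slowly."),
        Segment("Take the elevator down to the subway platform",
                landmarks=["elevator just past the turnstiles"]),
    ]
    for line in narrate(plan):
        print(line)
```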

Bruce Eisenhart: So for pre-trip concierge and visualization, what are some of the standards that would apply? Basically, we're looking at ITS data standards that apply to static or real-time transportation data. I'm going to just list them here, and then I'll have a chart for each one of these to expand upon them. Static transit data is contained in either GTFS or TCIP, and I'll describe and define GTFS on the next chart. There are two different standards that cover real-time data: GTFS-realtime and SIRI. And there's a standard that covers traffic conditions—as a way of improving transit trips through the transportation network by understanding what the traffic conditions are out there—and it's called TMDD. Each of these has its own modules, either in the PCB transit standards or, in the case of TMDD, in the original PCB standards effort from a couple of years ago. I'll reference those as I go through.

Bruce Eisenhart: So let's first talk about GTFS. That stands for the General Transit Feed Specification; it was originally called the Google Transit Feed Specification. It's static data that shows routes and the schedules on those routes. It was originally developed by Google, and it's still maintained by Google—although there's a user community that works on changes to it, Google is still the maintainer. I talk about standards, and GTFS occupies kind of an unusual position there. It isn't technically a standard, because it's not developed through a standards body or some deliberative standards organization. It's a specification, but it's a specification that is widely used—it's now used by literally thousands of transit agencies in the U.S. and worldwide. It's primarily used to support trip planning. When it was originally developed, Google did this with the intention of agencies providing transit routes and schedules to Google Transit, so you could get on Google Transit and find routes and schedules for different transit agencies. That was really the primary use of it in its early days. It was originally developed in conjunction with TriMet, which is a big transit agency in Portland, Oregon. But in the last few years, more and more transit agencies have started to provide feeds in GTFS format that can be used by any third-party application, and third-party applications have been developed that use this information to provide a variety of trip planning capabilities. The supplement icon indicates there's additional information about GTFS in the student supplement. There is an entire transit PCB module—Module 14, Part 1—that covers GTFS in much greater detail. So if you care to know more about GTFS and how your agency might apply it, I would point you to transit PCB Module 14, Part 1.
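
(Note: To make the idea of a GTFS feed concrete, here is a minimal sketch of reading one of the CSV files a GTFS static feed contains. The sample rows are made up; the file and column names follow the GTFS specification, including the optional wheelchair_boarding field in stops.txt, which is directly relevant to accessibility.)

```python
# Minimal sketch: reading one of the CSV files inside a GTFS static feed.
# The sample rows are invented; file and column names follow the GTFS spec.

import csv
import io

STOPS_TXT = """stop_id,stop_name,stop_lat,stop_lon,wheelchair_boarding
1001,Main St & 1st Ave,38.8977,-77.0365,1
1002,Main St & 3rd Ave,38.8990,-77.0330,2
"""

def accessible_stops(stops_csv: str) -> list[str]:
    """Return names of stops flagged as wheelchair accessible (value '1')."""
    reader = csv.DictReader(io.StringIO(stops_csv))
    return [row["stop_name"] for row in reader
            if row.get("wheelchair_boarding") == "1"]

if __name__ == "__main__":
    print(accessible_stops(STOPS_TXT))   # ['Main St & 1st Ave']
```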

Bruce Eisenhart: The other transit standard that handles static data is TCIP, which stands for Transit Communications Interface Profiles. This was originally developed as part of the NTCIP suite of standards—and I'll talk more about those in a later chart—but a few years ago it was picked up by APTA, the American Public Transportation Association. They are now the publisher and the SDO behind TCIP. TCIP defines standardized interfaces for the exchange of information or data among transit business systems, subsystems, components, and devices, primarily intended for intra-agency use. Most of the interfaces are internal to the agency, but there is one area of TCIP relevant to this discussion: the passenger information area, which covers static schedules and routes—the idea of providing static schedules and routes out to passenger devices, passenger signs, or things like that which you might have in the transit agency. You can find more information on this in the supplement. And, in fact, the PCB transit standards course has two modules that give you a detailed overview of TCIP—Modules 3 and 4. So if you want to know more about TCIP, I refer you to those two modules under the transit PCB standards.

Bruce Eisenhart: The next one I want to talk about is GTFS-realtime. As the name implies, it's a real-time counterpart to GTFS. It was launched in 2011 with six cities initially—Portland, Oregon; San Francisco; San Diego; Boston; Madrid, Spain; and Turin, Italy—so it's international in nature. It was intended to provide up-to-the-minute transit information for passengers. It was developed as a feed specification, much like the original GTFS, but as something to provide real-time information en route. Like GTFS, it is maintained by Google. Again, like GTFS, it is not technically a standard; it is a specification. And one of the things about specifications is that they tend to evolve more quickly, because they don't go through a laborious standards update process. So if you go and look online—and you can find the online references in the supplement—you'll probably find the most current version is from a month or two ago, because they routinely make small changes to it. It's not as widely deployed as GTFS, but the number of deployments is increasing dramatically, and every year a few more agencies use GTFS-realtime as a way to put out their real-time bus route information. GTFS-realtime has basically three types of information. The first is something called a trip update, which provides an estimated time of arrival for stops on a trip. This answers the question of when a transit vehicle is going to arrive at or depart from a particular stop, so it's stop-oriented. The second is vehicle positions—where are particular vehicles along the route? This is real-time bus location, if you will. And the final type of information is called alerts: are there any planned or unplanned events affecting service, incidents, and things like that? This answers the question of whether something is affecting the service you expect on a particular line. Again, it was originally designed to support Google Transit, but now agencies are starting to publish this as a feed that third-party applications can access and make use of.
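
(Note: As a sketch of how a third-party application consumes a GTFS-realtime feed, the example below uses the open-source gtfs-realtime-bindings package (pip install gtfs-realtime-bindings requests) to read trip updates. The feed URL is a placeholder; substitute the feed your agency actually publishes.)

```python
# Sketch of reading a GTFS-realtime trip-update feed with the open-source
# gtfs-realtime-bindings package. The feed URL below is a placeholder.

import requests
from google.transit import gtfs_realtime_pb2

FEED_URL = "https://example.com/agency/gtfs-rt/trip-updates"  # placeholder

def print_next_arrivals(feed_url: str, stop_id: str) -> None:
    """List estimated arrival times (POSIX seconds) for one stop."""
    feed = gtfs_realtime_pb2.FeedMessage()
    feed.ParseFromString(requests.get(feed_url, timeout=10).content)
    for entity in feed.entity:
        if not entity.HasField("trip_update"):
            continue  # a feed may also carry vehicle positions and alerts
        for stu in entity.trip_update.stop_time_update:
            if stu.stop_id == stop_id and stu.HasField("arrival"):
                print(entity.trip_update.trip.trip_id, stu.arrival.time)

if __name__ == "__main__":
    print_next_arrivals(FEED_URL, stop_id="1001")
```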

Bruce Eisenhart: The next standard in the real-time data area is called SIRI for short. The full name of the standard is in that first bullet: "Service Interface for Real-time Information Relating to Public Transport Operations"—shortened to SIRI. This is a standard under formal SDO control, put out by the European Committee for Standardization, which is known as CEN. It says real-time data, but it actually provides a combination of both real-time and static data. There is increasing deployment of this in the U.S. I don't think it's as widely deployed as GTFS, but it is possibly more widely deployed than GTFS-realtime. It has what are called "functional services" that it covers; as I talk through these, you'll find this is actually a more generalized standard that covers things beyond what GTFS-realtime would cover. The production timetable is basically the static schedule: it provides information about routes, stops, and schedules along those routes and stops. The estimated timetable is the real-time aspect of that—the actual progress of vehicle journeys operating specific service lines, giving you expected arrival and departure at specific stops on a planned route. So this gives you something like what you get with GTFS-realtime: what's the expected arrival and departure at specific stops? So it's kind of stop-based. There is a stop timetable, which is a static, stop-centric view; it can be used to reduce the amount of information that needs to be transmitted in real time. So some people choose to use a stop timetable—here are the stops, and when the bus is supposed to be at each stop. And then there's a set of stop monitoring messages that tell you when the bus is actually expected at that stop—a stop-based timetable in real time, if you will. Then there's a vehicle timetable, which gives you the static schedule of where vehicles are supposed to be on routes, and vehicle monitoring, which tells you where the vehicle actually is on the route. They also have a connection timetable and connection monitoring, which allow you to provide information about the scheduled arrival of a feeder vehicle to the operator of a connecting distributor service, so they can then guarantee the connection—which supports something called connection protection if you do both the timetable and the monitoring. So: when are buses supposed to arrive at feeder locations, and when are they actually expected to arrive? It focuses just on the connections. We are seeing increasing deployment of this within the U.S.; it's not uncommon to find agencies using data based on SIRI as opposed to GTFS-realtime.
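
(Note: SIRI services such as StopMonitoring are typically delivered as XML. The sketch below pulls expected arrival times out of a small, fabricated StopMonitoring response using only the standard library. The element names follow the CEN SIRI schema, but the sample document is invented and the namespace and structure can vary by deployment and version.)

```python
# Sketch of extracting expected arrivals from a SIRI StopMonitoring response.
# The sample XML is invented; element names follow the CEN SIRI schema.

import xml.etree.ElementTree as ET

SAMPLE = """<Siri xmlns="http://www.siri.org.uk/siri">
  <ServiceDelivery>
    <StopMonitoringDelivery>
      <MonitoredStopVisit>
        <MonitoredVehicleJourney>
          <LineRef>42</LineRef>
          <MonitoredCall>
            <ExpectedArrivalTime>2016-12-01T14:05:00-05:00</ExpectedArrivalTime>
          </MonitoredCall>
        </MonitoredVehicleJourney>
      </MonitoredStopVisit>
    </StopMonitoringDelivery>
  </ServiceDelivery>
</Siri>"""

NS = {"siri": "http://www.siri.org.uk/siri"}

def expected_arrivals(xml_text: str) -> list[tuple[str, str]]:
    """Return (line, expected arrival time) pairs from a StopMonitoring reply."""
    root = ET.fromstring(xml_text)
    out = []
    for visit in root.findall(".//siri:MonitoredStopVisit", NS):
        line = visit.findtext(".//siri:LineRef", namespaces=NS)
        eta = visit.findtext(".//siri:ExpectedArrivalTime", namespaces=NS)
        out.append((line, eta))
    return out

if __name__ == "__main__":
    print(expected_arrivals(SAMPLE))  # [('42', '2016-12-01T14:05:00-05:00')]
```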

Bruce Eisenhart: And the final one here is a little bit different. It's the traffic data standard called the Traffic Management Data Dictionary. Although it's called a data dictionary, it actually defines data elements; data frames, which are made up of data elements; messages, which are made up of data frames and data elements; and dialogs, which define sequences of messages. It is currently in version 3.03. It is jointly maintained by ITE—the Institute of Transportation Engineers—and AASHTO—the American Association of State Highway and Transportation Officials. It is a center-to-center standard for exchanging transportation information between traffic management centers and other centers. So this one is not focused outward on passengers; it's more of a center-to-center kind of standard, but it is a key piece of gathering the overall transportation network situational data that's relevant to a lot of the things we're talking about. This is how you get some of that situational data from the transportation network: real-time information about road network conditions and about incidents. It's widely deployed by state transportation departments, and it supports trip planning as well. There's more information on this—as well as directions on how to find the standard—in the supplement. Note also that the supplement calls out three different PCB training modules that were developed to support it: Modules 12, 16, and 30, which are part of the original set of PCB Standards Training Modules that cover not transit, but more traffic-related activities. So those are the standards that address that first area, the pre-trip concierge and visualization. Notice there were no specific standards on visualization; none have been developed.

Bruce Eisenhart: Now let's look at smart wayfinding and navigation systems. For navigation systems, obviously we have smartphone-based navigation systems. Most people are used to the outdoor use, but we also have indoor use—I showed you the example before of the wayfinding in Penn Station. Beacons or electronic tags let you interact with the built environment. Multiple communications formats—including multiple languages—are all part of the navigation systems, as are the wearable technologies: wearable, discreet assistive navigation tools that can connect with assistive devices like wheelchairs, smartphones, and canes—things like that. And then I mentioned the community navigators, who provide data about the neighborhood to help people navigate through it. You can see a picture at the bottom there—the idea of trying to navigate through a neighborhood that has temporary construction blocking a sidewalk. Imagine, if you're a blind person, how difficult it is to navigate where streets or sidewalks are closed off because of temporary construction.

Bruce Eisenhart: Some of the application capabilities—in this case, ATTRI defined not so much examples as capabilities that would be important for smart wayfinding and navigation systems. First is to recognize and detect stationary objects—doors, elevators, stairs, crosswalks, traffic lights. Obviously a key thing if you are trying to help people with visual issues get through a network. Read and recognize important text and signage based on the user's query—imagine there is temporary signage up there; how is a person supposed to know what it says? Well, you basically have something that is able to read the signage and provide it verbally to the user. Detect, track, and represent moving objects: we often talk about the static issues that people have, but how about moving ones—dynamic changes to a traveler's environment such as people, shopping carts, doors opening, and moving vehicles—all things technology could help a person with vision issues navigate. Provide one-button push notification of location, which allows you to send location information from your smartphone to a bus or a van. So think about connecting a person with disabilities to that first mile/last mile or to a demand-response service—some way for the person, with a device they have like a smartphone, to just say, "Here I am. I'm on the other corner, not on this corner." And again, wearable sensors: think about things like cameras, three-dimensional orientation devices, and pedometers that can be used in conjunction with display units to provide auditory or tactile guidance.
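
(Note: Here is a minimal sketch of what that "one button push" location notification could look like as a single call from a smartphone to a demand-response dispatcher. The endpoint URL and the payload fields are hypothetical, introduced only for illustration.)

```python
# Sketch of a one-button "here I am" notification from a traveler's phone to a
# demand-response dispatcher. The endpoint URL and payload fields are hypothetical.

import json
import urllib.request

DISPATCH_URL = "https://example.com/dispatch/pickup-requests"  # placeholder

def notify_pickup(lat: float, lon: float, profile_id: str, note: str = "") -> int:
    payload = json.dumps({
        "profile_id": profile_id,   # could link to a standardized accessibility profile
        "latitude": lat,
        "longitude": lon,
        "note": note,               # e.g., "I'm on the other corner"
    }).encode("utf-8")
    req = urllib.request.Request(DISPATCH_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status          # a 2xx status means the bus or van was notified

if __name__ == "__main__":
    # The coordinates would normally come from the phone's GPS.
    print(notify_pickup(38.8977, -77.0365, "rider-123",
                        note="Waiting at the far corner, power wheelchair"))
```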

Bruce Eisenhart: Some of the standards that apply here when we talk about navigation systems are really the same kinds of standards we looked at for the pre-trip concierge—that same set I've already talked about: GTFS, GTFS-realtime, SIRI, and TMDD. I mentioned that wearable technologies were an aspect of this wayfinding and navigation area, and interestingly, ISO is developing standards for what are called haptic and tactile interactions. I'll talk a little more about this on the next slide.

Bruce Eisenhart: ISO is developing a set of standards in the ISO 9241-9xx series. The only one that they've actually published so far is 910—ISO 9241-910, titled "Framework for Tactile and Haptic Interactions." It contains terms, it contains interactions, and it contains a description of the devices needed to support these. There's a subtle distinction between haptic and tactile: haptic covers all forms of touch sensation, while tactile is a more specific term that the standard basically defines as the mechanical stimulation of the skin—think vibration in a device. Haptic is the more general term they use for other kinds of touch-type sensing people could have. So if you're interested in wearable technology, there is an ongoing activity here, and they expect in the coming years to develop multiple ISO standards that address different aspects of this wearable technology.

Bruce Eisenhart: And one additional standards activity that is quite recent is something called the Wayfindr Standard. This is an open standard for digital wayfinding on mobile devices through audio-based navigation—basically standardizing the audio navigation one would use to help a blind or visually impaired person through the transportation network. It's been developed by a not-for-profit venture of a company called ustwo—a digital product developer that, interestingly, has a couple of well-known video games—and RLSB, the Royal London Society for Blind People. So this is an effort that started in the U.K., and they have developed an open standard that you can access. It has a set of design principles, guidelines for navigation instructions—audio-based navigation—and a discussion of some technology best practices, and they actually have a demo mobile app called Wayfindr that supports giving people audio-based navigation. As I looked at it, it appears to be an app for places like the London Underground system, which, if any of you have ever been there, is an incredibly maze-like thing you have to walk through to get from one place to another in all of those underground stations. So if you're interested in audio-based navigation, there is this very recent standard whose candidate recommendation literally came out this month—December 2016. So that is hot off the presses, as it were.
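
(Note: In the spirit of audio-based wayfinding guidance that favors short, single-action instructions, here is a small illustrative sketch of composing such messages. The phrasing templates are our own illustration and are not text taken from the Wayfindr documents.)

```python
# Illustrative sketch of composing short, single-action audio instructions.
# The phrasing templates are illustrative, not taken from any standard.

INSTRUCTION_TEMPLATES = {
    "turn":     "Turn {direction} in {distance_m} meters.",
    "stairs":   "Stairs ahead, {count} steps going {direction}. Handrail on the {side}.",
    "platform": "You are on the {line} platform. The next train arrives in {minutes} minutes.",
}

def build_instruction(kind: str, **details: object) -> str:
    """Keep each spoken message to one action so it can be repeated on demand."""
    return INSTRUCTION_TEMPLATES[kind].format(**details)

if __name__ == "__main__":
    print(build_instruction("turn", direction="left", distance_m=10))
    print(build_instruction("stairs", count=12, direction="down", side="right"))
```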

Bruce Eisenhart: The next application area is Shared Use, Automation, and Robotics, and the slide shows a few examples of that. Assistive and collaborative robotics to enhance mobility can not only assist with activities such as walking, but could also work with individual travelers and human transportation services to provide some of that concierge kind of activity for different stages of travel—maybe improving personal mobility as they go. The ability to plan and execute trips and associated services. Transformative transportation alternatives—some unusual things that could be done, all the way up to the idea of automated vehicles. In fact, I saw just the other day that a Google car—you see a picture of the Google automated car there—transported a blind person as the only occupant of the automated vehicle. There have been several robotics-type activities defined: a "robo-ped"—automated robotic characterization of pedestrian zones—which could finally provide some robotic support for people crossing intersections, like a crosswalk assistant; and slow-speed automated vehicle connectivity—that's the SAV down there in the bottom right corner; you saw that once before. And of course, in the middle is the infamous Google car—or famous, I should say.

Bruce Eisenhart: Some of the applications of autonomous vehicles to transit: here's a recent diagram that came from material ATTRI published in Public Roads—if you're a person that reads the magazine Public Roads. The idea is that it's kind of a personal mobility device running autonomously. It could support first mile/last mile, and it suggests applications in controlled areas—think a university or business campus—getting you from the door of your business, over the hundred yards or whatever it might be, to where the actual bus stops to pick you up. In this area of automation and transit, there was also a very large project in Europe called CityMobil2. It was an EU effort that considered automation and transit, and it concluded in June 2016. So if you want to learn more about some of the activities over there and how they used automation to support people moving through transit, check out CityMobil2.

Bruce Eisenhart: When we start to think about the standards that relate to shared-use automation and robotics, let me first distinguish between automated and autonomous vehicles. Automation is a continuum of advances, and that's exactly what we have today in the present line of vehicles: each year, vehicles add some additional automation features. It's quite common now to have automated parking assist—basically the car parks itself for you; gone are the days of having to parallel park on your driver's test. Autonomous is the term used for the end state when the vehicle is self-driving. So think of automation as a continuum leading up to autonomous. In fact, SAE—the Society of Automotive Engineers—is a big player in this, and they have developed a definition of levels of automation that go all the way up to Level 5—and Level 5 is basically the autonomous vehicle that drives itself. The primary standards that apply here—the ones I'm going to talk about a little bit—are the dedicated short-range communications connected vehicle standards, SAE J2735 and J2945. These apply to data that goes in and out of a vehicle, and they're being used as part of the connected vehicle effort that's been going on for several years to increase the level of automation, as the slide shows. In addition, there are several other activities underway looking at various levels of automation all the way up to autonomous vehicles. The first, IEEE P2040, is an effort to develop standards that cover not only connected vehicles but also automated vehicles. J2735 covers connected vehicles, not really automated vehicles, but IEEE P2040 is looking at going beyond connectivity all the way up to autonomous. The other is SAE's On-Road Automated Vehicle (ORAV) activity. This is an SAE task force with the purpose of designing a reference architecture for autonomous wheeled ground vehicles—military and civilian, passenger vehicles, and trucks, on and off the road—looking at SAE automation Levels 3 to 5, where Level 5 is the completely autonomous vehicle and Level 3 is a highly automated vehicle. They're going to define this architecture, look at and possibly influence standards, and extend existing vehicle messages to enhance the interoperability of systems and components. This effort has been ongoing for a couple of years; the supplement tells you where to look if you want to know more. And finally, FHWA has an automated vehicle research program that is looking at some of the issues with vehicles as we move up towards the fully autonomous vehicle.

Bruce Eisenhart: I want to talk a little bit now about the connected vehicle—the data in and out of the vehicle that's supported by the SAE standards J2735 and J2945. But first, let me make a couple of definitions. I mentioned dedicated short-range communications. This is the idea of short-range—and by short range, we're talking about a kilometer—communications from the vehicle either to other vehicles or to roadside units. And to accommodate this, there's something called an on-board unit—or OBU—that broadcasts a set of basic data, such as the vehicle's location, speed, and direction of travel, and it can receive data from other vehicles or from the infrastructure. The infrastructure has something called a roadside unit that can receive that set of basic data from the OBUs on the vehicles, or it can broadcast data either to vehicles or to other mobile devices—and those other mobile devices could be smartphones that pedestrians are using. If you look at the figure there, it shows the idea of vehicles having this sort of wireless area around them; they can transmit and receive information, and there's been a whole variety of applications people have come up with that are safety-related, and some that are mobility-related, that are either vehicle-to-vehicle or vehicle-to-infrastructure.
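
(Note: To illustrate the "Here I Am" idea behind the basic safety message, here is a simplified sketch of a record of position, speed, and heading broadcast roughly ten times per second by an on-board unit. Real J2735 BSMs are encoded with ASN.1 (UPER), use scaled integer units, and carry many more fields; this is only an illustration, not the actual message format.)

```python
# Simplified illustration of the "Here I Am" concept behind the Basic Safety
# Message: a small record broadcast ~10 times per second by the on-board unit.
# Real BSMs are ASN.1/UPER encoded with many more fields; this is only a sketch.

import time
from dataclasses import dataclass, asdict

@dataclass
class HereIAm:
    msg_count: int      # rolls over 0-127 in the real message
    latitude: float     # decimal degrees (the real message uses scaled integers)
    longitude: float
    speed_mps: float
    heading_deg: float

def broadcast(readings, interval_s: float = 0.1) -> None:
    """Stand-in for the DSRC radio: here we just print each message."""
    for count, (lat, lon, speed, heading) in enumerate(readings):
        msg = HereIAm(count % 128, lat, lon, speed, heading)
        print(asdict(msg))
        time.sleep(interval_s)   # roughly 10 Hz, matching the typical BSM rate

if __name__ == "__main__":
    broadcast([(38.8977, -77.0365, 12.4, 92.0),
               (38.8978, -77.0361, 12.6, 91.5)])
```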

Bruce Eisenhart: So this standard, J2735, was developed and published by the Society of Automotive Engineers. It defines the messages and the data elements for connected devices. For vehicle-to-vehicle, there's something in there called the basic safety message. That's what I referenced on the page before—sometimes people call it the "Here I Am": What's your location? What's your speed? What's your direction? But it includes a lot of other information as well. In terms of infrastructure-to-vehicle, there's something called Signal Phase and Timing. The signal phase and timing message comes from the roadside and tells you the phasing and timing of the traffic signal controller as you approach the intersection: how many seconds will it be before the green light turns red, or before the red light turns green, as you're coming up to it? And there's another message defined in J2735 called the Traveler Information Message. This is a location-specific traveler information message that can be transmitted from the roadside to vehicles that come by, to provide traveler information that's very local in its interest and its use. One of the things driving interest in this is that NHTSA is going to release a rule—they just, in the last month, released the NPRM, the notice of proposed rulemaking—that would require every new vehicle sold in the United States to have this DSRC capability, with the BSM capability to transmit its location and other information about its movement. So a few years from now, when that rule is finalized—if it is, and it's moving in the direction that it will be—then all new cars will have the kind of capability we're talking about. So J2735 is a message set dictionary: it describes the messages and the data elements that make up those messages.

Bruce Eisenhart: The second standard—or series of standards—that SAE is developing is called SAE J2945, and the title is "DSRC Minimum Performance Requirements." So they're looking at the actual performance requirements needed for these different messages. The only one that has been published—and it's been published in draft version—is J2945/1, which covers the performance requirements for V2V safety applications. That was fast-tracked because they wanted to support the NHTSA NPRM that just came out, to define not just what the messages are, but the actual performance requirements to send information from vehicle to vehicle. They are working right now on several others. They're working on J2945/0, which is going to define some common requirements for DSRC and also set up a structure for how all the other documents will be done. Work is also ongoing on J2945/6, which covers performance requirements for cooperative adaptive cruise control and platooning. Cooperative adaptive cruise control is kind of a smart cruise control that can adjust the distance between your vehicle and other vehicles using vehicle-to-vehicle information—not just the radar information within a single vehicle, which is what most systems do today. And platooning allows multiple vehicles to travel basically in a string, with the stopping and starting movements directed through the transmittal of information from vehicle to vehicle. The other activity under J2945 is called "Performance Requirements for Safety Communications for Vulnerable Road Users." This is one that is certainly applicable to what we're talking about in ATTRI with people with disabilities. It's looking at the information exchanged between vehicles and handheld devices, or between the infrastructure and handheld devices—so it defines those data exchanges.

Bruce Eisenhart: The final application area is called Safe Intersection Crossing. The idea here is intersection crossing assistance for travelers: pedestrians can interface with traffic signals, with vehicles, or with other nomadic devices. We see an example there of a person with a wristwatch that could have a connection directly to the intersection, or a connection from that to vehicles. It's designed for people with blindness, low vision, cognitive issues, or mobility issues. The same kinds of things we talked about before apply—maybe beacons or tags that interact with the built environment, and multiple communications formats. The idea is to provide guidance, notifications, and alerts in those different formats to assist pedestrians in navigating safely through intersections, focusing on precise and concise information delivered when it's needed so they can adjust their movement through the intersection.
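
(Note: Here is a hedged sketch of how a crossing assistant might combine signal phase and timing information of the kind SPaT messages carry with a traveler's own walking speed to advise whether to start crossing. The field names, walking speeds, and safety buffer are illustrative assumptions, not values taken from the J2735 SPaT definition.)

```python
# Illustrative sketch of a crossing assistant: combine SPaT-style phase/timing
# information with the traveler's own walking speed. Names and thresholds are
# assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class CrossingPhase:
    walk_signal_on: bool
    seconds_remaining: float    # time left before the phase changes

def can_start_crossing(phase: CrossingPhase,
                       crossing_length_m: float,
                       walking_speed_mps: float,
                       buffer_s: float = 3.0) -> bool:
    """Advise starting only if the walk phase will outlast the crossing time
    plus a safety buffer, at this traveler's own pace."""
    if not phase.walk_signal_on or walking_speed_mps <= 0:
        return False
    time_needed_s = crossing_length_m / walking_speed_mps + buffer_s
    return phase.seconds_remaining >= time_needed_s

if __name__ == "__main__":
    phase = CrossingPhase(walk_signal_on=True, seconds_remaining=22.0)
    # A traveler using a mobility aid might walk at ~0.8 m/s instead of 1.2 m/s.
    print(can_start_crossing(phase, crossing_length_m=15.0, walking_speed_mps=0.8))
```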

Bruce Eisenhart: In terms of the standards to support this intersection area, there are really two sets—and one we've already talked about: J2735 and J2945. As I mentioned before, J2945/9 is looking at the interfaces between the infrastructure and mobile devices, or between vehicles and mobile devices. And because we're talking about intersections now, the NTCIP standards are being updated to address connected intersections.

Bruce Eisenhart: And here's a little information about NTCIP. It stands for the National Transportation Communications for ITS Protocol. It's a series of standards that have been around for quite some time, primarily addressing field device interfaces—think traffic signal controllers, dynamic message signs, CCTV cameras. NTCIP is created and maintained by a joint committee of AASHTO, ITE, and NEMA—the National Electrical Manufacturers Association. There are many different NTCIP standards, which are freely available—I think the supplement has information on that as well—but the one called NTCIP 1202, "Object Definitions for Actuated Signal Controllers," is the standard that relates to how signal controllers are operated—not that people in transit are going to get into this. And it is being updated to address connected intersections—the kind of thing we're talking about right now. So those are the standards that apply to the fourth of the application areas, Safe Intersection Crossing. And those are the four application areas.

Bruce Eisenhart: What is the road ahead? What's going on now? I mentioned in the very beginning that the output of Phase Two of ATTRI was going to be prototyping of the application areas. The way they've gone about this is that FHWA and FTA put out a broad agency announcement (BAA) that addresses three of the application areas: smart wayfinding and navigation systems, pre-trip concierge and visualization, and safe intersection crossing. It's a $7.5 million prototype program with multiple awards. Many people have provided responses to the broad agency announcement—the RFP that went out—which was due in August of 2016. Here we are in December 2016; they haven't yet announced the winners, but when they do, you can find that information on the ATTRI website, which is listed in the student supplement. You'll notice there were four application areas, but the BAA only addresses three of them. The fourth application area was addressed through an RFP that came out of a Health and Human Services group with the very long name of the National Institute on Disability, Independent Living, and Rehabilitation Research. They put out an RFP requesting that people develop application prototypes for automation and robotics to enhance accessible transportation. This particular one has maximum $500K awards, with a total of $2.5 million in the program—so a minimum, I guess, of five awards that they will give. It also was due at roughly the same time, in August of 2016, and they also have not yet announced the winners; look at the ATTRI website to see who those winners are. So in terms of the road ahead, the first step is prototyping. Sometime in the next month or two, they should announce the winners, and the work is approximately a year long, so 2017 will be the year for these prototypes to be developed in the four areas. Once that's done, Phase Three of ATTRI will kick in, and they will do integrated demonstrations and pilots of those different prototyped applications.

Bruce Eisenhart: So that's the road ahead for ATTRI. There are additional implementation issues that they're aware of—I can't say that they have a specific activity for each, but they know these are additional things that need to be addressed as part of ATTRI. One is integration with the planning process—the need for planners and others developing services to consider this as part of the coordinated human services transportation plan and service delivery efforts, and the need to include strategies to improve mobility as part of your basic transportation planning activities—improving mobility for these different population groups, including many who are isolated and underserved. In this particular area, there's an associated activity called Smart Cities, a $75 million FHWA research activity that had something on the order of 75 proposals submitted; they down-selected to a handful, went through an initial development activity, and in this past year made the final selection of Columbus, Ohio, as the winner. Columbus is going to go on to develop its actual transportation solutions. I mention it here because of this idea of underserved areas: one particular aspect of Columbus's winning proposal was that they were going to develop increased transit accessibility for areas of Columbus that are underserved, to allow people to get from those areas to jobs in other parts of the Columbus region. And while travelers with disabilities are an aspect of that—not the primary aspect—it has, in fact, been considered as part of this Smart Cities activity. Another issue is the need for new and expanded standards to support implementation. We have some standards in place for transit data, but very few standards activities in the area of actual support for travelers with disabilities moving through the transportation network—there was the Wayfindr standard, and also the ISO work on haptic devices that is being developed. So they recognize there is more standards work that needs to be done, and that's just one of the other issues ahead they're going to have to consider.

Bruce Eisenhart: So, I talked in this learning objective about standards.

Bruce Eisenhart: Many of these are standards that go through formal standards development processes, but one of these is not a formal standard as typically defined. And the choices were Google GTFS, APTA TCIP, CEN SIRI, or SAE's J2735.

Bruce Eisenhart: So the one that is not a formal standard is Google GTFS. It's a de facto standard—it's used by thousands of agencies—but it doesn't undergo the formal standardization process that we normally associate with standards. APTA TCIP goes through a formal standardization process, as does CEN's SIRI, as does SAE J2735. So those three are formal standards. Google GTFS is not a formal standard—it's a specification, but one that's very widely used. That is the module about ATTRI.

Bruce Eisenhart: And what are some of the things that I covered? Well, first, what is ATTRI? It's a multi-year effort to identify solutions to solve door-to-door accessible transportation issues for persons with disabilities. It looked at five technology areas—Wayfinding and Navigation Systems; ITS and Assistive Technologies; Automation and Robotics; Data Integration; and Enhanced Human Service Transportation—which, based on user inputs, were used to define four application areas that will be prototyped. Those four application areas are Pre-Trip Concierge and Visualization; Smart Wayfinding and Navigation Systems; Shared Use, Automation, and Robotics; and Safe Intersection Crossing. So that is ATTRI as it is today, and if you want to know more about it, look at the student supplement and access the U.S. DOT's ATTRI website, where you will find the most up-to-date information on it.

Bruce Eisenhart: With that, I thank you for completing this module. We welcome your feedback. Please use the feedback link below to provide us with your thoughts and comments about the value of this training. Thank you very much.