Module 36 - T317

T317: Applying Your Test Plan to NTCIP 1205 Standard

HTML of the Course Transcript

(Note: This document has been converted from the transcript to 508-compliant HTML. The formatting has been adjusted for 508 compliance, but all the original text content is included.)

Nicola Tavares: Welcome to the ITS Standards Training

Ken Leonard: ITS Standards can make your life easier. Your procurements will go more smoothly and you'll encourage competition, but only if you know how to write them into your specifications and test them. This module is one in a series that covers practical applications for acquiring and testing standards-based ITS systems.

I am Ken Leonard, director of the ITS Joint Program Office for USDOT and I want to welcome you to our newly redesigned ITS standards training program of which this module is a part. We are pleased to be working with our partner, the Institute of Transportation Engineers, to deliver this new approach to training that combines web based modules with instructor interaction to bring the latest in ITS learning to busy professionals like yourself.

This combined approach allows interested professionals to schedule training at your convenience, without the need to travel. After you complete this training, we hope that you will tell colleagues and customers about the latest ITS standards and encourage them to take advantage of the archived version of the webinars.

ITS Standards training is one of the first offerings of our updated Professional Capacity Building Program. Through the PCB program we prepare professionals to adopt proven and emerging ITS technologies that will make surface transportation safer, smarter, and greener, which improves livability for us all. You can find information on additional modules and training programs on our website, ITS PCB Home.

Please help us make even more improvements to our training modules through the evaluation process. We look forward to hearing your comments. Thank you again for participating and we hope you find this module helpful.

Nicola Tavares: Throughout the presentation this "Activity" slide will appear indicating there is a multiple choice pop quiz following this slide. The presentation lecture will pause at each quiz section to allow you to use your computer mouse to select your answer. There is only one correct answer. Selecting the submit button will record your answer and the Clear button will remove your answer if you wish to select another answer. You will receive instant feedback on your answer choice.
Please help us make even more improvements to our training modules by completing the post-course feedback form.

This module is T317: Applying Your Test Plan to NTCIP 1205 Standard
Your instructor is Joey Yang, a senior ITS project manager at HDR with 16 years of experience in the ITS and transportation field. He has been involved in the design, implementation, integration, and deployment of many ITS projects in the US. Over the years, Joey has been involved in many system development projects and NTCIP testing, including New York City's advanced solid-state traffic controller project. The next voice you will hear will be that of your instructor.

Joey Yang: Hi. Welcome everyone to this webinar today, Applying Your Test Plan to NTCIP 1205 Standards. This training module was designed for the following audience: engineering staff and operations and maintenance staff, including agency staff and CCTV system designers who have engaged in CCTV system testing in the past and have designed CCTV systems. System integrators, device manufacturers, testing contractors, installation contractors, and construction inspectors will also benefit from this training module.

Joey Yang: Prior to taking this module, we suggest you take the following training modules if you have not done so. There are three T modules-- 101, 201, and 202-- that provide the basic information on NTCIP standards testing and how to develop a test plan and test documentation. The C101 module provides a good introduction to NTCIP protocols and their use in ITS. Please note that some NTCIP standards that were developed earlier did not follow the systems engineering process. NTCIP 1205 is one of those NTCIP standards that does not follow the systems engineering process, and therefore there are no user needs or requirements defined in the standard. Participants are strongly encouraged to take the previous two CCTV modules, A317a and A317b, in order to understand the process of developing user needs and requirements prior to taking this module. There are additional technical references that are recommended prior to participating in this module. These include the systems engineering process, NTCIP 1205, the IEEE 829 software test standard, and NTCIP 8007.

Joey Yang: There are two possible paths a user can follow in implementing standards-based ITS systems. The path to be taken depends on whether or not the ITS standards were developed in accordance with the SEP. As stated earlier, the CCTV standard was developed without the SEP. This slide shows the non-systems engineering process-based modules that you should take in the suggested sequence. The three sequential modules for the CCTV system are A317a, A317b, and T317. If you take all modules in the above sequence, you will be able to understand the CCTV system deployment process and what is required to develop user needs, system requirements, test plans, and test documentation.

Joey Yang: The T300 modules go one step further and describe the testing process for each specific NTCIP standard. A few T300 modules have been developed and more are on their way. This module, T317, highlighted on this slide, is one of the T300 modules.

Joey Yang: There are five learning objectives for this training module. At the beginning, this module will help the participants understand what role testing plays in the system lifecycle and the testing to be undertaken. This module will also review the purpose, structure, and contents of well-written test plans and the associated test documentation required for NTCIP testing. There are differences between test plans and test documentation, which often cause confusion, and we'll go through the details of those differences. Test documentation includes NTCIP test procedures as included in some of the NTCIP standards, while test plans are generally management-level documents for test planning purposes and are not included in NTCIP. We will talk about this more in learning objectives two and three when we discuss test plans and test documents. Then this module will demonstrate the process of developing test documents using a sample requirements test case traceability table. At the end of this module, we will spend a little bit of time describing how to address the consequences of different test conditions and what test tools are typically used for NTCIP testing.

Joey Yang: As part of the first learning objective, we are going to discuss the purpose of testing and briefly review the concept of the system lifecycle and the testing to be undertaken. Then we will study the verification methods that are used for testing. At the end of this learning objective, we'll discuss the relationship between the test process and the system lifecycle.

Joey Yang: How do we know a CCTV system will work as intended? According to the IEEE 829 standard, testing provides objective evidence that the system and its associated products satisfy the allocated system requirements, solve the right problem, and satisfy the intended use and user needs. So for a CCTV system, the testing needs to accomplish two things. The first is to verify the CCTV system meets the agency's specification requirements. Was the system built right? And the second is to validate the system against the user needs and solve the right problem. Did you build the right system? Testing is the process used to answer these questions.

Joey Yang: Let's review together the concept of the system lifecycle. The system lifecycle consists of multiple lifecycle processes that can be divided into two major processes. On the left side of the V model is the decomposition and definition process, and on the right-hand side is the integration and recomposition process. The decomposition and definition process, often called the top-down process, consists of the lifecycle processes from regional architectures, the concept of operations, system requirements, and system design to field deployment. The integration and recomposition process, also called the bottom-up process, consists of the steps from field deployment, unit testing, system verification and validation, to operations and maintenance, change management, and all the way to system retirement. You may have noticed in this chart that user needs and requirements are normally developed during the decomposition and definition process, as shown on the left side of the V chart, and testing takes place during the integration and recomposition process, as shown on the right side of the V chart. Let's take a look at each individual step in detail regarding the testing. Unit testing is to test a standalone device, while subsystem verification is to test a system interface and its immediate environment, typically in a laboratory or central environment. System verification and deployment is to test the entire system interface, including the TMC software. And last, the system validation process is to ensure the system satisfies all user needs that have been defined in the ConOps, the system requirements, and so on. The previous testing modules, T101 and T201, have described the system lifecycle in detail. You may refer to these two modules for additional information.

Joey Yang: Let's take a look at the system lifecycle from a different perspective. On the left side of the V model, we have user needs and system requirements, defined in the early stages of the system lifecycle. On the right side, we have system verification and system validation, performed in the later stages of the system lifecycle. As mentioned in earlier slides, testing is to ensure a system satisfies a set of predefined user needs and requirements. In order to determine if the user needs and requirements are fulfilled, the system under test needs to be verified against the system requirements, and the user needs are validated through the system validation process. So, how do we do that? Let's discuss traceability in the testing process. Traceability is a tool to help determine whether the system correctly meets its needs, as well as other attributes such as completeness, accuracy, consistency, and testability. Traceability needs to be achieved at many levels throughout the system lifecycle. In short, a system that's been acquired or developed must be testable or verifiable. That's why traceability plays such a big role in testing. We will discuss more on traceability later in this module. It is worth noting that the system lifecycle identifies the important steps for system development and implementation, including testing. However, it does not clearly address during which process the test plan and test documents should be prepared. I'd like you to think about this for a moment, then we'll get back to this topic shortly.

Joey Yang: In the previous slide, we discussed that the system needs to be verified against the requirements and the user needs. What's the relationship between verification and testing? According to IEEE 829, verification methods include inspection, demonstration, analysis, and testing. Testing is the most important verification method. Testing verifies that requirements are met under controlled exercises using real or simulated stimuli. For the NTCIP standards testing described in this module, we will focus on the testing method.

Joey Yang: Now, let's discuss briefly the testing process. The testing process consists of multiple test activities and tasks, and it is a very complex process. IEEE 829 places significant emphasis on the testing process and how it correlates to the system lifecycle. Testing does not begin only at the end of system development. Instead, it should be carried out throughout the entire system development cycle, from concept exploration all the way to the end of the system's life. The V model explains the verification part of the testing process. However, the V model does not provide a complete view of how the testing process is related to the system lifecycle. In this module, we will be discussing the process of how to plan the test, how to prepare the test documentation, and how to execute the test.

Joey Yang: The testing process consists of multiple test activities and tasks. We can group the test activities into three major steps: test planning, test documentation preparation, and test execution and reporting. The first step, test planning, is to write test plans. Test documentation preparation, step two, is to perform test designs and develop test documents, including test cases and test procedures. Lastly, test execution is to perform the actual test and report test results after the tests are completed.

Joey Yang: Let's take a look at how these three steps are related to the system lifecycle. Test planning occurs at the beginning of the system lifecycle, when the user needs and requirements are being developed. Writing test plans usually takes place in this stage. Test documentation preparation, such as performing test design and developing test cases and test procedures, occurs during the system design phase. As we talked about earlier, there's a difference between test plans and test documentation. Test plans are developed at the beginning of the testing process for test planning purposes and typically describe the technical and management approach to be followed for testing a system. Test documentation, including test cases and test procedures, is prepared while test design is being performed, and it typically occurs after the associated test plan is developed. If an ITS device is acquired, the test plan and test documentation should be developed prior to the delivery of the device. The test plans and test documents can then be used for acceptance testing at the delivery of the device and after the field installation. Test execution and reporting is conducted at the completion of system development, or after the ITS device is delivered. The testing takes place at multiple levels, as discussed earlier. In addition to the unit testing, it also includes the testing performed at the subsystem and system levels, and during operations, maintenance, and system upgrades. During this process, revisions to the test plan and the test documentation prepared in earlier stages may be required. Please also note that test activities are most effective when conducted in parallel with the system development process, not just at the completion of development. This is often overlooked by agencies. Test plans and test documentation are oftentimes not prepared until system development is complete. So later in this module, we'll continue elaborating on the testing process, and we'll have focused discussions on how to develop test plans and prepare test documentation for testing a CCTV system.

Joey Yang: Now, we'll start with our first question for learning objective one.

Joey Yang: Which of the following statements is not correct? So we have four possible choices. A: Requirements can be verified by inspection, demonstration, analysis, and testing of the system products. B: The testing process provides an objective assessment of system products throughout the system lifecycle. C: Test documentation needs to be prepared only at the completion of system development. D: Development of test plans can begin as soon as the system ConOps is being developed. Let's review the answers together.

Joey Yang: So, answer A is incorrect. The statement is true. Requirements can certainly be verified by inspection, demonstration, analysis, and testing of the system products. Answer B is also incorrect because the statement is also true. The testing process provides an objective assessment of the system products throughout the system lifecycle.

Joey Yang: Answer C is the correct answer. This statement is not correct. The test documentation is typically prepared at the system design stage, not after the system development is complete. Answer D is incorrect. This statement is true. Development of test plans can begin as soon as the system ConOps is being developed. It is worth noting here that the development of test plans may begin early in the system lifecycle, but they cannot be finalized until the requirements are fully developed. As we discussed earlier, the test plans oftentimes need to be revisited even after the system is completely built, so every time you upgrade the system you'll want to revisit the test plans and the system requirements, and a retest may be needed.

Joey Yang: So this will conclude our first learning objective. So as a summary, we discussed the purpose of testing a CCTV system. We reviewed the concept of system lifecycle and testing to be undertaken. We reviewed four verification methods. We also discussed the testing process in relation to the system lifecycle.

Joey Yang: Now, we move on to the second learning objective. We will focus on how to develop test plans in our second learning objective. We will first discuss the purpose of test plans and then review the test plans defined in the IEEE 829 standard. The 2008 version of IEEE 829 defines two levels of test plans: the Master Test Plan and Level Test Plans. We will introduce their concepts, go over the differences between them, and discuss when and how to use them. At the end of this learning objective, we will work together on a case study to discuss what should be included in a test plan in order to test a CCTV system.

Joey Yang: So let's discuss the purpose of test plans. Test plans are generally used as the overall test planning and management documents, which identify test activities and the associated efforts. They also set objectives for each test activity and identify the risks, resources, and schedule for each test activity. The test plans are also used to determine the requirements for test documentation: what test documents will be required, such as test designs, test procedures, test logs and reports, etcetera.

Joey Yang: So, what is a test plan? A test plan is a management-level document. It defines the scope, technical and management approach, resources, and schedule of the intended test activities that will take place in the testing process. It identifies test items, features to be tested, testing tasks, who will perform each task, and when. It also identifies any risks that will require contingency planning. IEEE 829 introduces the concepts of the Master Test Plan and the Level Test Plan. We will discuss these two concepts in the next slides. We need to note here that test plans are not covered by any NTCIP standard. NTCIP simply references the IEEE 829 standard for developing test plans, while NTCIP describes in detail how to prepare the test documentation specifically for the NTCIP environment, which results in customization of the principles defined in IEEE 829. Although NTCIP takes a similar scope to IEEE 829, the NTCIP standards define the test documentation in detail in NTCIP 8007 and other device standards, such as 1203 for DMS and 1204 for ESS. However, NTCIP 1205 for CCTV camera control does not include the test documentation in the currently available version.

Joey Yang: Let's take a look at how the test plans can be constructed for a project. As shown at the top of this diagram, a master test plan is an overall test planning document that consists of three test levels, including the unit test plan, subsystem integration test plan, and system acceptance test plan. For each test level, a separate test plan is generally required because each level requires different resources, different methods of testing, and different testing environments. We also need to note here that having a master test plan is not a requirement of IEEE 829. It's not necessary that every project have a master test plan. The agency that manages the testing will decide if a master test plan is necessary based on the test activities required for the testing. However, IEEE 829 recommends the test process be documented in the highest-level test plan if no master test plan is developed.

Joey Yang: This slide shows an example of an ITS master test plan and multiple level test plans that include subcomponents of the ITS system. The overall ITS includes multiple subsystems, such as CCTV, dynamic message signs, and transportation sensor systems. The level testing will include unit testing and subsystem testing for each system component and subsystem. It will also include the system acceptance testing for the overall system with all the system components included.

Joey Yang: Here is another example of CCTV testing that includes a master test plan and multiple level test plans for a CCTV system. In this case, only the CCTV system is tested at different levels, such as factory test, field standalone test, system integration test, system operation test, and system acceptance test. The factory test and the field standalone test are testing at the unit level and are limited to the system interface, which is tested in a simulated environment. System integration testing is to test the entire CCTV system in a real environment, including the actual field conditions and central system. A system operational test may also be needed to test the CCTV system over an extended period, such as 20 or 30 days, prior to final system acceptance testing. This is very commonly done by the agencies. The system acceptance test focuses on fitness for use and validates that the system is implemented per the user needs. Testing for NTCIP 1205 conformance should be included in the testing levels as determined by the agency during test planning. Typically, at a minimum, the NTCIP testing is included in the factory test, the field standalone or bench testing prior to installation, and the system integration testing as well. For system operations and acceptance testing, it is at the agency's discretion whether conducting full NTCIP testing is necessary.

Joey Yang: This slide shows an outline of a master test plan in accordance with IEEE 829. A master test plan normally consists of three sections: an introduction to the overall test program, details of the test processes, and general information at the very end. I will not go over the details here; you may refer to the IEEE 829 standard for details. What I'd like to point out here is the test documentation requirement. If a master test plan is developed, it needs to list the requirements for test documentation. For NTCIP 1205 testing, the required test documents may include the requirements test case traceability matrix, test cases, and test procedures. So if a master test plan is required, then those test document requirements need to be documented in the master test plan.

Joey Yang: This slide shows an outline of a level test plan. A level test plan may include four sections: an introduction to the test, details of the test specific to this level of testing, test management, and general information. In addition to the test scope, a level test plan should include the test level in the overall sequence. A level test plan should also include test classes, which are designated groups of test cases, and the test conditions for each test class. The test conditions may include positive and negative testing and what boundary values are to be used during boundary testing. The main section of a level test plan should include the test items and the features to be tested or not to be tested in this test level. The approach for testing, pass/fail and suspension criteria, and test deliverables should be documented in the level test plan as well. A test traceability matrix may also be included in the level test plan. For testing NTCIP 1205, this can be the requirements test case traceability matrix that is used to further develop test cases and test procedures during the test design.

Joey Yang: In comparison to the master test plan, the level test plan also introduces more details on the test management and identifies planned activities, tasks, resources, schedules, cost, responsibilities and risks, and so on.

Joey Yang: Normally, QC/QA procedures, metrics for specific measures and percentage of the requirements tested are also included in a test plan for the particular level. For additional information on the test plans in IEEE 829 standards, you may refer to the student supplement.

Joey Yang: Let's work on a case study together on how to put a level test plan together.

Joey Yang: Let's continue with the example we discussed earlier and walk through the process of how to develop a CCTV unit test plan. In this case study, we only selected a few main items listed in the outline of the level test plan. You may refer to the student supplement for more details. These main items include the test level in the overall sequence, test classes and overall test conditions, test items, the test traceability matrix, and features to be tested and not to be tested. Let's start with the test level, as shown in the previous example. CCTV unit testing is only one small component among the overall ITS test plans. The level of testing and its order in the overall test sequence will need to be defined clearly in the CCTV unit test plan. This is best supported by a diagram. Test classes and overall test conditions should include the unique nature of this particular level of testing, such as pan-tilt-zoom functions, presets, focus, iris settings, alarms, and zones. Detailed descriptions should be provided on how these features will be tested in groups and their associated test conditions for each group. The test conditions include positive testing, negative testing, and boundary testing.

Joey Yang: The test items are the objects of testing-- for example, the CCTV camera model, make, firmware version, user manual, etcetera. Also, identify any procedure for their transfer from other environments to the test environment-- for example, from the factory to the test lab, or to the agency facility if the test will be performed by the agency. The test traceability matrix is a list of requirements and the corresponding test cases and procedures. A detailed test traceability matrix is normally developed during test design, but a high-level traceability matrix can be developed in the level test plan.

Joey Yang: The previous CCTV training module, A317b, demonstrated the process of how to develop project-specific requirements for CCTV camera control. During this process, a requirements traceability matrix, also called an RTM, is developed to include the project requirements and the associated NTCIP objects and dialogs. The CCTV features to be tested will consist of the NTCIP objects that have been identified in the RTM. However, not all of these objects will need to be tested during unit testing. For example, TMC remote control features may not be applicable for unit-level testing. Therefore, the remote control features may be included in another level test plan, such as the CCTV subsystem test plan. More on the CCTV test features will be discussed in the next slide. The test approach needs to describe the overall approach for the level of testing. The features to be tested, features not to be tested, and approaches are commonly combined in a table called a test matrix. The test matrix can be combined with the test traceability matrix, as mentioned earlier, to simplify the effort of developing the test plan. Some examples of possible test measures also need to be documented, such as black box, white box, analysis, and inspection measures. For the purpose of NTCIP testing, the black box test method is normally used.

Joey Yang: The following three slides show example RTMs from the previous A317b training module. Let's look at the first RTM, related to CCTV configuration. The left side of the RTM shows some of the project requirements for configuring a CCTV camera, and the right side of the table shows the NTCIP 1205 dialogs and objects that are associated with each requirement. The features to be tested for CCTV unit testing will be the NTCIP objects shown on this RTM. These objects will need to be included in the unit test plan, typically in the test matrix mentioned earlier.

Joey Yang: Here is one more example of an RTM for CCTV camera control. The project requirements are listed, again, on the left side, and the NTCIP objects are listed on the right side. These objects need to be identified and included in the test matrix as well.

Joey Yang: Here it shows the requirements and NTCIP objects for CCTV monitoring. We need to note here that these are only examples and do not include all NTCIP objects that are required for the project. All required objects included in the project RTM will need to be included in the test plans. What's shown here are only examples of some of the objects.

Joey Yang: So now let's review the question together for this learning objective.

Joey Yang: Which of the following is included in a Level Test Plan but not in a Master Test Plan? A: Test scope. B: Test processes. C: Test resources and responsibilities. D: Test traceability matrix. Let's review the answers together.

Joey Yang: Answer D is the correct answer. Test traceability matrix is only included in the level test plan but not in the master test plan. Answer A is not correct. Test scope is actually included in both LTP and MTP. Answer B is not correct either. Test processes are only included in the master test plan. Answer C is not correct. Test resources and responsibilities are actually included in both the level test plan and the master test plan.

Joey Yang: Here we finish the second learning objective. So, as a summary, we discussed the definition of a test plan in accordance with IEEE 829. We also identified the differences between the master test plan and the level test plan, and when and how to use them. And we also discussed the structure and contents of the master test plan and the level test plan in detail.

Joey Yang: So now let's move on to the next learning objective. We have finished the discussion on test plans. Now we will review the test documents: what test documents are normally required, what contents are included in these test documents, and how these test documents are developed. We will first review the test documentation defined in the IEEE 829 standards. Then we'll help you to understand the difference between test plan and the test documentation. At the end, we will discuss the process of designing a test and the relationship between test plans, test cases, and test procedures.

Joey Yang: According to IEEE 829, the requirements for test reporting should be included in a master test plan for the entire test program if a master test plan is developed, while a detailed list of test deliverables should be included in each level test plan. The list of test deliverables is shown on the slide. It ranges from test plans, test designs, test cases, and procedures all the way to test logs, test reports, and so on. Just to point out, a test should not begin until all the test documents are prepared. The test plans, test design, test cases, and test procedures should be well developed and approved for use prior to the test. The rest of the test documents serve the purpose of test reporting during and after the test.

Joey Yang: This slide shows the test documentation required prior to conducting the test. In this diagram, three levels of tests are used. They are the unit test, subsystem integration test, and system acceptance test. This diagram only expands the unit test plan to include the test documents associated with the unit testing. The successive documents also apply to all levels of test; just for simplicity, we're not showing that on this slide. The main step is to develop the test design, which is a document that specifies the details of the test approach for the features to be tested and identifies the associated tests, commonly by organizing the tests into groups. Test cases and test procedures are generally developed during the test design process.

Joey Yang: This slide shows the test documentation required during and after test execution. This diagram still uses the three levels of test from the last slide. Predecessor documents such as the unit interim test status report, unit test logs, and anomaly reports also apply to all levels of testing. In this case, the other levels of test are the subsystem integration test report and the system acceptance test report. Depending on the test duration, an interim test status report may not be required, and it is at the test agency's discretion whether this interim report is applicable or not. The anomaly reports are often used as a handy tool for the test agency to provide active feedback to the device manufacturer on software problems encountered during the test. The master test report will be the test report for the entire test program, typically when a master test plan is developed. Depending on the needs and the purpose of the testing, perhaps one report is enough. This diagram is for illustration purposes, and it is not necessary for the test agencies to provide all the reports shown here. It's totally up to the test agencies what test documentation, especially test reports, needs to be furnished at the end of the testing.

Joey Yang: Let's discuss the difference between test plans and test documentation. As shown in the last two slides, test plans are management-level documents and specify the test documents required for testing. Test plans should be developed prior to developing test documents. Test documents include all information required for preparing test activities and document all results for each test activity. This includes test cases, test procedures, test reports, etcetera; test input and output data; and test tools are sometimes also included in the test documentation.

Joey Yang: IEEE 829 defines a test design as a test document that specifies the details of the test approach and identifies the features to be tested by the design, as well as the associated tests, including test cases and test procedures. Note that the requirements test case traceability matrix needs to be further developed in this process. The purpose of this matrix is to trace test cases to the project requirements. As we discussed for the level test plans, a test traceability matrix is typically included in the level test plan, but it is only a high-level matrix and should be further developed during test design to ensure all requirements are included in the test cases.

Joey Yang: This slide shows the format of the requirements test case traceability matrix that is defined in NTCIP 8007 and is commonly used in the NTCIP standards. The left side of the matrix should list all the project requirements, and each requirement can be traced to one or more test cases, which are listed on the right side of this matrix. Each requirement and test case is given a unique ID number. The requirements can be obtained from the project specifications or a product requirements list. This slide shows some of the data exchange requirements in an example PRL used in the A317b training module. For CCTV testing, the NTCIP 1205 standard does not include a PRL because NTCIP 1205 was developed without going through the systems engineering process. The requirements list will have to be prepared by the agency when the CCTV specifications are developed. The previous training module, A317b, describes the process of how to develop requirements for CCTV camera control; you may want to review that training module if you are not familiar with the requirement development process. Determining the number of test cases, the features to be included in each test case, test measures, and pass/fail criteria should all be part of the test design.

Joey Yang: This slide shows the relationship between the test plan, test design, test cases, and test procedures. We will continue to use the unit test as the example here. Each level test plan directly relates to the following test documents: one test design document and one or more test cases, which in turn correlate to one or more test procedures.

Joey Yang: So as a summary, there is only one test design for each test plan. One test design may be associated with multiple test cases. Each test case relates to one or more sets of test procedures. Each set of test procedures may also be linked to multiple test cases. For simple devices such as CCTV cameras, NTCIP combines the test case and test procedure into one document. For complex devices, a better option may be to separate test cases and test procedures, as suggested by IEEE 829. Whether they are separate or combined, the test plan will need to clearly identify the test documentation requirements based on the complexity of the test.
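
To make the one-to-many relationships just described concrete, here is a minimal sketch (not part of the module materials) of how the test documents could be represented in Python; all class, field, and example names are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TestProcedure:
    proc_id: str                 # e.g., "P3.01" (hypothetical numbering)
    steps: List[str] = field(default_factory=list)

@dataclass
class TestCase:
    case_id: str                 # e.g., "C3.01"
    title: str
    procedures: List[TestProcedure] = field(default_factory=list)  # one or more procedures

@dataclass
class TestDesign:
    features_to_test: List[str]
    test_cases: List[TestCase] = field(default_factory=list)       # one or more test cases

@dataclass
class LevelTestPlan:
    level: str                   # e.g., "CCTV unit test"
    design: TestDesign           # exactly one test design per level test plan

# Example: a unit test plan whose single design holds two illustrative test cases.
unit_plan = LevelTestPlan(
    level="CCTV unit test",
    design=TestDesign(
        features_to_test=["preset position", "zoom"],
        test_cases=[TestCase("C3.01", "Preset Position"),
                    TestCase("C3.05", "Zoom Operation")],
    ),
)
```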

Joey Yang: Now let's go over one question for this learning objective.

Joey Yang: Which of the following is part of test documentation? A: Test data. B: Test plans. C: Requirement test case traceability matrix. D: All of the above.

Joey Yang: Let's review the answers together. D is the correct answer; all three of the items above are part of the test documentation. Answer A alone is not the best choice: test data is indeed included in the test deliverables, so it is part of the test documentation. Answer B, test plans, are also included in the test deliverables, so they are part of the test documentation. Answer C, the requirements test case traceability matrix, is included in both the level test plan and the test design, so it is definitely part of the test documentation. Since all three items are part of the test documentation, D is the correct answer.

Joey Yang: So as a summary, as we finish learning objective three: we reviewed test documentation. We discussed the difference between test plans and test documentation. And we also reviewed the test design and the relationships between the test plan, test design, test cases, and test procedures.

Joey Yang: Since we have discussed the process of how to develop the test plan and test documents, let's talk about how to apply that to a CCTV system based on NTCIP 1205. As part of this learning objective, we will help you understand the basics of a CCTV system and its test environment. We will work together to identify key elements of the NTCIP 1205 standard that are relevant to the testing. And finally, we will develop sample test documents, including the test design, test cases, and test procedures.

Joey Yang: Let's look at the CCTV field hardware. It typically consists of the following: the camera and its enclosure; the lens assembly that performs the focus and iris functions; the pan/tilt assembly; and the camera control receiver, which could be a separate unit or integrated with the camera. Then there is the equipment cabinet that houses the communication device, which could be an Ethernet switch or a serial server, and other accessories such as the camera power supply, wiper, heater, washer, blower, environmental sensors, etcetera. Note here that NTCIP 1205 only applies to camera control and does not include the video display function or video format. The standard only focuses on data communications and the interface.

Joey Yang: Now, let's review the CCTV camera test environment using the unit test as an example. For NTCIP testing, the CCTV camera under test is required to be connected to test software, which is typically installed on a management station. A data analyzer may be used to capture the data exchange between the camera and the management station. Simulated inputs may be used if the camera itself is not connected to an equipment cabinet-- for example, the cabinet door alarm. Whether a cabinet is required or not for testing depends on the product requirements and the test plan developed by the agency. The video output will need to be monitored on a video monitor so that loss of video image can be verified as well. The communication network can be serial or Ethernet depending on the requirements of the project.

Joey Yang: Once we have the test environment established, let's take a look at the key elements of NTCIP 1205 that are relevant to the testing. NTCIP 1205 includes two main sections: object definitions and conformance groups. The management information base consists of objects that are used for controlling and managing the CCTV features, including, for example, range, timeout, preset, position, system features, alarms, inputs, outputs, and so on. Four conformance groups are defined in NTCIP 1205: CCTV configuration, extended functions, motion control, and on-screen menu control. Depending on the type of camera required by the project, certain objects and conformance groups may not be applicable for the testing. For example, if the CCTV is a fixed camera, then motion control is not applicable. If the camera is connected to an Ethernet switch directly, without a control receiver housed in a cabinet, then the camera alarm objects are not applicable for the testing. The features to be included in the test plans and the test documents will need to be specified, and the requirements test case traceability matrix should be developed accordingly such that these features are traceable between the requirements and the associated test cases.
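
As an aside, the applicability decision described above can be captured in a short script during test planning. This is only a sketch: the conformance group names follow the slide, but the applicability rule coded here is an assumption for illustration, not taken from NTCIP 1205.

```python
CONFORMANCE_GROUPS = [
    "CCTV configuration",
    "Extended functions",
    "Motion control",
    "On-screen menu control",
]

def applicable_groups(fixed_camera: bool) -> list:
    """Return the conformance groups to test, dropping motion control for a fixed camera."""
    groups = list(CONFORMANCE_GROUPS)
    if fixed_camera:
        groups.remove("Motion control")   # per the example above: no PTZ on a fixed camera
    return groups

print(applicable_groups(fixed_camera=True))
# ['CCTV configuration', 'Extended functions', 'On-screen menu control']
```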

Joey Yang: If we look at NTCIP 1205, we can see that NTCIP 1205 does not include any information related to testing. So you may ask what is not included in NTCIP 1205 but is required for developing test documents. This list includes the items that are not included in NTCIP 1205 but will be required for developing the test documents. As discussed earlier, the testing is to verify that the system satisfies the user needs and meets the system requirements. User needs and requirements will need to be identified prior to the testing. NTCIP dialogs and the data exchange sequences are not included in NTCIP 1205, but the generic dialogs included in other NTCIP standards can be used for testing NTCIP 1205. The protocol requirements list, PRL, is a list of requirements that are traceable to the user needs. The requirements traceability matrix, RTM, is a matrix that specifies the traceability between requirements and NTCIP dialogs and objects. The two previous CCTV training modules, A317a and A317b, describe the approach on how to develop user needs, requirements, the PRL, and the RTM for NTCIP 1205, and how to incorporate them into the agency's specifications. You may want to refer to these modules for details on how to develop those documents. This module will only focus on developing the requirements test case traceability matrix, RTCTM, test cases, and test procedures, which are essential for NTCIP testing. Please note the difference between an RTM and an RTCTM: the RTM specifies the traceability between requirements and NTCIP dialogs and objects, while the RTCTM specifies the traceability between requirements and test cases.

Joey Yang: So, how do we develop a requirements test case traceability matrix? As discussed earlier, this matrix is developed as part of the test design. The main purpose of the test design is to identify the features to be tested by a particular test-- for example, the unit test, as we discussed earlier. The features to be tested are included in the RTCTM, and the RTCTM needs to be developed based on the RTM developed for the project. We will take the CCTV unit test as an example and walk through the process of how to develop an RTCTM based on a requirements traceability matrix for the project.

Joey Yang: Let's take a look at an example RTM from the A317b module and discuss the process of how to develop a requirements test case traceability matrix. The RTM table shows how to trace project requirements to NTCIP dialogs and objects. This example is the same one we discussed earlier when developing the CCTV unit test plans. This table shows the requirements and the related objects for camera control, which include preset go-to position, go to stored position, zoom operation, and the camera pan function. We will further develop the test documents based upon this RTM.

Joey Yang: Now let's walk through the process of how to create an RTCTM for camera control. The RTCTM includes two parts. The left side of the matrix lists all the requirements to be tested as included in the RTM, and the right side of the matrix lists the test cases that are determined as part of the test design. Each test case is assigned a unique ID. The requirement description and its ID must match the description and ID in the RTM. In this example, one test case, C3.01, is determined for testing two requirements: preset go-to position and move camera to a stored position. So in this case, we can create one test case to test two requirements. And if we look at the bottom of the table, four test cases, C3.05 through C3.08, are designed to test the zoom requirement. So in this case, we need to develop four test cases to test only one product requirement.

Joey Yang: Here is a continuation of the table from the last slide. Four test cases are designed to test the pan function of the camera. Basically, the test design process determines how many test cases will be needed to thoroughly test one requirement. It could be one test case or it could be multiple test cases. The goal is to test every requirement that is required for this level of testing as defined in the unit test plan.
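
Here is a sketch of how the RTCTM rows above might be captured for checking during test design. The requirement wording and the pan test case IDs (C3.09 through C3.12) are assumed for illustration, while C3.01 and C3.05 through C3.08 follow the slides.

```python
# Requirements (left side of the RTCTM) mapped to test case IDs (right side).
rtctm = {
    "Preset go-to position":            ["C3.01"],                              # one case covers
    "Move camera to a stored position": ["C3.01"],                              # two requirements
    "Zoom operation":                   ["C3.05", "C3.06", "C3.07", "C3.08"],   # four cases, one requirement
    "Camera pan function":              ["C3.09", "C3.10", "C3.11", "C3.12"],   # IDs assumed for illustration
}

# A quick completeness check: every requirement must trace to at least one test case.
untraced = [req for req, cases in rtctm.items() if not cases]
assert not untraced, f"Requirements without test cases: {untraced}"
```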

Joey Yang: So, what is included in a test case? Let's continue using test case C3.01 as an example. The test case needs to include its unique ID and its own title-- in this case, for example, Preset Position-- and it needs to include the test case description, input and output variables, and also the pass/fail criteria.

Joey Yang: This slide shows what the test case looks like in the NTCIP format as defined in NTCIP 8007. Note that the variables and their input values are shown in the test case document. In this example, the input values come either from the project requirements or from the test plans that were developed early on in the testing process. The pass/fail criteria are also identified here. In this example, every verification step included within the test case needs to pass in order to pass the test case.
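
For illustration only, the fields of test case C3.01 listed above could be recorded as a simple structure like the one below; the variable names and values are placeholders rather than actual NTCIP 1205 objects or project values.

```python
test_case_c3_01 = {
    "id": "C3.01",
    "title": "Preset Position",
    "description": "Store two preset positions and verify the camera moves to each stored position.",
    "variables": {"preset_1": 1, "preset_2": 2},   # placeholder input values from the test plan
    "pass_fail_criteria": "Every verification step within the test case must pass.",
}
```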

Joey Yang: After the test case is developed, the next step is to develop the test procedure. Let's continue to use C3.01 as an example. The test procedure includes every step and the associated detailed instructions to execute the test case. Input and output values and special requirements are also included in each step. The result of each step will need to be verified against the predefined value or behavior. Pass or fail needs to be documented after each step. Note that NTCIP typically combines the test case and test procedure into one single document, so the ID number of the test case may be the same as that of the test procedure. NTCIP 8007 defines a set of keywords that can be used in the test procedure. The keywords will be discussed as we walk through the example in the next few slides.

Joey Yang: So, let's walk through the process of how to develop a test procedure in accordance with NTCIP 8007. First, let's go over some of the keywords used in this example. All the keywords are highlighted in red on this slide. Typically, Config and Setup are used at the beginning of the test procedures. Exit is used when terminating the test case is necessary. Get is used to retrieve a value from an object, and Set is used to set a value to an object. Verify is used to verify a response value against a predefined value, normally defined in a test plan or project requirement. You may want to refer to NTCIP 8007 for the complete list of keywords; the ones mentioned above are the keywords used in this example. Again, in this example, we will verify whether the test camera can go to a preset position, store a preset position, and move to the stored position. This example is for testing two or more presets, so when only one preset is used, the test procedure will need to be modified accordingly. Let's go over it step by step. First, in the opening steps, we will verify the preset function is supported by the camera. If it is, then we will verify whether the camera supports more than two preset positions. If the conditions are met, then we will continue with the actual test. If not, the test case is terminated, or suspended.

Joey Yang: Here is a continuation from the previous slide. Step 5 continues configuring the test after verifying that the camera supports more than two preset positions. Step 6 asks the camera to move to a predefined pan-and-tilt position. Step 7 verifies that the camera actually moved to position one. Step 8 asks the camera to store the position as preset one.

Joey Yang: Steps 9 through 11 store another position as preset two. Step 12 asks the camera to move to preset one. Step 13 verifies that the camera moved from position two to position one. Step 14 retrieves the preset position value. And step 15 verifies that the preset position the camera reported is actually preset position one. Step 16 asks the camera to move to preset two.

Joey Yang: Step 17 verifies that the camera actually moved to preset two. In step 18, the value of the preset position is retrieved from the camera. Then the last step, step 19, verifies that the preset position the camera reported is actually preset two. The bottom portion of the test procedure is used to document the overall test results and any notes from test observations. If all steps are completed and passed, then the test case is passed. As you can see here, developing test cases and test procedures is very time-consuming. It requires a lot of effort and resources to perform the test and go through every test case and every step in each test procedure. You may want to make sure that adequate time and resources are allocated for testing.
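
The 19-step procedure above could, in principle, be automated by a scripting-capable test tool. The sketch below condenses it into Python under stated assumptions: get_object(), set_object(), and verify() are hypothetical wrappers around the tool's SNMP interface, and the object names and position values are placeholders, not actual NTCIP 1205 identifiers.

```python
def run_preset_position_test(get_object, set_object, verify):
    """Condensed sketch of the preset position test case/procedure (C3.01)."""
    # Opening steps: preconditions -- presets supported, and at least two preset positions.
    max_presets = get_object("presetMaxPositions")          # placeholder object name
    if max_presets is None or max_presets < 2:
        return "SUSPENDED"                                   # Exit: terminate or suspend the case

    # Move to a pan/tilt position and store it, for presets one and two.
    for preset_id, (pan, tilt) in ((1, (100, 50)), (2, (200, 80))):  # placeholder positions
        set_object("positionPan", pan)
        set_object("positionTilt", tilt)
        verify(f"camera reached the commanded pan/tilt position for preset {preset_id}")
        set_object("presetStorePosition", preset_id)

    # Recall each preset and verify the reported preset position.
    for preset_id in (1, 2):
        set_object("presetGotoPosition", preset_id)
        verify(f"camera moved to preset {preset_id}")
        if get_object("presetGotoPosition") != preset_id:
            return "FAILED"

    return "PASSED"
```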

Joey Yang: Let's review a question together for this learning objective.

Joey Yang: Which is a test document included in NTCIP 1205? A: Protocol requirements list. B: Requirements traceability matrix. C: Requirements test case traceability matrix. D: None of the above.

Joey Yang: Let's review the answers together. D is the correct answer: none of the above. Answer A, the protocol requirements list, is not a test document and is not included in NTCIP 1205. Answer B, the requirements traceability matrix, is not included in NTCIP 1205 and is not a test document either. Answer C, the requirements test case traceability matrix, is a test document but is not included in NTCIP 1205.

Joey Yang: So, as a summary of learning objective four, we reviewed the basics of a CCTV system and its test environment. We identified key elements of the NTCIP 1205 standard that are relevant to the testing. We also developed sample test documents, including the test design, test cases, and test procedures.

Joey Yang: Let's move on to our last learning objective for this webinar. We'll describe test tools and test conditions for NTCIP 1205. We will first review the test tools and the equipment that are normally used for NTCIP testing, and then we will discuss how to address the consequences of positive testing, negative testing, and boundary testing. And at last, we will discuss the complexity of NTCIP testing.

Joey Yang: So first, let's review the test tools and equipment that are normally used for NTCIP testing. We'll first take a look at the test environment, using the CCTV unit testing as an example. Then we'll discuss the minimum requirements for test tools and equipment. And at the end, we'll review different types of NTCIP test tools.

Joey Yang: An NTCIP test environment typically consists of one or more devices under test. The devices under test could be controllers, cameras, dynamic message signs, etcetera. Certified test software is installed on the management station prior to conducting the testing-- please note the keywords there: the test software should be certified and preapproved. A data analyzer is often required in addition to the test software. The data analyzer is used to capture the data exchange between the device under test and the management station, normally for in-depth analysis when anomalies occur. Most of the time the communication network can be a simple test network based on the project requirements, either Ethernet, serial, or both; sometimes a wireless network is required if the device is required to operate over a wireless network. Note that the test environment should be designed to minimize any complicating factors that may result in anomalies unrelated to the test case. Failure to isolate such variables in the test environment may lead to false test results. For example, the device may be conformant with the standard, but communication delays could result in timeouts and be misinterpreted as failures.

Joey Yang: Let's use the CCTV unit test as an example and look at the same diagram we saw on an earlier slide. The CCTV camera is the device under test and is required to be connected to the test software installed on a management station. A data analyzer may be used to capture the data exchange between the camera and the management station. Simulated inputs, such as the cabinet door alarm, may be used if the camera itself is not connected to an equipment cabinet. The video output will need to be monitored on a video monitor so that loss of video image can be verified.

Joey Yang: So what are the minimum requirements for test tools? Let's go over some basic requirements. The test tools must be capable of performing tests for conformance to specific NTCIP information-level standards, such as 1205. The test tools must support the communication protocol used for testing, such as SNMP for CCTV cameras. They must support scripting features for automated testing, and they also need to support multiple protocols, such as point-to-point, point-to-multipoint, or TCP/IP, etcetera. They also need to support a wide variety of media, including Ethernet, serial, and others. Please note here that supporting automated testing is a must-have feature of test tools. Testing is a very time-consuming process, and automation can really speed up the testing process.
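
As an illustration of the scripting capability described above, a single NTCIP-style GET over SNMPv1 might look like the following in Python using the pysnmp library (one possible tool, not one named in this module). The camera address is a placeholder, and the OID is a placeholder under the NEMA enterprise prefix rather than an actual NTCIP 1205 object identifier.

```python
from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, getCmd)

# Placeholder OID below the NEMA/NTCIP enterprise prefix (1.3.6.1.4.1.1206);
# a real test script would use the object identifiers defined in NTCIP 1205.
error_indication, error_status, error_index, var_binds = next(
    getCmd(SnmpEngine(),
           CommunityData('public', mpModel=0),       # SNMPv1 with a community string
           UdpTransportTarget(('192.0.2.10', 161)),  # placeholder camera address
           ContextData(),
           ObjectType(ObjectIdentity('1.3.6.1.4.1.1206.4.2.7.1.1.0')))
)

if error_indication:
    print('No response from the device:', error_indication)
elif error_status:
    print('SNMP error:', error_status.prettyPrint())
else:
    for name, value in var_binds:
        print(name.prettyPrint(), '=', value.prettyPrint())
```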

Joey Yang: There are two types of test tools: passive and active test tools. Passive test tools are used as the data analyzer in a typical test setup. Passive test tools can capture live data on a real-time basis, but they can only be used to monitor the data exchange and cannot respond to an ITS device's stimulus. Some well-known passive test tools include Serialtest, Ethereal, and other protocol analyzers, including those for TCP/IP, UDP, etcetera.

Joey Yang: Active test tools are used as the main test software for NTCIP and provide a means to send messages to the device under test and await responses. Active test software can be used for most of the needs of NTCIP testing. However, there are some limitations to active test tools. For example, they do not support all the objects in NTCIP, such as proprietary logical blocks. They may not support sophisticated communication testing-- for example, communication load testing. The agency needs to be aware of these limitations, and special-purpose software needs to be developed to perform these additional tests. If required, these additional software tools will need to be ready prior to the testing.

Joey Yang: Here are some examples of active NTCIP test software that are commonly used for NTCIP testing, such as DeviceTester, NTCIP Exerciser, Ntester, and SimpleTester. You may go to their websites if you need to learn more about their products.

Joey Yang: Let's take a few minutes to discuss how to address the consequences of positive and negative testing. Positive testing uses valid input values to test the DUT's behavior in a positive way. The DUT should process the value and respond successfully. Most of the testing should focus on positive testing, with data sampling within a valid range of values. Negative testing uses invalid input values, dialogs, or data exchange sequences to test the device in a negative way. At a minimum, the device under test should not process the request and should remain in operation, continuing to respond to the test software requests without a communications lockup. Preferably, the device under test should provide appropriate error processing, such as responding with an error message. For example, if the test is to move the test camera to an invalid zone, it is expected that the camera refuses to move and responds with an error message indicating the command is invalid. Negative testing is typically required for critical functions, which should be determined by the agency that develops the test plan and the test documents prior to the testing.

Joey Yang: Boundary values are the input values just below, just above, or exactly on each limit. Boundary condition testing exercises the device in both positive and negative ways, and all boundary values need to be tested. If a boundary value is valid, the device under test should process it successfully and respond in the same way as in positive testing. If a boundary value is invalid, the device should not process the request, should remain in normal operation, should respond with a proper error message, and, most importantly, should not hang up the communications. It has been seen in many NTCIP tests that a unit locks up the communication channel and requires a hard reboot; in that case, the unit fails and should not pass that particular test. A small sketch of how boundary inputs can be enumerated follows.
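
A small helper like the following sketch can enumerate the boundary inputs for a range-limited object. The range shown (1 to 100) is an assumed example, and the expected outcome for each value follows the rule just described.

```python
def boundary_values(low, high):
    """Values just below, exactly on, and just above each limit of a range."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

# Assumed example: an object whose valid range is 1..100.
LOW, HIGH = 1, 100
for value in boundary_values(LOW, HIGH):
    if LOW <= value <= HIGH:
        expected = "process the request and respond normally"
    else:
        expected = "reject the request with an error and keep communicating"
    print(f"input {value}: device should {expected}")
```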

Joey Yang: We are close to the end of this webinar. Before we finish, let's briefly discuss the complexity of NTCIP testing. In general, testing is a very complex process, from planning to test execution, and the test process spans the entire system lifecycle. Agencies will need to make sure adequate time is allocated, not only for the testing itself but also for test planning. All NTCIP objects required by the project should be tested, and all boundary conditions should also be tested. If time permits, selectively test error conditions for critical functions. This is also very important, so make sure some time is allocated for negative testing.

Joey Yang: Oftentimes software code must be modified or corrected due to anomalies found during the test. The test agency should determine when the test needs to be suspended, resumed, or terminated. When new software is released, the new release can undergo progression testing, which tests the new and corrected features. In most cases, regression testing is also required on the features and functions that have been tested in the past, to ensure that nothing has changed other than the corrected features. The test agency will need to determine the extent of tests that must be repeated when changes are made to the software, assess the nature of each change to determine potential ripple effects and impacts on other aspects of the system, and then rerun test cases based on the changes and the potential impact of the software modifications. At a minimum, regression tests should be done for all the software affected by the test failure. It's worth noting that regression testing is extremely time-consuming, and it is a judgment call for the agency to decide how much regression testing is acceptable based on the test schedule and resources. A rough sketch of how regression scope might be selected appears below.
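
As a rough sketch of how regression scope might be selected, the following assumes a hypothetical mapping from software features to the test cases that exercise them; the feature names and test case IDs are made up for illustration only.

```python
# Sketch of regression test selection based on which features changed.
# The feature names and test case IDs below are hypothetical.
feature_to_tests = {
    "preset_positioning": ["TC-12", "TC-13", "TC-14"],
    "zoom_control":       ["TC-21", "TC-22"],
    "alarm_reporting":    ["TC-31"],
}

changed_features = ["preset_positioning"]   # features touched by the new release
progression_tests = ["TC-15"]               # new tests written for the corrected feature

# At a minimum, rerun every test case that exercises a changed feature.
regression_tests = sorted(
    {tc for feature in changed_features for tc in feature_to_tests.get(feature, [])}
)

print("Progression tests:", progression_tests)
print("Minimum regression tests:", regression_tests)
```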

Joey Yang: Now, let's review a question for this learning objective.

Joey Yang: Which of the following statements is correct? A: Data analyzer is an active test tool and can be used to respond to the DUT's request. B: All possible permutations and combinations of valid input values need to be tested. C: Performing boundary analysis is not necessary during NTCIP testing. D: None of the above.

Joey Yang: Let's go over the answer together. Answer A is not correct. Data analyzer is a passive test tool and can only be used to monitor the data exchanged between two components. So data analyzer is not an active test tool. Answer B is also incorrect. It is nearly impossible to test all possible permutations and combinations of valid input values. Instead, testing samples within the required range should produce acceptable test results.

Joey Yang: Answer C is incorrect as well. Performing boundary analysis with positive and negative ranges is necessary to verify the device under test's response to all required dialogs and objects. Therefore, answer D is the correct answer: none of the above are correct.

Joey Yang: As a summary of learning objective five, we reviewed the test tools and equipment. We discussed the consequences of positive and negative testing. We discussed the consequences of testing boundary conditions. And we also discussed the complexity of NTCIP testing with a focus on the regression testing.

Joey Yang: So we have finished all five learning objectives. Let's take a look at what we have learned today. The testing process determines whether the system conforms to the requirements and whether it satisfies its intended use and user needs. Requirements can be verified by inspection, demonstration, analysis, and testing of the system products. The testing process provides an objective assessment of system products throughout the system lifecycle. A test plan is a document that describes the scope, approach, resources, and schedule of intended test activities. The test plan may be a Master Test Plan or a Level Test Plan.

Joey Yang: A list of test documents delivered at the completion of the test is included in Level Test Plans. The details of Requirements Test Case Traceability Matrix are developed as part of test design. Test cases define test input and output values. Keywords used in test procedures are defined in NTCIP 8007. NTCIP test tools include passive and active test tools.

Joey Yang: Here are some additional resources you may want to look into for additional information.

Joey Yang: You can also get the training materials at the DOT website, as shown on this slide.

Joey Yang: So now we are at the end of the webinar, so let's take a few minutes to review some questions that have frequently been asked in the past. I have four questions here.

The first question: A CCTV camera is not a complex device, so why does the testing process in this training module appear to be so complex? This is certainly a good question. The approach included in this training module follows the NTCIP 8007 and IEEE 829 software testing standards, and it applies to all ITS devices, not just CCTV cameras. Because the CCTV camera is less complex than other ITS devices, such as ATC traffic controllers or DMS signs, the effort and the measures for developing the test plans can be simplified. It is at the agency's discretion to choose a simplified approach for CCTV testing based on their needs. Please note that not all material included in the training module applies to every case, and the person who develops CCTV test plans needs to understand the purpose and needs of the project, and in particular the project requirements, prior to developing the test plans.

The second question: Which public agencies currently require NTCIP 1205 conformance? Actually, not many public agencies currently require that their CCTV systems conform to NTCIP 1205. To my knowledge, New York State DOT, Virginia DOT, Florida DOT, and Texas DOT have tested their CCTV cameras for NTCIP conformance.

The third question: Where can I find some example test cases and test procedures? Some NTCIP standards, such as 1203 for dynamic message signs, include test cases and test procedures that can be used as examples for testing against other NTCIP standards. NTCIP 8007 provides a good framework, and so does this training module. The agencies mentioned earlier could be resources as well, and the companies that provide NTCIP test software could also be a starting point for finding example documentation. In general, though, developing test plans and test documents is a time-consuming process, and even with examples, the test cases and test procedures will still need to be modified and tailored to your project needs and requirements. So never underestimate the effort.

The last question I have here: Can you talk about who is best suited to write the test plan and why: the systems engineering consultant, an independent consultant, the contractor, or the public agency? This is really an excellent question. First of all, the public agency, in other words the owner of the system and devices, is ultimately responsible for developing the test plans. As far as who is best suited to write them, the answer is the people who have the relevant experience, skill set, and qualifications, and the test plan developer will also need a good understanding of the technical requirements of the system and the devices. It is certainly a good option for the agency to develop the test plans itself if it has the technical capacity. However, if this is not an option, then the agency will need to find someone to develop the test plans, and perhaps even perform the testing, for them. As far as who is better suited between the contractor and an independent consultant, having an independent consultant helps establish an independent process while the test plans are being developed, so as to eliminate bias from the device provider. The agency will definitely benefit from seeking out an independent consultant from that perspective. With that, this will conclude our webinar today.

### End of 2013_12_04_14.16_T317_Web_Recording.wmv ###