Voting System Standards


This document is part of Agenda Document Number 01-62 on the agenda for consideration at the December 13, 2001, meeting of the Federal Election Commission.


Volume II, Appendix A

Table of Contents

A Appendix A: Qualification Test Plan

A.1 Introduction

A.1.1 References

A.1.2 Terms and Abbreviations

A.2 Prequalification Tests

A.2.1 Prequalification Test Activity

A.2.2 Prequalification Test Results

A.3 Materials Required for Testing

A.3.1 Software

A.3.2 Equipment

A.3.3 Test Materials

A.3.4 Deliverable Materials

A.3.5 Proprietary Data

A.4 Test Specifications

A.4.1 Requirements

A.4.2 Hardware Configuration and Design

A.4.3 Software System Functions

A.4.4 Test Case Design

A.4.4.1 Hardware Qualitative Examination Design

A.4.4.2 Hardware Environmental Test Case Design

A.4.4.3 Software Module Test Case Design and Data

A.4.4.4 Software Functional Test Case Design

A.4.4.5 System-level Test Case Design

A.5 Test Data

A.5.1 Data Recording

A.5.2 Test Data Criteria

A.5.3 Test Data Reduction

A.6 Test Procedure and Conditions

A.6.1 Facility Requirements

A.6.2 Test Set-up

A.6.3 Test Sequence

A.6.4 Test Operations Procedures

 

 


A                                                             Appendix A: Qualification Test Plan

 

This Appendix contains a recommended outline for the Qualification Test Plan, which is to be prepared by the test agency. The primary purpose of the test plan is to document the test agency's development of the complete or partial qualification test. A sample outline of a Qualification Test Plan is illustrated in Figure A-1 at the end of this Appendix.

It is intended that the test agency use this Appendix as a guide in preparing a detailed test plan, and that the scope and detail of the requirements for qualification be tailored to the type of hardware, and the design and complexity of the software being tested. Required hardware tests are defined in Section 4, whereas software and system-level tests must be developed based on the vendor prequalification tests and information available on the specific software's physical and functional configuration.

Prior to development of any test plan, the test agency must obtain the Technical Data Package (TDP) from the vendor submitting the voting system for qualification. The TDP contains information necessary to the development of a Qualification Test Plan, such as the vendor's Hardware Specifications, Software Specifications, System Operating Manual and System Maintenance Manual.

It is anticipated that vendors may submit some voting systems that are in use at the time the standards are issued for partial qualification tests. The standards also specify that voting systems incorporating the vendor's software and off-the-shelf hardware need only be submitted for software and system-level tests. Requalification of systems with modified software or hardware is likewise anticipated. The test agency shall alter the test plan outline as required by these situations.

The following sections describe the individual sections of the recommended Qualification Test Plan.

A.1                     Introduction

The test agency shall include the identification of, and a brief description of, the hardware and software to be tested, and any special considerations that affect the test design and procedure.

A.1.1            References

The test agency shall list all documents that contain material used in preparing the test plan. This list shall include specific reference to applicable portions of the standards, and to the vendor's TDP.

A.1.2            Terms and Abbreviations

The test agency shall list and define all terms and phrases relevant to the hardware, the software, or the test plan.

A.2                     Prequalification Tests

 

A.2.1            Prequalification Test Activity

The test agency shall evaluate vendor tests, or tests performed by other agencies, in determining the scope of testing required for system qualification. Prequalification test activities may be particularly useful in designing software functional test cases and tests of system security.

A.2.2            Prequalification Test Results

The ITA shall summarize prequalification test results that support the discussion of the preceding section.

A.3                     Materials Required for Testing

 

A.3.1            Software

The ITA shall list all software required for the performance of hardware, software, telecommunications, security and integrated system tests. If the test environment requires supporting software such as operating systems, compilers, assemblers, or database managers, then this software shall also be listed.

A.3.2            Equipment

The ITA shall list all equipment required for the performance of the hardware, software, telecommunications, security and integrated system tests. This list shall include system hardware, general purpose data processing and communications equipment, and test instrumentation, as required.

A.3.3            Test Materials

The ITA shall list all test materials required in the performance of the test including, as applicable, test ballot layout and generation materials, test ballot sheets, test ballot cards and control cards, standard and optional output data report formats, and any other materials used to simulate preparation for and conduct of elections.

A.3.4            Deliverable Materials

The ITA shall list all documents and materials to be delivered as a part of the system, such as:

·         Hardware specification;

·         Software specification;

·         Voter, operator, and hardware and software maintenance manuals;

·         Program listings, facsimile ballots, tapes; and

·         Sample output report formats.

A.3.5            Proprietary Data

The ITA shall list and describe all documentation and data that are the private property of the vendor, and hence are subject to restrictions with respect to ITA use, release, or disclosure.

A.4                     Test Specifications

 

A.4.1            Requirements

The ITA shall cite the pertinent hardware qualitative examinations and quantitative tests that follow from Volume I, Sections 3 and 9 of the standards. The ITA shall also describe the specific test requirements that follow from the design of the software and telecommunications capabilities under test.

The qualification test shall include ITA consideration of hardware, software, and telecommunications design; and ITA development and conduct of all tests to demonstrate satisfactory performance. Environmental, non-operating tests shall be performed in the categories of simulated environmental conditions specified by the vendor or user requesting the tests. Environmental operating tests shall be performed under varying temperatures. Other functional tests shall be conducted in an environment that simulates, as nearly as possible, the intended use environment.

Test hardware and software shall be identical to that designed to be used together in the voting system, except that software intended for use with general-purpose off-the-shelf hardware may be tested using any equivalent equipment capable of supporting its operation and functions.

A.4.2            Hardware Configuration and Design

The ITA shall document the hardware configuration and design in detail sufficient to identify the specific equipment being tested. This document shall provide a basis for the specific test design and include a brief description of the intended use of the hardware.

A.4.3            Software System Functions

The ITA shall describe the software functions in sufficient detail to provide a foundation for selecting the test case designs and conditions contained in Subsections A.4.4.3, A.4.4.4, and A.4.4.5, below. On the basis of this test case design, the ITA shall prepare a table delineating software functions and how each shall be tested.
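For illustration only, a function-to-test mapping of the kind called for above might be recorded as in the following sketch; the function names, test case identifiers, and verification methods shown are hypothetical and would in practice be drawn from the vendor's Software Specifications and the test cases of Subsections A.4.4.3 through A.4.4.5.

    # Hypothetical function-to-test traceability table (illustrative only).
    # Each entry maps a software function to the test cases that exercise it
    # and the method of verification.
    FUNCTION_TEST_MATRIX = [
        # (software function,           test case IDs,       method)
        ("Ballot style interpretation", ["FT-01", "FT-02"],  "functional test"),
        ("Precinct recognition",        ["FT-03"],           "functional test"),
        ("Audit record generation",     ["FT-07", "ST-02"],  "functional / system test"),
    ]

    def untested_functions(matrix):
        """Return the functions for which no test case has been assigned."""
        return [name for name, cases, _ in matrix if not cases]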

A.4.4            Test Case Design

 

A.4.4.1            Hardware Qualitative Examination Design

The ITA shall review the results, submitted by the vendor, of any previous examinations of the equipment to be tested. The results of these examinations shall be compared to the performance characteristics specified by Section 2 of the standards concerning the requirements for:

·         Overall system capabilities;

·         Pre-voting functions;

·         Voting functions; and

·         Post-voting functions.

In the event that a review of the results of previous examinations indicates problem areas, the test agency shall provide a description of further examinations required prior to conducting the environmental and system-level tests. If no previous examinations have been performed, or records of these tests are not available, the test agency shall specify the appropriate tests to be used in the examination.

A.4.4.2            Hardware Environmental Test Case Design

The ITA shall review the documentation, submitted by the vendor, of the results and design of any previous environmental tests of the equipment submitted for testing. The test design and results shall be compared to the qualification tests described in Volume I, Section 9 of the standards. The test agency shall cite any additional tests required, based on this review and those tests requested by the vendor or the state. The test agency shall also cite any environmental tests of Section 9 that are not to be conducted, and note the reasons why.

For complete qualification, environmental tests shall include the following tests, depending upon the design and intended use of the hardware.

a.       Non-operating tests, including the:

1)      Bench handling test;

2)      Vibration test;

3)      Low temperature test;

4)      High temperature test; and

5)      Humidity test.

b.       Operating tests involving a series of procedures that test system reliability and accuracy under various temperatures and voltages relevant to election use.

A.4.4.3            Software Module Test Case Design and Data

The test agency shall review the vendor's program analysis, documentation, and, if available, module test case design. The test agency shall evaluate the test cases for each module, with respect to flow control parameters and data on both entry and exit. All discrepancies between the Software Specifications and the test case design shall be corrected by the vendor prior to initiation of the qualification test.

If the vendor's module test case design does not provide conclusive coverage of all program paths, then the test agency shall perform an independent analysis to assess the frequency and consequence of error of the untested paths. The ITA shall design additional module test cases, as required, to provide coverage of all modules containing untested paths with potential for untrapped errors.
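As a purely illustrative sketch, the independent analysis described above can be viewed as comparing the set of program paths identified in the program analysis against the set of paths exercised by the vendor's module test cases; every identifier below is hypothetical.

    # Illustrative path-coverage check: identify module paths not exercised
    # by the vendor's test cases. All identifiers are hypothetical.
    identified_paths = {"M1.P1", "M1.P2", "M1.P3", "M2.P1", "M2.P2"}
    exercised_paths = {
        "TC-101": {"M1.P1", "M1.P2"},
        "TC-102": {"M2.P1"},
    }

    covered = set().union(*exercised_paths.values())
    untested = identified_paths - covered
    print("Paths requiring additional module test cases:", sorted(untested))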

The test agency shall also review the vendor's module test data in order to verify that the requirements of the Software Specifications have been demonstrated by the data.

In the event that the vendor's module test data are insufficient, the test agency shall provide a description of additional module tests, prerequisite to the initiation of functional tests.

A.4.4.4            Software Functional Test Case Design

The test agency shall review the vendor's test plans and data to verify that the individual performance requirements described in Volume II, Section 2, Technical Data Package, Subsection 2.5.3.5, Software Functional Specification are reflected in the software.

As a part of this process, the test agency shall review the vendor's functional test case designs. The test agency shall prepare a detailed matrix of system functions and the test cases that exercise them. The test agency shall also prepare a test procedure describing all test ballots, operator procedures, and the data content of output reports. Abnormal input data and operator actions shall be defined. Test cases shall also be designed to verify that the system is able to handle and recover from these abnormal conditions.
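A minimal, hypothetical sketch of how one such abnormal-condition test case might be recorded is shown below; the field names and values are illustrative assumptions, not a format prescribed by the standards.

    # Hypothetical specification of an abnormal-condition test case
    # (illustrative only; field names are not prescribed by the standards).
    abnormal_test_case = {
        "id": "FT-ABN-04",
        "condition": "operator enters an undefined precinct identifier",
        "expected_behavior": "input rejected, error message displayed",
        "recovery_check": "system returns to ready state with no data loss",
        "output_report_fields": ["error log entry", "audit message"],
    }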

The vendor's test case design may be evaluated by any standard or special method appropriate; however, emphasis shall be placed on those functions where the vendor data on module development reflects significant debugging problems, and on functional tests that resulted in disproportionately high error rates.

The test agency shall define ACCEPT/REJECT criteria for qualification using the Software Specifications and, if the software runs on special hardware, the associated Hardware Specifications to determine acceptable ranges of performance.
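As a hedged illustration, an ACCEPT/REJECT decision of this kind reduces to checking measured results against the acceptable ranges taken from the specifications; the parameter names and limits below are hypothetical examples, not values drawn from the standards.

    # Illustrative ACCEPT/REJECT check against hypothetical acceptable ranges.
    acceptable_ranges = {
        "ballot_read_error_rate": (0.0, 1e-6),   # hypothetical limits
        "reports_generated": (1, 1),             # exactly one summary report
    }

    def accept(results, ranges):
        """Return True only if every measured value falls inside its range."""
        return all(lo <= results[name] <= hi for name, (lo, hi) in ranges.items())

    print(accept({"ballot_read_error_rate": 5e-7, "reports_generated": 1},
                 acceptable_ranges))   # True -> ACCEPT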

The test agency shall describe the functional tests to be performed. Depending upon the design and intended use of the voting system, all or part of the functions listed below shall be tested.

a.       Ballot preparation subsystem;

b.       Test operations performed prior to, during, and after processing of ballots, including:

1)      Logic tests to verify interpretation of ballot styles, and recognition of precincts to be processed;

2)      Accuracy tests to verify ballot reading accuracy;

3)      Status tests to verify equipment status and memory contents;

4)      Report generation to produce test output data; and

5)      Report generation to produce audit data records.

c.       Procedures applicable to equipment used in the polling place for:

1)      Opening the polling place and enabling the acceptance of ballots;

2)      Maintaining a count of processed ballots;

3)      Monitoring equipment status;

4)      Verifying equipment response to operator input commands;

5)      Generating real-time audit messages;

6)      Closing the polling place and disabling the acceptance of ballots;

7)      Generating election data reports;

8)      Transfer of ballot counting equipment, or a detachable memory module, to a central counting location; and

9)      Electronic transmission of election data to a central counting location.

d.       Procedures applicable to equipment used in a central counting place:

1)      Initiating the processing of a ballot deck or PMD for one or more precincts;

2)      Monitoring equipment status;

3)      Verifying equipment response to operator input commands;

4)      Verifying interaction with peripheral equipment, or other data processing systems;

5)      Generating real-time audit messages;

6)      Generating precinct-level election data reports;

7)      Generating summary election data reports;

8)      Transfer of a detachable memory module to other processing equipment;

9)      Electronic transmission of data to other processing equipment; and

10)   Producing output data for interrogation by external display devices.

A.4.4.5            System-level Test Case Design

The test agency shall provide a description of system tests of both the software and hardware. For software, these tests shall be designed according to the software's stated design objectives, without consideration of its functional specifications. The test agency shall independently prepare the system test cases to assess the response of the hardware and software to a range of conditions, such as:

·         Volume tests: These tests investigate the system's response to processing more than the expected number of ballots/voters per precinct, to processing more than the expected number of precincts, or to any other similar conditions that tend to overload the system's capacity to process, store, and report data;

·         Stress tests: These tests investigate the system's response to transient overload conditions. Polling place devices shall be subjected to ballot processing at the high volume rates at which the equipment can be operated to evaluate software response to hardware-generated interrupts and wait states. Central counting systems shall be subjected to similar overloads, including, for systems that support more than one card reader, continuous processing through all readers simultaneously (a simplified overload driver sketch follows this list);

·         Usability tests: These tests are designed to exercise characteristics of the software such as response to input control or text syntax errors, error message content, audit message content, and other features contained in the software design objectives but not directly related to a functional specification;

·         Accessibility tests: These tests are designed to exercise system capabilities and features intended for use by voters with disabilities in accordance with Volume I, Section 2.2.5;

·         Security tests: These tests are designed to defeat the security provisions of the system, including modification or disruption of pre-voting, voting, and post-voting processing; unauthorized access to, deletion of, or modification of data, including audit trail data; and modification or elimination of security mechanisms;

·         Performance tests: These tests verify accuracy, processing rate, ballot format handling capability, and other performance attributes claimed by the vendor; and

·         Recovery tests: These tests verify the ability of the system to recover from hardware and data errors.
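The following sketch is a simplified, hypothetical driver for the volume and stress conditions described in the first two items above; the capacity figure, overload factor, and device interface are assumptions made for illustration and are not requirements of the standards.

    # Hypothetical volume/stress driver (illustrative only). Feeds more ballots
    # than the stated capacity and records how the device under test responds.
    EXPECTED_BALLOTS_PER_PRECINCT = 1_000   # assumed figure for illustration
    OVERLOAD_FACTOR = 3

    def run_volume_test(process_ballot, capacity=EXPECTED_BALLOTS_PER_PRECINCT):
        """process_ballot is a stand-in for the system interface under test."""
        failures = []
        for i in range(capacity * OVERLOAD_FACTOR):
            try:
                process_ballot(i)
            except Exception as exc:          # record, do not mask, any failure
                failures.append((i, repr(exc)))
        return failures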

A.5                     Test Data

 

A.5.1            Data Recording

The test agency shall identify all data recording requirements (e.g., what is to be measured, and how tests and results are to be recorded). The test agency shall also design or approve the design of forms or other recording media to be employed. The test agency shall supply any special instrumentation (e.g., a pulse measuring device) needed to satisfy the data requirements.

A.5.2            Test Data Criteria

The test agency shall describe the criteria against which test results will be evaluated, such as the following:

·         Tolerances: These criteria define the acceptable range for system performance. These tolerances shall be derived from the applicable hardware performance requirements contained in Volume I, Section 3, Hardware Standards.

·         Samples: These criteria define the minimum number of combinations or alternatives of input and output conditions that can be exercised to constitute an acceptable test of the parameters involved.

·         Events: These criteria define the maximum number of interrupts, halts or other system breaks that may occur due to nontest conditions. This count shall not include events from which recovery occurs automatically or where a relevant status message is displayed.

A.5.3            Test Data Reduction

The test agency shall describe the techniques to be used for processing test data. These techniques may include manual, semi-automatic, or fully automatic reduction procedures. However, semi-automatic and automatic procedures shall have been shown to be capable of handling the test data accurately and properly. They shall also produce an item-by-item comparison of the data and the embedded acceptance criteria as output.
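A minimal sketch of an automatic reduction step of this kind appears below; the record layout and criteria names are hypothetical, and any real procedure would first have to be shown to handle the test data accurately and properly.

    # Illustrative item-by-item comparison of recorded test data against
    # embedded acceptance criteria (record layout is hypothetical).
    def reduce_results(records, criteria):
        """Yield one comparison line per recorded item."""
        for item, measured in records:
            low, high = criteria[item]
            status = "PASS" if low <= measured <= high else "FAIL"
            yield f"{item}: measured={measured} expected=[{low}, {high}] {status}"

    for line in reduce_results([("read_errors", 0), ("halts", 2)],
                               {"read_errors": (0, 0), "halts": (0, 1)}):
        print(line)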

A.6                     Test Procedure and Conditions

The test agency shall describe the test conditions and procedures for performing the tests. If tests are not to be performed in random order, this section shall contain the rationale for the required sequence and the criteria that must be met before the sequence can be continued. This section shall also describe the procedure for setting up the equipment in which the software will be tested, for system initialization, and for performing the tests. Each of the following sections that contains a description of a test procedure shall also contain a statement of the criteria by which readiness and successful completion shall be indicated and measured.

A.6.1            Facility Requirements

The test agency shall describe the space, equipment, instrumentation, utilities, manpower, and other resources required to support the test program.

A.6.2            Test Set-up

The test agency shall describe the procedure for arranging and connecting the system hardware with the supporting hardware and telecommunications equipment, if applicable. It shall also describe the procedure required to initialize the system, and to verify that it is ready to be tested.

A.6.3            Test Sequence

The test agency shall state any restrictions on the grouping or sequence of tests in this section.

A.6.4            Test Operations Procedures

The test agency shall provide the step-by-step procedures for each test case to be conducted. Each step shall be assigned a test step number and this number, along with critical test data and test procedures information, shall be tabulated onto a test report form for test control and the recording of test results.
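For illustration, the fragment below sketches one way numbered test steps could be tabulated onto a report form; the step numbers and column names are hypothetical and are not prescribed by the standards.

    # Hypothetical test report form rows, one per numbered test step.
    import csv, sys

    steps = [
        {"step": "6.4-001", "procedure": "Initialize system", "expected": "ready prompt",
         "observed": "", "result": ""},
        {"step": "6.4-002", "procedure": "Load test ballots", "expected": "count = 100",
         "observed": "", "result": ""},
    ]

    writer = csv.DictWriter(sys.stdout, fieldnames=list(steps[0]))
    writer.writeheader()
    writer.writerows(steps)   # blank observed/result columns are completed during the test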

In this section, the test agency shall also identify all test operations personnel, and their respective duties. In the event that the operator procedure is not defined in the vendor's operations or user manual, the test agency shall also provide a description of the procedures to be followed by the test personnel.


 

Figure A-1                                               Test Plan Outline

1          Introduction

1.1        References

1.2        Terms and Abbreviations

 

2          Prequalification Tests

2.1        Prequalification Test Activity

2.2        Prequalification Test Results

 

3          Materials Required for Testing

3.1        Software

3.2        Equipment

3.3        Test Materials

3.4        Deliverable Materials

3.5        Proprietary Data

 

4          Test Specifications

4.1        Requirements

4.2        Hardware Configuration and Design

4.3        Software System Functions

4.4        Test Case Design

4.4.1     Hardware Qualitative Examination Design

4.4.2     Hardware Environmental Test Case Design

4.4.3     Software Module Test Case Design and Data

4.4.4     Software Functional Test Case Design

4.4.5     System-level Test Case Design

 

5          Test Data

5.1        Data Recording

5.2        Test Data Criteria

5.3        Test Data Reduction

 

6          Test Procedure and Conditions

6.1        Facility Requirements

6.2        Test Set-up

6.3        Test Sequence

6.4        Test Operations Procedures