Summary: Approved for public release; distribution unlimited.

Many U.S. Navy systems were built on the fly and have encountered interoperability problems at sea, such as erroneous dual/multiple track designations, misidentification and track-identity conflicts, report responsibility conflicts, friendly tracks displayed as unknown/pending, tracks dropped without operator action, and different track identities at different ships. To identify and fix these interoperability problems, the Navy instituted the Distributed Engineering Plant (DEP) testing program, run by the Naval Sea Systems Command (NAVSEA), and the End-to-End (E2E) testing initiative, established by the Space and Naval Warfare Systems Command (SPAWAR). Whereas the DEP involves many land-based laboratories across the U.S. connected via an Asynchronous Transfer Mode (ATM) network, E2E testing is carried out entirely at a single laboratory, the E2E lab.

The DEP testing program faces the problem of determining a cost-effective way of paying for testing: providing the participant DEP laboratories full-time funding or paying them on a per-test basis. A challenge faced by the E2E testing program is getting the E2E lab ready for testing. Two factors contributing to this challenge are the uncertain availability of funding for building the E2E lab and the lack of a comprehensive plan to establish it. Such a plan calls for a rigorous justification of the E2E lab's needs and, hence, of its funding requirements.

This thesis performs an in-depth examination and qualitative analysis of the two testing programs and a quantitative comparative analysis of the DEP testing program's payment options, and, using goal programming, provides data in support of creating an E2E lab plan. The significance of this thesis is its use of analysis and mathematical programming to provide analytical data supporting informed decision making in the testing and evaluation of systems and systems of systems.
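Goal programming of the kind mentioned above is typically solved as a linear program over "deviation" variables that measure how far each goal is missed. The following is a minimal sketch of that formulation only; the funding figures, goal targets, and penalty weights are hypothetical placeholders and are not taken from the thesis model:

```python
# Minimal weighted goal-programming sketch (all numbers hypothetical).
# Two decision variables: x1 = full-time-funded lab slots, x2 = per-test events.
# Goal 1 (budget):   5*x1 + 2*x2 should be about 40 ($M target)
# Goal 2 (coverage): 3*x1 + 1*x2 should be about 30 (test-hours target)
# Each goal becomes an equality with under/over deviation variables:
#   5*x1 + 2*x2 + d1m - d1p = 40
#   3*x1 + 1*x2 + d2m - d2p = 30
# We penalize only the undesirable deviations: budget overrun (d1p)
# and coverage shortfall (d2m).
from scipy.optimize import linprog

# Variable order: [x1, x2, d1m, d1p, d2m, d2p]
c = [0, 0, 0, 2, 3, 0]  # weights: overrun penalty 2, shortfall penalty 3
A_eq = [
    [5, 2, 1, -1, 0, 0],   # budget goal with deviations
    [3, 1, 0, 0, 1, -1],   # coverage goal with deviations
]
b_eq = [40, 30]
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6)
print(res.x[:2], res.fun)  # chosen funding mix and total weighted deviation
```

With these placeholder numbers the cheapest compromise spends the full budget on the option with the better coverage-per-dollar ratio and accepts a coverage shortfall, since the shortfall penalty saved per extra dollar of overrun is smaller than the overrun penalty incurred.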