Jon D. Hagar's Published Papers

(List last updated 2009; contact Jon to obtain copies of papers.)

A Technique for Testing Highly Reliable Real-Time Software

Abstract: The engineering of software systems that must be highly reliable is very difficult, and support tools and techniques are clearly needed. We are developing a technique and an associated tool set that uses executable specifications based on Annotated Ada (Anna) for software testing in hard real-time environments. Our initial tool, the test range oracle tool (TROT), supports the creation of simple test oracles that check the correctness of equation execution; future tools will have expanded capabilities. TROT was designed using commercial off-the-shelf and reuse software to fit into an ongoing software development process, and it does not interfere with the software under test.
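
The fragment below is an illustrative sketch only, not code from the paper: TROT itself was built around Anna annotations on Ada code in a hard real-time setting, while this Python fragment (with hypothetical names and a hypothetical sample equation) only shows the shape of the underlying idea, a non-intrusive oracle that recomputes a specified equation and compares it against the output recorded from the software under test.

import math

# Stand-in for the deployed software's computation of v = v0 + a*t.
def software_under_test_velocity(v0, a, t):
    return v0 + a * t

# Oracle: recompute the equation taken from the specification and compare it
# with the observed output, within a tolerance, without modifying the software.
def oracle_velocity_ok(v0, a, t, observed, tol=1e-9):
    expected = v0 + a * t
    return math.isclose(observed, expected, rel_tol=tol, abs_tol=tol)

if __name__ == "__main__":
    # Inputs and outputs as they might be logged during a test run.
    for v0, a, t in [(0.0, 9.8, 1.0), (100.0, -3.2, 5.0)]:
        observed = software_under_test_velocity(v0, a, t)
        verdict = "PASS" if oracle_velocity_ok(v0, a, t, observed) else "FAIL"
        print(f"v0={v0} a={a} t={t} observed={observed}: {verdict}")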

Use of Formal Specifications as Test Oracles for System-Critical Software

Abstract: The process used to validate, verify, and test flight avionics control systems has produced software that is highly reliable. However, ever greater demands for reliability require new automated tools to improve existing processes. We used the Anna (Annotated Ada) formal specification language and supporting tool set to develop a Test Range Oracle Tool (TROT) to automate the testing of equation execution. Our approach fits within the existing testing process, automates previously manual analysis, and can increase the level of test coverage. The TROT approach also introduces the use of formal specification languages and supporting tools to an existing industry program. This approach supported production tests and is being expanded into other test support areas.
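
As a second illustration only, the fragment below suggests the flavor of an annotation-style formal specification used as a test oracle. Anna attaches formal assertions to Ada declarations; here a hypothetical Python decorator plays that role, checking a stated postcondition every time the annotated routine executes during test.

import functools
import math

# Attach a specification check (postcondition) to a function; the check runs
# on every call, and a violation raises an error during test execution.
def postcondition(check):
    def wrap(func):
        @functools.wraps(func)
        def inner(*args, **kwargs):
            result = func(*args, **kwargs)
            if not check(result, *args, **kwargs):
                raise AssertionError(f"{func.__name__} violated its specification")
            return result
        return inner
    return wrap

# Specification: the result equals sqrt(x^2 + y^2) within a small tolerance.
@postcondition(lambda r, x, y: math.isclose(r, math.hypot(x, y), rel_tol=1e-9))
def vector_magnitude(x, y):
    return (x * x + y * y) ** 0.5

if __name__ == "__main__":
    print(vector_magnitude(3.0, 4.0))  # prints 5.0; the postcondition passes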

How to Build a 20-Year Successful Independent Verification and Validation (IV&V) Program for the Next Millennium

Abstract: IV&V is common on many critical government software programs. IV&V can save missions and improve product quality when done right. This means there must be a balance of people with the right skills, management, processes aimed at verification of each life cycle step, validation with simulation and analysis tools, and a hardware-based test facility. Automation of testing and IV&V helps to reduce cost, but automation is only part of a complete program. While IV&V is not for every software program, high-risk projects with critical cost or safety factors can benefit from some level of IV&V.

Testing Critical Software: Practical Experiences

Abstract: This paper presents our experiences in testing critical software that supports flight systems developed by Lockheed Martin Astronautics in Denver, Colorado. This approach has not been proven in an academic sense, but it has been demonstrated over the years to produce software that successfully performs its missions. It is based on teams with the right balance of software and systems engineering skills, working to a defined process.

Lessons Learned from Incorporation of Commercial Computer Aided Software Engineering Tools in a Flight Critical Software Test Environment

Abstract: In flight-critical, software-intensive avionics systems, a major technical and managerial component is the testing and analysis of the developed software. In many of these systems, the software must be ultra-reliable and produced within schedule and budget. This paper examines how an existing, successful software verification and validation project incorporated commercial computer-aided software engineering (CASE) tools.

A Systems Approach to Software Test and Reliability

Abstract: An area of concern in aerospace is the reliability of real-time control software. Production systems are usually one-of-a-kind systems that must work the first time, or millions of dollars can be lost. To achieve “ultra-reliable” systems, current best practices in software testing at Martin Marietta Astronautics Group (MMAG) include Verification and Validation (V&V) of real-time flight control systems based on a systems-level view of the software. Key to achieving this is the use of an integrated, multi-discipline team of problem-domain experts to conduct the V&V testing.