Aerospace & Defence
The verification and validation of software are therefore not merely technical processes: they respond to an operating philosophy centred on two key needs:
a) to check that the software the customer expected and described at the outset has actually been produced;
b) to validate the construction from a technical point of view and against the conditions laid down.
In the Defence and Aerospace area, Exprivia has important competencies in the Verification and Validation process (software testing for short), where it works in collaboration with university IT departments.
The main objectives of the software testing procedure are to:
- guarantee the independence of the test team from the development team;
- find and manage any errors in the software;
- provide data to assess the progress of its development, both autonomously and together with the customer, during the life cycle or part of it;
- provide a product and/or software system certification that includes complete traceability of the errors found during the tests.
The validation of the software consists of two steps: assessment of the artefacts and change control.
- Assessment of the artefacts: continuous and iterative assessment of the artefacts (documentary and software) resulting from the implementation of the system and/or software requirements with respect to the User Requirements (if present) through internal analysis and inspection techniques and/or through reviews agreed upon with the customer or end users;
- Change control: whenever the user requirements, system requirements or software requirements change, at any stage of the life cycle of the software, an investigation is launched into the impact of these changes on the entire system.
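The change-control step can be sketched as a traversal of the traceability links between requirements: everything derived from a changed requirement must be re-assessed. The following is an illustrative sketch only; the requirement IDs and the derivation links are hypothetical.

```python
# Illustrative change-impact sketch: given the derivation links
# between requirements (user -> system -> software), find every item
# impacted by a change. All IDs and links here are made up.
from collections import deque

# edges: requirement -> items derived from it
derived_from = {
    "UR-1": ["SR-10", "SR-11"],        # user requirement
    "SR-10": ["SWR-100", "SWR-101"],   # system requirements
    "SR-11": ["SWR-102"],
    "SWR-100": [], "SWR-101": [], "SWR-102": [],
}

def impacted_items(changed, graph):
    """Breadth-first walk: everything derived (directly or
    transitively) from the changed requirement must be re-assessed."""
    seen, queue = set(), deque([changed])
    while queue:
        item = queue.popleft()
        for child in graph.get(item, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return sorted(seen)

print(impacted_items("SR-10", derived_from))  # ['SWR-100', 'SWR-101']
```

A real traceability database would of course hold documents, test cases and design items as well as requirements, but the impact investigation follows the same pattern.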
Despite the limited spread of a software-testing culture among companies, the verification and validation processes offer major advantages at affordable costs:
- find defects in the software as early as the initial phases of development, improving the stability of the software and substantially reducing troubleshooting costs (early defect detection);
- increase control of the entire life cycle of the software, through a process that enables software errors to be managed in an organized way and guarantees the traceability of the correlation between errors, the requirements and the tests in which they were found;
- set up a positive feedback cycle between the testing team, designers and developers, so that the errors found are corrected quickly;
- produce documents and certificates that demonstrate the correctness and validity of the software with respect to criteria established together with the customer, or with respect to criteria and levels required by international organizations and/or standards.
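The error-requirement-test correlation mentioned in the list above can be pictured as a set of traceability links. The sketch below is illustrative only; the record layout and all identifiers are hypothetical.

```python
# Illustrative traceability record linking each error to the
# requirement it affects and the test in which it was found.
from dataclasses import dataclass

@dataclass(frozen=True)
class TraceLink:
    error_id: str        # e.g. a Problem Report identifier
    requirement_id: str  # requirement covered by the failing test
    test_id: str         # test in which the error surfaced

links = [
    TraceLink("PR-7", "SWR-100", "TC-001"),
    TraceLink("PR-8", "SWR-100", "TC-002"),
    TraceLink("PR-9", "SWR-102", "TC-010"),
]

def errors_for_requirement(req_id, links):
    """All errors traced back to a given requirement."""
    return [l.error_id for l in links if l.requirement_id == req_id]

print(errors_for_requirement("SWR-100", links))  # ['PR-7', 'PR-8']
```

Keeping these links explicit is what makes it possible to report, for any requirement, which tests covered it and which errors were found against it.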
Definition of the documentary standard
The documentary standard to be used throughout the life cycle is chosen: a set of documentary templates on which the description of the entire system and the related activities (plans) are based. Options include:
- MIL-STD-498, complete documentary standard of military origin, which covers all the aspects associated with the life cycle of the software;
- standard already present at the customer's site, which can be adopted as is or taken as a basis and modified to suit the new needs;
- ad hoc standard, to be defined from scratch together with the customer.
Definition of the workflow
This activity defines the procedure for accepting the documentary, software and system items and archiving them, together with the versioning system to be adopted and the signature levels required for the items. This phase also defines the workflow for the formal management of Problem Reports.
Identification of the quality assurance toolchain
Choice of the software tools that support the quality assurance activities, such as the acceptance of the items, management of problem reports, software change proposals, etc. (e.g. SAP, Polarion, Redmine).
Acquisition of standards
This activity collects information about the legislation involved (if present), the standards adopted in the customer's sector/domain, the international technical standards involved (ISO, IEEE, …) in the drawing up of specifications.
Collection of the specifications
This activity defines the formalism adopted for the specifications, the levels of detail of the specifications and the software system levels involved, in agreement with the customer:
- the formalism concerns the language or technique selected to write the specifications: use can be made of informal specifications (paragraphs written in natural language), or semiformal languages such as UML2 (Unified Modeling Language) or derived products, or formal specification languages (formal mathematized notations, such as Z-notation);
- the levels of detail available comprise user needs, features and software requirements;
- the system levels managed: system, subsystem and software module.
Formalisms, levels of detail and system levels can be combined on the basis of the needs and objectives.
During collection of the specifications, direct support is given to the customer in the practical definition of the requirements, in order to improve their clarity and completeness.
Defining the environment
On the basis of the domain of the target system, the requirements of the test environment are defined, in terms of both the hardware involved and the software necessary for the various testing activities; the environment is then described through its own specifications.
Toolchain and testing technologies
The technologies required to set up the test environment and the tools necessary to support the testing activities are identified. An investigation can be conducted into the technologies and tools used at the customer site with a view to reusing them.
If the customer requests it, or the need arises for some other reason, an original testing platform that integrates with the technologies and tools identified can be set up. It may be commissioned by the customer (and then sold to them); products already present at our company can also be adapted where applicable.
Planning the testing activities
Phase in which the test sessions are scheduled, estimating the effort in terms of resources and time. The description of these activities converges into plans drawn up using the documentary standards identified.
Defining the test cases
In this phase, test designers design and implement the test cases that cover the requirements. The definition includes a continuous (iterative) refinement of the tests, also through local execution.
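A test case designed in this phase is explicitly traced to the requirement it covers, so that a failure can be reported against that requirement. The sketch below assumes a hypothetical requirement SWR-100 and a toy function under test; neither comes from the original text.

```python
# Sketch of a test case traced to the (hypothetical) requirement it
# covers. The function under test is a stand-in for real software.
import unittest

def clamp(value, lo, hi):
    """Toy function standing in for the software under test."""
    return max(lo, min(value, hi))

class TestClamp(unittest.TestCase):
    """Covers hypothetical requirement SWR-100: the output shall
    stay within the configured limits."""

    def test_swr_100_upper_limit(self):
        self.assertEqual(clamp(120, 0, 100), 100)

    def test_swr_100_lower_limit(self):
        self.assertEqual(clamp(-5, 0, 100), 0)

# local (iterative) execution of the test case, as in the refinement
# loop described above
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestClamp)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Embedding the requirement identifier in the test's name and docstring is one simple way to keep the requirement-test correlation traceable by tooling.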
Running the test sessions
This activity consists in running the planned test sessions in the test environment defined. Sessions can be semi-formal, such as dry runs, or formal, such as module qualifications, where required. They can be repeated, in full or as a selected subset, if the schedule allows it or if a total or partial failure occurred.
Managing problem reports
Problems found in the software under test, whether during the definition of the test cases or the running of the test sessions, give rise to a uniquely identified PR (Problem Report). The PRs are managed using the tools and workflow defined in the preliminary phases.
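A formal PR workflow of this kind can be pictured as a small state machine: each report gets a unique identifier and may only move between agreed states. The state names below are illustrative, not those of any specific tool.

```python
# Minimal sketch of a formal Problem Report workflow: unique IDs and
# an agreed set of allowed state transitions. States are made up.
import itertools

ALLOWED = {
    "OPEN": {"ANALYSED"},
    "ANALYSED": {"FIXED", "REJECTED"},
    "FIXED": {"VERIFIED"},
    "VERIFIED": {"CLOSED"},
    "REJECTED": {"CLOSED"},
    "CLOSED": set(),
}

_ids = itertools.count(1)

class ProblemReport:
    def __init__(self, summary):
        self.pr_id = f"PR-{next(_ids):04d}"  # unique identifier
        self.summary = summary
        self.state = "OPEN"

    def move_to(self, new_state):
        """Refuse any transition not in the agreed workflow."""
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"{self.state} -> {new_state} not allowed")
        self.state = new_state

pr = ProblemReport("GUI freezes on login")
pr.move_to("ANALYSED")
pr.move_to("FIXED")
pr.move_to("VERIFIED")
print(pr.pr_id, pr.state)  # PR-0001 VERIFIED
```

Tools such as Polarion or Redmine implement the same idea with configurable issue states; the value of defining the workflow in the preliminary phases is that every PR then follows one auditable path.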
The technologies used on software verification platforms for automatic testing comprise:
- technologies for managing the workflow of the application, documentary and test items;
- languages for developing the test platform and the test procedures;
- frameworks supporting the test platform (integration and messaging);
- tools for recording and playing back actions on Java GUIs (Swing, JavaFX), QT and the Web;
- sets of libraries supporting Model2Text transformation activities (from UML2 model to code);
- VMware virtualization technologies, in order to take full advantage of the available hardware, enable rapid deployment, fault tolerance, etc.
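The Model2Text idea mentioned in the list above, generating code text from a model, can be sketched very simply. Real toolchains work on UML2 models with full template engines; here a plain dictionary stands in for the model, and the class name and attributes are invented for illustration.

```python
# Toy Model2Text sketch: emit code text from a model description.
# The dict below stands in for a UML2 class; names are hypothetical.
model = {
    "class": "Telemetry",
    "attributes": [("timestamp", "float"), ("channel", "str")],
}

def model_to_text(m):
    """Render the model as Python source text."""
    lines = [f"class {m['class']}:"]
    params = ", ".join(f"{n}: {t}" for n, t in m["attributes"])
    lines.append(f"    def __init__(self, {params}):")
    for name, _ in m["attributes"]:
        lines.append(f"        self.{name} = {name}")
    return "\n".join(lines)

print(model_to_text(model))
```

The generated text is itself valid source code, which is the defining property of a model-to-text transformation: the model is the single point of change, and the code is regenerated from it.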