ADCSS 2008

ESA Workshop on Avionics Data, Control and Software Systems (ADCSS)

29 - 31 October 2008

ESA/ESTEC Noordwijk, The Netherlands

Organised by the European Space Agency

INTRODUCTION (MOTIVATION)

Following on from the 2007 ESA Workshop on Avionics Data, Control and Software Systems (ADCSS07), this year's ADCSS will cover relevant avionics topics in the form of round tables. The workshop provides a forum for position papers and for significant interaction between the organisers and the participants. The status of ongoing initiatives shall be presented. The conclusions of the workshop will be used as a basis for future actions in ESA's technology R&D plans in this domain.

ORGANISATION

The Workshop includes the following round tables:

SPACE AVIONICS OPEN INTERFACE ARCHITECTURE (SAVOIR)

This round table will be held on Wednesday 29 October 2008.

Objectives Space industry has recognised for quite some time the need to raise the level of standardisation in avionics systems in order to increase efficiency, reduce development cost and schedule, and thereby increase competitiveness. The Space Avionics Open Interface Architecture initiative is the response to this need and encompasses many ongoing and planned efforts towards this vision: CORDET, DOMENG, SOIS, AOCS system studies, ASSERT, spin-in from IMA and AUTOSAR, etc.

The objective of the round table is to review the status of these initiatives, to report on the progress made since the Avionics Reference Architectures round table (ADCSS 07 workshop) and to plan the way ahead.

Topics Briefings and position papers, selected by appointment, will cover the following topics:

* Results of the CORDET (Component Oriented Development Techniques) and DOMENG (Domain Engineering) activities
* Status of the CCSDS SOIS standardisation initiative
* AOCS system architecture studies
* ASSERT (Automated System and Software Engineering for Real-Time Applications) exploitation plan
* Time and Space Partitioning (conclusions of the working group on this subject)
* Status of IMA related activities
* Outcome of the AUTOSAR study and way forward
* R&D proposals on building blocks
* Further steps required towards the vision of a Space Avionics Open Interface Architecture

Organisation The round table will include briefings and position papers by appointment and an open discussion with the audience.

Abstract Submission Please submit your abstract through the online submission form available here.

Round Table Chairs
Philippe Armbruster 	Philippe.Armbruster (at) esa.int
Alain Benoit 	Alain.Benoit (at) esa.int
Juan Miro 	Juan.Miro (at) esa.int

Round Table Organiser Andreas Jung 	Andreas.Jung (at) esa.int

MODEL BASED SOFTWARE ENGINEERING

This round table will be held on Thursday 30 October 2008.

Objectives The use of "models" is becoming state of the art in the software engineering field. The model-based development approach is also increasingly being applied to space on-board software, from system-software co-engineering, through development and verification, up to operations.

There are many different types of models. While software design models are close to the implementation, other models address requirements or operations. The suitability of models for different domains needs to be assessed. We need to identify these models and relate them to each other in order to evaluate the emergence of a new life cycle, parallel to, or even replacing, the software life cycle: the model life cycle. Further background information can be found below.

The goal of the round table is to review the state of the art in model-based software engineering and to facilitate and promote the adoption of this paradigm for space onboard software in a way compatible with the needs and constraints specific to this domain.

Topics The round table shall focus on the following issues/questions:

* Experience and lessons learnt from the use of modelling techniques in onboard software development
* Maturity of current methodologies and tools supporting model-based software engineering (SysML, [HRT-]UML, OCL, assertions, AADL, SDL, SCADE, Matlab, etc.), and how to apply them to the onboard software development process
* What is the new (model) life cycle and how does it impact current practices?
* What is the current coverage of this process by supporting methodologies and tools?
* What is the added value of deploying, fully or partly, the model-based software engineering process with respect to current practices for onboard software development?
* What is blocking a wider use of model-based approaches? (maturity, coverage, cost...)
* What R&D actions are required to facilitate the adoption of such a process?
* How can model consistency be ensured along the life cycle of avionics and software? Is there a life cycle for models?

Industry and Academia are invited to submit 1 page abstracts for position papers focusing on the topics listed above. The abstract shall clearly outline the content of the presentation (targeted at 15 min) and the specific question(s) addressed.

Organisation The round table will include presentations of position papers and discussion with the audience.

Deadlines Abstracts for position presentations shall be submitted by 29 August 2008. Upon acceptance, ppt presentations shall be requested by 15 October.

Abstract Submission Please submit your abstract through the online submission form available here.

Round Table Chairs
Jean-Loup Terraillon 	Jean-Loup.Terraillon (at) esa.int
Kjeld Hjortnaes 	Kjeld.Hjortnaes (at) esa.int

Round Table Organiser Yuri Yushtein 	Yuri.Yushtein (at) esa.int

Background information: Model Based Software Engineering

Models are an abstraction of reality captured in a specific representation format, e.g. a diagram or a language. With the emergence of new tools, these model representations can be constructed, translated and exploited in different ways:

* Analysis: various types of model checking, e.g. completeness and consistency analysis
* Simulation: execution of the model, so that its behaviour can be simulated
* Design: decomposition of the system into smaller components and establishment of interfaces (inherent to modelling)
* Coding: automatic generation of source code, e.g. C or Ada (auto-coding)
* Testing: automatic generation of tests (auto-testing)
* Proving: formal verification or proof

Models for Requirements Engineering When textual requirements are translated into a model, using e.g. a formal modelling language, various analyses can be performed that enhance requirement "quality" in terms of completeness and consistency (when results are fed back). This is the equivalent of static code analysis. Executing the model enables dynamic analysis. Used with operational scenarios, simulation can be performed, increasing the knowledge about the system and further enhancing requirement quality. To verify the correctness and consistency of the requirements, the requirements models can be subjected to formal analysis, such as model checking/proving. The model properties to be verified may reflect the software operation perspective of the requirements, as well as the system perspective (i.e. the embedded part of the avionics). Although the benefit of such an approach is obvious, it is not applied systematically in space projects today. There are several issues:
 * 1) What is the scalability (in terms of code size or complexity) of model checking?
 * 2) What are the limitations?
 * 3) Is this technique cost efficient?

The time window during which modelling is efficient (for verification) is very narrow. Too early, before requirement elicitation, you don't know what to model and may waste resources. Too late, and the design has already started, so the findings are most often not taken into account. The modelling activity should take place when the system engineers write their requirements, as an early prototype to mature them. Is it realistic to have a project organisation with such a setup? Which properties should be verified? Apart from the usual verification of absence of deadlock and livelock, it is possible to check general properties of the system, typically expressed as sequences of events. Is there enough experience in the definition of these system properties? Is this foreseen in the process? Why is the use of models for requirements verification not systematic?

Models for design Modelling inherently involves designing, i.e. decomposition and layering. However, it is unclear whether model design can handle all the elements and constraints needed throughout the full life cycle. The modelling language is typically chosen because it matches the user's problem domain well. As a consequence, it may abstract away implementation details that are not important to the user at a particular stage but become important later. An embedded system is typically limited with respect to execution time (CPU load) and memory usage. These constraints in turn influence the choice of algorithms, scheduling mechanisms or the use of legacy code, all of which affect the architectural design. In other words, a model used for requirements engineering might be inappropriate for automatic coding. A specific issue is the modelling of tasking. How are these properties being handled?

Models for coding Auto-coding from models appears to be mature, and certified code generators are available. However, validation on the target has to be handled separately. Is it more cost efficient to use a qualified code generator or to run a validation test suite on the target? Auto-coding also enables automatic traceability from code back to design and requirements. However, implementation issues must be handled specifically and early in the life cycle. The use of auto-coding in the system engineering phase shifts some aspects of the software development process into the system engineering process. What are these aspects? How does this affect the overall process? Software coding standards and guidelines for programming languages are well established and can be enforced through tools. For modelling, similar standards, guidelines and checking tools exist but have not yet matured. The semantics of programming languages such as C or Ada are defined by independent standards (e.g. ANSI/ISO). The semantics of modelling languages, however, are defined by the tool vendor and can be changed by them; modelling languages are tool-vendor specific. The output of the code generator can also differ considerably between tool versions. This may create a software maintenance problem.
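The principle of auto-coding can be illustrated with a deliberately tiny sketch: emitting C source from a state-machine model. The model format, the generated C style and the `mode_manager` example are all invented for illustration; qualified industrial generators are vastly more sophisticated.

```python
def generate_c(machine_name, transitions):
    """Emit a C step function from a state-machine model.

    transitions: {(state, event): next_state}
    """
    states = sorted({s for (s, _) in transitions} | set(transitions.values()))
    events = sorted({e for (_, e) in transitions})
    lines = [f"/* Auto-generated from model '{machine_name}' -- do not edit. */"]
    lines.append("typedef enum { " + ", ".join(states) + " } state_t;")
    lines.append("typedef enum { " + ", ".join(events) + " } event_t;")
    lines.append("state_t step(state_t s, event_t e) {")
    for (state, event), nxt in sorted(transitions.items()):
        lines.append(f"    if (s == {state} && e == {event}) return {nxt};")
    lines.append("    return s; /* no matching transition: stay in place */")
    lines.append("}")
    return "\n".join(lines)

# Hypothetical mode-management model.
model = {
    ("IDLE", "EV_START"): "RUNNING",
    ("RUNNING", "EV_STOP"): "IDLE",
    ("RUNNING", "EV_FAULT"): "SAFE_MODE",
}
print(generate_c("mode_manager", model))
```

Even this toy shows where the maintenance concern comes from: the emitted code depends entirely on the generator's conventions, so a new generator version can change the output without any change to the model.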

Models for verification/validation The use of model analysis, simulation and model checking/proving can already verify certain aspects of the system. For example, tests can be manually defined and verified when the model is executed. However, it is also possible to generate tests automatically. Auto-testing can be categorised into requirement-based testing, model-based testing and statistical testing. Recent developments have made it possible to automatically generate test cases from requirements captured as formal properties (property-based testing) or from specific behaviour models. A problem is that the varying underlying mathematical methods are proprietary and difficult to assess. Model-Based Testing (MBT) is the automatic generation of test procedures/vectors derived from models. Model-based testing can largely replace (manual) unit and integration testing, and it can achieve a high percentage of coverage. In statistical testing (e.g. Monte Carlo), test cases are selected randomly from the input domain (or a predefined range) according to some probability distribution. Statistical testing reveals non-anticipated faults, as opposed to deterministic testing, which mainly addresses faults anticipated by the engineer or tool. The benefits of statistical testing are specifically acknowledged for robustness testing. The technique suffers from some problems:
 * 1) Oracle problem: the test outputs must be verified against pass/fail criteria.
 * 2) Small target problem: in the vast test input space there is only a small interesting/important subset.
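Statistical testing, and the oracle problem it raises, can be sketched in a few lines: inputs are drawn at random from the input domain and every output is checked against a pass/fail criterion. The saturating 16-bit addition used as the unit under test is a hypothetical example, not from the source.

```python
import random

INT16_MAX, INT16_MIN = 32767, -32768

def saturating_add(a, b):
    """Unit under test: 16-bit addition that clamps instead of wrapping."""
    return max(INT16_MIN, min(INT16_MAX, a + b))

def oracle(a, b, result):
    """Pass/fail criterion: the result equals the clamped mathematical sum."""
    return result == max(INT16_MIN, min(INT16_MAX, a + b))

def statistical_test(n_cases, seed=0):
    """Monte Carlo campaign: uniform random inputs, checked by the oracle."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    failures = []
    for _ in range(n_cases):
        a = rng.randint(INT16_MIN, INT16_MAX)
        b = rng.randint(INT16_MIN, INT16_MAX)
        if not oracle(a, b, saturating_add(a, b)):
            failures.append((a, b))
    return failures

print(statistical_test(10000))  # → []  (no failures found)
```

The "small target" problem is visible here too: if the defect only triggered for, say, `a + b == INT16_MAX + 1` exactly, uniform sampling would be very unlikely to hit it, which is why the distribution choice matters.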

The technique of auto-testing deserves closer attention. How well suited is it? Could the different test techniques be combined efficiently? Is the claim justified that verification testing can be largely automated while validation testing can only be partly automated? Test generation is based on models, while the tests are performed on the implementation. What impact does this have on code generation? What are the issues when automatic code generation from models is not employed? Is the test generation technology mature and usable by industry (from SDL, from UML+OCL, other)? What should be the strategy for the complementary use of formal model verification and automatic test generation?
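One of the auto-testing categories above, model-based testing, can be illustrated by deriving event sequences from a state-machine model so that every transition is exercised at least once. The model, event names and transition-coverage criterion are hypothetical; industrial MBT tools support many more coverage criteria and model formalisms.

```python
def transition_cover(initial, transitions):
    """Return event sequences from `initial` that together cover every
    transition of the model once.

    transitions: {(state, event): next_state}
    Each returned sequence can be replayed on the implementation.
    """
    # Breadth-first search: shortest event path to each reachable state.
    paths = {initial: []}
    frontier = [initial]
    while frontier:
        next_frontier = []
        for s in frontier:
            for (src, ev), dst in transitions.items():
                if src == s and dst not in paths:
                    paths[dst] = paths[s] + [ev]
                    next_frontier.append(dst)
        frontier = next_frontier
    # One test per transition: shortest path to its source, then the event.
    return [paths[src] + [ev] for (src, ev) in sorted(transitions)]

# Hypothetical mode-management model.
model = {
    ("IDLE", "EV_START"): "RUNNING",
    ("RUNNING", "EV_STOP"): "IDLE",
    ("RUNNING", "EV_FAULT"): "SAFE_MODE",
}
for test in transition_cover("IDLE", model):
    print(test)
# → ['EV_START']
#   ['EV_START', 'EV_FAULT']
#   ['EV_START', 'EV_STOP']
```

The gap this sketch leaves open is exactly the one raised above: the sequences are generated from the model, but they must be executed, and their outcomes judged, on the implementation.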

Models for dependability/autonomy/operations Models are also used to verify the dependability of hardware and software. Hardware models are more oriented towards probabilities of failure (e.g. Stochastic Petri Nets), while software models can represent software failure modes and recovery logic. Model checking allows verifying whether a feared system event can originate from a software error. This more systematic failure modelling can be used on ground for troubleshooting and for the generation of FDIR. It can also be used to manage unpredicted events on board: model checking techniques deployed in on-board software can isolate a failure, implement autonomy concepts, and perform planning and scheduling. Finally, models can be used to improve spacecraft operations and the training of operators. Are these models related? Are these uses of models mature today? What should be done to improve them?

Formal methods for security As security now enters the scope of on-board software, the question of evaluating the security level arises, and formal methods are needed above EAL 4:
 * 1) Can the security aspects be adequately modelled with the readily available technologies?
 * 2) How can model checking serve this purpose?
 * 3) Can the same models used for functional verification and for dependability also be used for security?

THE ISVV PROCESS IMPROVEMENT

This Round Table will be held on Friday 31 October 2008.

Objectives The current ESA ISVV process was discussed in a dedicated session at the SDSS-2005 workshop followed by the release of the ESA ISVV guide in Nov-2005. Since then the ESA ISVV guide has been used in a number of projects and additionally, a number of R&D activities have been performed to further extend and validate the defined process.

Furthermore, there is a push to extend the Independent Verification process so that it focuses not only on the software product but also on the requirement baseline driving the software development. The requirement baseline consists of the User Requirements and is thereby the output of the system engineering process that drives the development of the software product. This wish to extend the verification process to earlier phases of the life cycle reflects the generally accepted fact that one of the most critical issues is the establishment of a consolidated, consistent and complete requirement baseline.

It is therefore time to take stock of the experiences gained so far for the purpose of improving the ISVV process and revising the ESA ISVV guide if necessary. It is also essential to discuss the extension of the ISVV Guide to include Model Driven Design (e.g. autocoding) and potentially the extension of the scope of Independent Verification toward earlier phases of the life-cycle.

The objective of this round table is to present the lessons learned so far and to discuss the improvements to be made as well as open areas and set priorities for the future. It is also the intention to discuss elements of an ESA ISVV policy on the applicability of the ISVV activity to future projects.

Topics Emphasis of presentations shall be on the ISVV process improvement and potential extension, thereby addressing the primary objective:

"How to best achieve a cost effective ISVV process that provides added value to the overall quality of the end product?"

This can be further broken down into the following sub-items:
 * 1) Improvement/revision of the ISVV process as defined in the ESA ISVV Guide.
 * 2) Discussion on the usefulness of extending the Independent Verification process to earlier phases of the lifecycle.
 * 3) The use of Model Driven Engineering methods as a means of performing independent requirement baseline verification.

Industry and Academia are invited to submit 1 page abstracts for position papers focusing on the topics listed above. The abstract shall clearly outline the content of the presentation (targeted at 15 min) and the specific question(s) addressed.

Organisation The round table will include presentations of position papers and discussion with the audience.

Deadlines Proposals for position presentations in the form of an Abstract shall be submitted by 29 August 2008. Upon acceptance, ppt presentations shall be requested by 15 October.

Abstract Submission Please submit your abstract through the online submission form available here.

Round Table Chair Kjeld Hjortnaes 	Kjeld.Hjortnaes (at) esa.int

Round Table Organiser Sabine Krüger 	Sabine.Krueger (at) esa.int

PROGRAMME COMMITTEE

Philippe Armbruster 	Data Systems Division
Alain Benoit 	Control Systems Division
Kjeld Hjortnaes 	Software Systems Division
Roger Jansson 	Control Systems Division
Juan Miró 	Software Systems Division
Patrick Plancke 	Data Systems Division
Jean-Loup Terraillon 	Software Systems Division