FCAS Expert Commission on the Responsible Use of Technologies


Prof. Reimund Neugebauer, President of the Fraunhofer-Gesellschaft, and Dirk Hoke, CEO of Airbus Defence and Space, on the FCAS Expert Commission on the responsible use of technologies.

What led you to set up an expert commission on the responsible use of technologies for the FCAS project?

Dirk Hoke: The Future Combat Air System – or FCAS for short – was initiated by the heads of state of France and Germany. In addition to its feasibility from a purely technological perspective, as well as its relevance for defence policy, I believe the programme is also important in terms of social policy. After all, it will make use of technologies such as artificial intelligence that, despite being cutting edge, also face critical discussion. And it’s important to have these discussions! As the industrial partner with chief responsibility for the German part of the project, Airbus attaches great importance to these issues and will of course consider the social, ethical and legal questions involved. This is the responsibility of the ‘AG Technikverantwortung’ expert commission.

Prof. Reimund Neugebauer: We at Fraunhofer pursue a number of initiatives aimed at securing German and European technological sovereignty – from battery cell production and data security topics to hydrogen technology and quantum computing. As a pioneering and extensive technology programme, FCAS is the perfect addition to this list. The impact of the project will continue to be felt well into the 21st century. Artificial intelligence and automation at all levels will be the key technologies in this system of systems made up of manned and unmanned aerial vehicles. These topics are hugely significant to Fraunhofer in other areas of technology, too.

A programme such as FCAS puts the ethical and legal challenges of digitalisation under the microscope simultaneously. If we can demonstrate that artificial intelligence can be used responsibly in a defence project, that will also be a huge gain for commercial applications. Trustworthy artificial intelligence is a key topic for Fraunhofer, as are the associated questions of regulation (and regulatability), legal and ethical compliance, and the robustness and security of such systems. It is also fundamental to clarify who is accountable. Both topics, trustworthiness and human accountability, are preconditions for social acceptance and marketability, and therefore for actually realising innovation potential.

The partnership between Airbus and Fraunhofer sounds unusual. What brought you together? 

Neugebauer: More than any other research facility, Fraunhofer attaches great importance to working with industry and adopts a strong, application-orientated approach to research. We have long been an established partner in aeronautical research. Furthermore, the institutes of the Fraunhofer Group for Defense and Security (Fraunhofer-Verbund Verteidigungs- und Sicherheitsforschung, VVS) have been working with Airbus for several years in a number of fields. And for some time now, the Fraunhofer Institute for Communication, Information Processing and Ergonomics (Fraunhofer-Institut für Kommunikation, Informationsverarbeitung und Ergonomie, FKIE) has been involved with implementing the national FCAS master plan, laying down the technical foundations for developing FCAS. This work resulted in the Fraunhofer VVS becoming a part of the FCAS industrial consortium last year.

Hoke: This partnership is the product of dialogue between Airbus and Fraunhofer along with the shared belief that a project like FCAS – which has never before been attempted in this form or scale – cannot be viewed from just a technological standpoint. In other words, if something seems technologically feasible, we still need to ensure it’s appropriate from an ethical and moral perspective, as well as under international law. We want to be involved in this discussion, in the knowledge that there are conflicting viewpoints that we would like to resolve as best we can. I’m glad we can address these issues together with an accomplished partner like Fraunhofer, along with a number of experts who will bring a wide variety of backgrounds and perspectives to the table during this important debate.

How is the work for this project being split up? 

Neugebauer: Generally, the division of work is such that Fraunhofer and industry complement one another in the best possible way, making optimum use of their respective activities and areas of expertise. Fraunhofer VVS is carrying out basic application-orientated research for the FCAS project with the aim of technical implementation. Here, a deep understanding of the Bundeswehr and its work, built up over several decades, is just as important as cutting-edge, security-related research in the interests of the resilience of our country and Europe.

What is typical for Fraunhofer in this respect is the deep intellectual understanding we develop of the ethical questions raised by security-related digitalisation technology. Findings of this kind, from an engineering and information-science perspective, were presented at the Munich Security Conference 2020 in a publication by the German Institute for Defence and Strategic Studies. Such considerations are being discussed with an open mind together with Airbus and all relevant stakeholders, in particular representatives of civil society, and incorporated into the technical design for FCAS.

Hoke: Airbus is the industrial partner with system responsibility for FCAS in Germany. We therefore hold a leading position in terms of the technical feasibility and industrial implementation of the project. For us, FCAS is a future project that is pivotal to both Germany and Europe as a whole. This is also reflected in the people we have on the commission. Our Head of FCAS Bruno Fichefeux and Chief Architect FCAS Tom Grohs are permanent members of the commission’s expert panel. And I also try to join the meetings as often as possible.

What criteria are used to appoint the members of the commission and what is their role?

Hoke: First of all, we found it important to have people on board who could lay the groundwork for FCAS and ultimately make decisions. On the one hand, that includes the relevant players from the German Ministry of Defence. On the other, we were determined to get representatives from the German Federal Foreign Office on board, too. We also invited leading experts from foundations, universities and think tanks. Due to the European dimension of the FCAS project, our aim is to engage in dialogue with a view to the future and not to restrict discussions to Germany alone. This will no doubt open up new and exciting prospects.

Neugebauer: I totally agree, and in this respect I’d like to stress that the members of the commission are bound only by their conscience. The objective of this commission, which meets twice a year, is to discuss the issues at hand – both critically and from a number of different standpoints. Each of these meetings deals with specific questions. The results of these discussions are transparent, with the minutes made available to the public on a dedicated website created especially for this purpose.

What roles do you think ethics and moral standards play in one of the largest European defence projects of the future?

Neugebauer: I believe that both technology management and personal accountability have key roles to play in security-related technology. This applies in particular to systems that are enabled by artificial intelligence and extensive automation. It’s no accident that Fraunhofer is hugely involved in questions of ethics – including with regard to AI systems.

Hoke: Ethical and moral standards always play a part in our decision-making. On the one hand, we supply equipment to our European armed forces, making an important contribution to our countries’ security. On the other hand, programmes like FCAS are only sustainable for industry if they can generate certain economies of scale that justify and support the high levels of effort and expenditure involved. In other words, it must be possible to export the system, otherwise it will be incredibly difficult for European business to keep up with global competition. This means all those involved need to be aware of the requirements and objectives from the outset and to examine each relevant issue as thoroughly as possible. After all, there’s a lot at stake.

What objectives are you looking to reach with this commission? What do you hope to achieve? 

Neugebauer: For the very first time in the history of the Federal Republic of Germany, a major security technology project is being accompanied from the outset by serious intellectual engagement with the technical implementation of ethical and legal principles – also referred to as ‘ethical and legal compliance by design’. I hope we succeed in taking the findings of the multidisciplinary expert commission and putting them to operational use. Not only must our security, resilience and defence capabilities be credible from a technological perspective, but they must also be in line with the European values and convictions that we are defending.

Hoke: Well, a project of this nature has never been attempted before. So in that respect, we are assuming a pioneering role, and the response up to now has been positive. That gives me a lot of confidence. I am certain that a project like FCAS, whose development and implementation are planned to take place over the course of around 20 years, would be hard to carry out today without any social discourse. After all, it’s also a question of using taxpayers’ money responsibly. So for us it’s only logical to have this kind of forward-looking debate on security and defence policy within the broadest possible social framework – and, I hope, to gain as good an understanding of the topic as possible as a result.