Artificial Intelligence, the Future Combat Air System and Shaping a Way Ahead

05/27/2020
By Pierre Tran

Paris – Cultural differences between France and Germany may complicate Airbus's effort to write ethical safeguards into the artificial intelligence of the planned Future Combat Air System, a virtual introduction of a panel of experts heard on May 14, 2020.

Airbus in Germany has set up a panel to consider the ethical, legal and social aspects of building AI and automation into a planned FCAS, combining a new-generation fighter, unmanned aircraft, and existing fighter jets in a complex network dubbed the air combat cloud.

That panel was formed last year with Fraunhofer, a research institute, drawing on staff from the German defense and foreign ministries, institutes, universities, and think tanks. Those on the panel are not paid for their work.

The virtual presentation introduced panel members, who spoke of some of the issues to be examined, which included the cultural difference between France and Germany on armaments and AI.

“In many ways, FCAS represents a great leap forward,” Dirk Hoke, chief executive of Airbus Defence and Space, said in a May 14 statement.

“But there are also ethical and legal challenges which we have to address.”

Reimund Neugebauer, president of Fraunhofer, said FCAS would be the first time Germany would see “technical implementation of basic ethical and legal principles — ethical and legal compliance by design.”

Ethics were important but companies needed to win exports to stay competitive, Hoke said.

Ethics and moral standards were part of Airbus’s decision making, he said on the FCAS Forum website set up by Airbus and Fraunhofer to address those issues.

Equipment was sold to European armed forces to ensure security, and programs such as FCAS needed economies of scale to pay for the high level of effort and expenditure, he said.

“In other words, it must be possible to export the system, otherwise it will be incredibly difficult for European business to keep up with global competition,” he said. “This means all those involved need to be aware of the requirements and objectives from the outset and to examine each relevant issue as thoroughly as possible.

“After all, there’s a lot at stake,” he said.

Those who logged onto the May 14 presentation of the panel heard that the aim was to hold discussions with the greatest transparency, with the FCAS Forum website serving to air the debate.

Arms programs sparked tension in the German parliament and among the general public, said Brigadier General Gerald Funke of the defense ministry, head of planning section and FCAS project leader.

It was important for civil society to have an ethics debate, in the context of the crises in health, budgets, security threats, and climate change, said Ellen Ueberschär, who holds a doctorate in theology and specializes in ethics, digitalization and the transformation of society.

“The deepest humiliation is to be killed by a machine,” she said.

“We have to go deeper into the question of risk, rights and responsibilities.”

There was a need to ensure sufficient human control in weapons, said Rüdiger Bohn, a senior diplomat.

There was an “excellent opportunity” for a European contribution to industry standards on AI and weapons, he said. There was debate in the European Union on liability for AI in civilian applications such as driving, for instance liability in car crashes.

There was a need to reconcile differences among the European partners, said Ulrike Franke, policy fellow at the European Council on Foreign Relations, a think tank. France, Germany and Spain were partners on FCAS, and other European nations may join the project.

There were “pronounced divergences” between France and Germany, with the former more open on AI and autonomy, while the latter was more cautious, she said. The panel could explore the issues, which ranged from finding compromise to setting red lines.

Transparency toward the public was a key factor and public opinion needed to be part of the process, she said.

It was crucial to have a human as a “circuit breaker,” as there was real risk of escalation when working at machine speed, said Frank Sauer, a specialist in arms control and military use of AI. It was important to uphold the political will to involve humans, and incorporate that principle at the technical level in weapons, he said. It was important for Europe to get it right.

For Airbus, the duty was to “turn the technical vision into reality,” said Thomas Grohs, Airbus D&S chief architect for FCAS.

Funke said, “It is up to us” to draw the red lines, decide the role of the human in the loop and on the loop, and make sure humans have meaningful control.

Grohs said rules of engagement and concepts of operations would serve as “tool sets,” reflecting legal and safety requirements. For now there were few ethical requirements, and the forum would establish the requirements list.

In AI and neural networks, there could be a flexible, modular approach offering different modules to be loaded, reflecting the ethical rules of the various users, he said.

Grohs was replying to a question on the engineering point of view from Wolfgang Koch, computer science professor at the University of Bonn.

There was a need for humans in the loop, with humans as circuit breakers, Koch said. Engineering needed to be “ethically compliant,” with digital systems used in a responsible way, in political and social terms.

Asked how to regulate autonomous weapons and deal with nations that do not observe regulations, Bohn said it was complex, but a step-by-step approach could be taken, leading to a regulated autonomy in weapon systems. It was important to take a prescriptive approach to secure human control over autonomous systems.

On FCAS and NATO, Funke said the weapon would be “interconnected,” with the planned system designed to work with NATO systems.

Grohs said that on the technical design, there was a strong desire to work with NATO.

The ethical debate is presently on a national level and will be extended to a transnational discussion, the forum panel heard. The next panel meeting will be held in October.

In France, FCAS comes under the purview of an ethics committee on AI formed by the armed forces ministry, air force general Jean-Paul Breton said on his LinkedIn social media account. Breton heads the French operational study on FCAS.

The ethics committee has been asked to “contribute reflection on ethical issues taking into account the emergence of new technologies in the defense field such as the use of artificial intelligence,” he said. That reflection will start as a national study.

The first meeting of the ethics committee took place Jan. 10, the armed forces ministry said in a Jan. 14 statement.

Bernard Pêcheur, a senior civil servant, chairs the committee, which comprises 18 members, including lawyers, professors, researchers, doctors, engineers, scientists and historians.

The French and German Airbus units needed to conduct the ethical debate on a transnational basis, a researcher said.

“This has to be done together,” said Jean-Pierre Maulny, deputy director of the Institut de Relations Internationales et Stratégiques, a think tank.

“If not, there will be incomprehension.”

That cross-border dialog on ethics and arms looked likely to be difficult, in view of the differences between French and German culture.

France was already late on AI and needed to catch up, he said.

The need for dialog with Germany may complicate that French bid to speed up on the AI dossier.

Airbus is a partner with Dassault Aviation on FCAS.

The featured graphic came from the following source:

How AI Could Change The Art Of War

Editor’s Note: Brigadier General Gerald Funke, Dipl.-Ing., German Federal Ministry of Defence Head of Section Plg I and FCAS Representative provided his view on the challenge as follows:

FCAS – a term that is becoming increasingly prominent in the vocabulary of Europe’s security-policy community. For us, it stands for an airborne system comprising manned and unmanned components (the next-generation weapon system) as well as its integration in an overarching system that extends far beyond the airborne systems: the Future Combat Air System (FCAS), currently taking ever more concrete shape through French-German-Spanish cooperation.

However – and this is where one of the challenges in the present discussion begins – it is not yet described concretely enough, which leaves a lot of scope for diverging ideas as to its technical conceptualisation.

Particularly in view of ever more rapid cycles of technical innovation, it is no great leap to predict a development path that will reveal numerous opportunities and risks, with aspects that we are not even able to wholly comprehend at the present time.

By the time the system goes into service, starting from 2040, these aspects will acquire additional facets and weightings.

However – and the volatile dynamic of innovations will certainly not make this any easier – entirely justified questions are already being voiced that go beyond the purely technical aspects of the future system to also raise ethical, moral and legal considerations.

Particularly potential applications of artificial intelligence (AI) open up a number of technical possibilities that we are only now beginning to understand.

And of course, the German Federal Ministry of Defence as one of the customers for the FCAS is highly interested in receiving a system that will embody the latest state of the art when it enters into service starting in 2040.

Based on our current concepts, we are convinced that the use of AI opens up opportunities that can contribute to better protection of non-combatants as well as our own troops, while being able to prepare decisions for our own actions more responsively and promptly, on the basis of significantly broader data.

AI applications will be crucial when it comes to making use of greater quantities and a greater diversity of data from as many different sources and sensors as possible. Using, appropriately connecting and analysing a wide range of different data can make action recommendations available at all levels of military command, enabling more appropriate, more comprehensive and wiser decisions to be made. A human being, the military commander, will still retain responsibility for the decision.

Although it may seem superfluous in light of the foregoing, I still wish to expressly emphasise that we will not accept any technical concept that would give any system the possibility to authorise the death of another person solely on the basis of the logic of an algorithm. Human beings will remain the sole determinants, responsible for decisions and all their consequences!

As the representative of the German Federal Ministry of Defence for FCAS, I am extremely grateful to Airbus Defence and Space GmbH as one of the German prime contractors for the future system, and to the Fraunhofer Institute for Communication, Information Processing and Ergonomics (FKIE), for proactively initiating a broad discourse extending beyond the bounds of the security policy community at a very early stage, and for involving the Ministry of Defence in the discussion from the start.

The Ministry of Defence is contributing its position to the discussion on a basis of equality with all other participants. The fact that even the first event in September 2019 was successful in stimulating a broad public discussion, in which a wide range of civil-society groups were persuaded to participate – with a correspondingly broad diversity of opinions – makes me extremely optimistic about this process.

My actions will be oriented toward ensuring systematic further development of our capabilities within the scope defined by the responsible political actors. I am certain that the parliamentary oversight to which we are permanently subject, not least because every major financial step requires the prior approval of the respective legislative bodies, will also follow the development of the public discussion of the fundamental legal, ethical and value-related questions extremely closely.

In this sense, I hope that the discussion we have started can be expanded to include parties that represent the broadest possible range of relevant civil-society opinions, and is not hindered by reservations or aversions. We have certainly made a promising start!

It is important for us that all voices be heard, that discussions can be serious and respectful, and that a willingness to examine the other parties’ arguments in good faith will prevail. This can promote the sharing of bare facts, and thus encourage more objective debate, and create understanding as well as the possibility of participants – on all sides – reconsidering their own positions.

The Ministry of Defence is prepared to engage in this process without reservation, and I am eagerly looking forward to further discussions.

http://www.fcas-forum.eu/en/foreword

Also, see the following:

FCAS: Working Responsible Use of New Technologies