Artificial Intelligence, the Future Combat Air System and Shaping a Way Ahead
Paris – Cultural differences between France and Germany may complicate Airbus's effort to write ethical safeguards into the artificial intelligence of the planned Future Combat Air System, experts said at a virtual panel introduction held May 14, 2020.
Airbus in Germany has set up a panel to consider the ethical, legal and social aspects of building AI and automation into the planned FCAS, which combines a new-generation fighter, unmanned aircraft, and existing fighter jets in a complex network dubbed the air combat cloud.
That panel was formed last year with Fraunhofer, a research institute, drawing on staff from the German defense and foreign ministries, institutes, universities, and think tanks. Those on the panel are not paid for their work.
The virtual presentation introduced panel members, who spoke of some of the issues to be examined, which included the cultural difference between France and Germany on armaments and AI.
“In many ways, FCAS represents a great leap forward,” Dirk Hoke, chief executive of Airbus Defence and Space, said in a May 14 statement.
“But there are also ethical and legal challenges which we have to address.”
Reimund Neugebauer, president of Fraunhofer, said FCAS would be the first time Germany would see “technical implementation of basic ethical and legal principles — ethical and legal compliance by design.”
Ethics were important but companies needed to win exports to stay competitive, Hoke said.
Ethics and moral standards were part of Airbus’s decision making, he said on the FCAS Forum website set up by Airbus and Fraunhofer to address those issues.
Equipment was sold to European armed forces to ensure security, and programs such as FCAS needed economies of scale to pay for high level effort and expenditure, he said.
“In other words, it must be possible to export the system, otherwise it will be incredibly difficult for European business to keep up with global competition,” he said. “This means all those involved need to be aware of the requirements and objectives from the outset and to examine each relevant issue as thoroughly as possible.
“After all, there’s a lot at stake,” he said.
Those who logged onto the May 14 presentation of the panel heard that the aim was to hold discussions with the greatest transparency, with the FCAS Forum website to air the debate.
Arms programs sparked tension in the German parliament and among the general public, said Brigadier General Gerald Funke of the defense ministry, head of the planning section and FCAS project leader.
It was important for civil society to have an ethics debate, in the context of the crises in health, budgets, security threats, and climate change, said Ellen Ueberschär, who holds a doctorate in theology and specializes in ethics, digitalization and the transformation of society.
“The deepest humiliation is to be killed by a machine,” she said.
“We have to go deeper into the question of risk, rights and responsibilities.”
There was a need to ensure sufficient human control in weapons, said Rüdiger Bohn, a senior diplomat.
There was an “excellent opportunity” for a European contribution to industry standards on AI and weapons, he said. There was also debate in the European Union on liability for AI in civilian applications, such as autonomous driving and liability in car crashes.
There was a need to reconcile differences among the European partners, said Ulrike Franke, policy fellow at the European Council on Foreign Relations, a think tank. France, Germany and Spain were partners on FCAS, and other European nations may join the project.
There were “pronounced divergences” between France and Germany, with the former more open on AI and autonomy, while the latter was more cautious, she said. The panel could explore the issues, which ranged from finding compromise to setting red lines.
Transparency toward the public was a key factor, and public opinion needed to be part of the process, she said.
It was crucial to have a human as a “circuit breaker,” as there was real risk of escalation when working at machine speed, said Frank Sauer, a specialist in arms control and military use of AI. It was important to uphold the political will to involve humans, and incorporate that principle at the technical level in weapons, he said. It was important for Europe to get it right.
For Airbus, the duty was to “turn the technical vision into reality,” said Thomas Grohs, Airbus D&S chief architect for FCAS.
Funke said, “It is up to us” to design the red lines, decide the role of humans in the loop and on the loop, and determine how to ensure humans have meaningful control.
Grohs said rules of engagement and the concept of operations would serve as “tool sets,” reflecting legal and safety requirements. There were few ethical requirements for now, and the forum would establish the requirement list.
In AI and neural networks, there could be a flexible, modular approach offering different modules to be loaded, reflecting ethical rules of the various users, he said.
Grohs was replying to a question on the engineering point of view from Wolfgang Koch, computer science professor at the University of Bonn.
There was a need for humans in the loop, with humans as circuit breakers, Koch said. Engineering needed to be “ethically compliant,” with digital systems used in a responsible way, in political and social terms.
Asked how to regulate autonomous weapons and deal with nations that do not observe regulation, Bohn said it was complex, but a step-by-step approach could be taken, leading to regulated autonomy in weapon systems. It was important to take a prescriptive approach to secure human control over autonomous systems.
On FCAS and NATO, Funke said the weapon would be “interconnected,” with the planned system designed to work with NATO systems.
Grohs said that on the technical design, there was a strong desire to work with NATO.
The ethical debate is presently on a national level and will be extended to a transnational discussion, the forum panel heard. The next panel meeting will be held in October.
In France, FCAS comes under the purview of an ethics committee on AI formed by the armed forces ministry, air force general Jean-Paul Breton said on his LinkedIn social media account. Breton heads the French operational study on FCAS.
The ethics committee has been asked to “contribute reflection on ethical issues taking into account the emergence of new technologies in the defense field such as the use of artificial intelligence,” he said. That reflection will start as a national study.
The first meeting of the ethics committee took place Jan. 10, the armed forces ministry said in a Jan. 14 statement.
Bernard Pêcheur, a senior civil servant, chairs the committee, which comprises 18 members, including lawyers, professors, researchers, doctors, engineers, scientists and historians.
The French and German Airbus units needed to conduct the ethical debate on a transnational basis, a researcher said.
“This has to be done together,” said Jean-Pierre Maulny, deputy director of the Institut de Relations Internationales et Stratégiques, a think tank.
“If not, there will be incomprehension.”
That cross-border dialog on ethics and arms looked likely to be difficult, in view of the differences between French and German culture.
France was already late in AI and needed to catch up, he said. The need for dialog with Germany may complicate that French bid to speed up on the AI dossier.
Airbus is partnered with Dassault Aviation on FCAS.