Subproject "Software Platform for Shared AI"

A software platform for developing and testing shared AI is envisioned that goes beyond standard federated learning by also taking transfer learning into account:

  • Development Platform: to support the configuration, instantiation, and orchestration of pipelines for training transfer-learning and confidentiality-preserving algorithms;
  • Test Environment: for modeling threat and attack scenarios to evaluate privacy and confidentiality protection and to identify vulnerabilities.

Success Story: Tool Ecosystem for Secure and Robust Collaborative AI

Collaborative AI between distributed parties relies on the transmission and sharing of data, parameters, and model fragments to improve the performance of an aggregated model over any single model. However, such a setting introduces vulnerabilities that can be exploited by adversarial users. To cope with the resulting security threats, we have been developing a tool ecosystem consisting of:

  1. A design methodology for hyper-parameter tuning in domain-adaptation setups to minimize performance loss
  2. A methodology for semi-supervised ensemble learning that enhances malware detection accuracy and privacy at the same time
  3. An analysis tool that characterizes poisoning attack effects by means of backdoor learning curves
  4. An information-theoretic analysis and design tool to tune the tradeoff between privacy leakage and data utility

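To make the setting concrete: the parameter sharing described above, together with the privacy–utility tradeoff named in point 4, can be sketched as federated averaging with additive Gaussian noise on the aggregate. This is a minimal illustration under assumed names and interfaces, not the project's actual tooling:

```python
import random

def aggregate_with_noise(client_params, sigma):
    """Average parameter vectors from several clients (as in federated
    averaging) and add Gaussian noise to the aggregate. A larger sigma
    reduces leakage about any individual client's contribution at the
    cost of utility. Function name and signature are illustrative."""
    n_clients = len(client_params)
    dim = len(client_params[0])
    mean = [sum(p[i] for p in client_params) / n_clients for i in range(dim)]
    return [m + random.gauss(0.0, sigma) for m in mean]

# Three clients each share a 4-dimensional parameter vector.
clients = [[0.9, 0.1, 0.4, 0.2],
           [1.1, 0.3, 0.6, 0.0],
           [1.0, 0.2, 0.5, 0.1]]

exact = aggregate_with_noise(clients, sigma=0.0)   # plain average
noisy = aggregate_with_noise(clients, sigma=0.1)   # privacy-preserving variant
print(exact)  # approximately [1.0, 0.2, 0.5, 0.1]
```

Tuning `sigma` is exactly the kind of knob the information-theoretic design tool would govern: too small and the aggregate reveals individual updates, too large and the aggregated model's utility degrades.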
These approaches have led to follow-up research projects in Industry 4.0 as well as the German KI-SIGS project, coordinated by the University of Lübeck. KI-SIGS is dedicated to developing an "AI Space for Intelligent Health Systems" in collaboration with northern German AI institutes in Bremen, Hamburg, and Schleswig-Holstein, together with medical technology companies and university hospital partners.