Policy Lab 2
15th September 2022 – Munich, Germany
Following up on the first popAI Policy Lab (which was held in Greek), the second popAI Policy Lab was conducted online in German on Thursday 15th September 2022 by the University of Applied Science – Police Affairs (HfoeD) in Bavaria, Germany. Twelve participants from EU Member States, Switzerland and Jordan, with varied expertise and backgrounds, attended the event, which was organised as focus groups.
The German Policy Lab was structured around two use cases: the first concerned AI in support of mission control, while the second dealt with the processing of child sexual abuse material.
Participants worked in groups to facilitate discussion and later reconvened for a wrap-up session that brought the focus groups together and gathered some general comments. Each focus group included participants from LEAs, ethics specialists, experts with a technical background, and a moderator.
Discussion gravitated around three key points:
- The LEA perspective on the cases;
- The technical viewpoint, considering how AI could support LEA tasks in the given area;
- The individual rights requiring particular attention.
AI in support of mission control
Command and control centres face significant challenges in deploying units for operations: in addition to the location of the units, their available equipment, skills and experience are of central importance for the success of certain operations. An AI-based mechanism that suggests which units to deploy can support relevant and pertinent decisions for the best possible management of an operation.
- AI should be used to ensure that police officers receive timely information about the crime scene;
- The legal side needs to be taken into account to determine to what extent automated queries can be used; clear limits are essential;
- Traceability must always be ensured – no black-box approach. Automated processing must always allow for human intervention.
LEA representatives did not see AI as a game-changer in mission control, although it could be more useful in metropolitan areas.
AI as support for the processing of child sexual abuse material (CSAM)
Processing dozens, even hundreds, of suspected CSAM cases poses daily challenges to LEAs, both emotionally-psychologically and organisationally. AI-based solutions can help with case processing and documentation, as well as with identifying potentially substantive leads relevant to an investigation.
- AI can help identify perpetrators more easily, although it cannot replace the competencies of investigators;
- AI systems should not be a black box: process tracking and human intervention must remain possible;
- It is crucial to educate the public. Many negative perceptions stem from the way the media report on AI (e.g. as a job killer or as enabling a surveillance state).
In conclusion, it emerged that the second scenario is the more likely candidate for AI-based solutions, as AI enables the analysis of a greater amount of data. In both cases, participants agreed that black-box approaches should be avoided and recognised the need to ensure human intervention if and where necessary.
The Policy Lab was driven by the active involvement of its participants, who contributed to a fruitful discussion. The output of this second Policy Lab will later be used within the popAI project, both for the definition of the foresight scenarios and for drafting the policy recommendations.