Foresight scenarios

Foresight scenarios are a key methodology in policy-making and are therefore of great relevance and importance to popAI’s overall objective, namely to foster trust in the application of AI in the civil security domain. The foresight scenario methodology serves as a platform for diverse stakeholders to come together and discuss, in a structured way, different and even opposing points of view, thereby assessing needs, preferences, and potential risks. Furthermore, the foresight scenarios that emerged from these activities actively inform and support the design and development of a dedicated Roadmap of AI in Law Enforcement 2040.

In this context, an iterative, collaborative approach to co-producing the scenarios was followed, involving stakeholders on numerous occasions. A major outcome of the adopted foresight scenario methodology is that it fosters communication and connection between individuals, groups, and organisations with different perspectives and values. The popAI foresight scenario method is informed by the overall project’s positive-sum approach, which promotes consensus among European LEAs and involved stakeholders on AI in policing, working towards a “European Common Approach”. Broad acceptance and a future-focused roadmap aim to deliver solid, high-impact recommendations to policy makers as they finalise an AI Act that ensures AI in policing is human-centred, socially driven, ethical, and secure by design.

The foresight activities delivered five scenarios, each addressing challenges in a specific domain of law enforcement: crime prevention/predictive policing; crime investigation; cyber operations; migration, asylum, and border control; and administration of justice. Each scenario is presented in full below.

Scenario 1

Past will always define future

Crime prevention/predictive policing

AI algorithms for civil security purposes use police data, combined with other datasets such as demographic data, aggregated mobile phone data, and socio-economic data, as well as data from hotspot analysis, to predict when and where criminal activity is most likely to occur. Interoperability of diverse data sources is authorised in support of crime prevention and community safety. Several local ‘blacklists’ have been created among European Member States that can be linked, compared, and updated at a European level. Based on advanced algorithmic processing, AI-powered surveillance systems are installed in areas flagged as high-risk, while drones frequently patrol overhead.
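To make the flagging mechanism concrete, here is a minimal sketch of how such an aggregated area-risk score could be computed, assuming a simple weighted sum of normalised indicators. The indicator names, weights, and threshold are hypothetical and are not drawn from any real deployed system.

```python
# Illustrative sketch only: a hypothetical area-risk scoring routine of the
# kind the scenario describes. Indicator names, weights, and the threshold
# are invented for illustration.

RISK_WEIGHTS = {
    "hotspot_density": 0.4,        # historical crime density in the area
    "socio_economic_factor": 0.2,  # socio-economic indicators for the area
    "mobility_anomaly": 0.4,       # deviation in aggregated mobile-phone movement data
}

def area_risk_score(indicators: dict[str, float]) -> float:
    """Combine normalised indicators (each in [0, 1]) into a single score."""
    return sum(RISK_WEIGHTS[name] * value for name, value in indicators.items())

score = area_risk_score(
    {"hotspot_density": 0.8, "socio_economic_factor": 0.3, "mobility_anomaly": 0.6}
)
if score > 0.5:  # threshold chosen arbitrarily for the example
    print(f"Area flagged as high-risk (score={score:.2f}); surveillance prioritised")
```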

Federico is an Italian political activist. He studied chemistry but is unemployed. As a teenager, Federico was a musician, and through his music he protested against xenophobia and racism. Because of his beliefs, he was often a victim of far-right extremists, but he never gave up on his ideas. Last year, Federico and his parents visited some family friends in Barcelona for two weeks. During their stay, his mother was feeling rather weak, so they mainly relaxed at their friends’ hotel without visiting tourist attractions. At the same time, riots against austerity broke out on the streets of Barcelona, and several people were prosecuted. On their way back to Italy, Federico and his parents were asked a few questions by the airport security staff.

Two weeks after their return to Italy, Federico bought a ticket online for a big concert that didn’t match his music taste. Political figures from the government would also attend this concert. That same afternoon, Federico joined a Telegram group calling for action against European austerity policies. Some of the concert’s technicians were also members of this group, as were several left-wing extremists.

On the night of the concert, Federico notices a drone following him. He has already had a difficult day. Suspecting that his past might still be triggering algorithmic systems to surveil him, he gets angry. The sensors in his car record Federico’s tension, and the algorithm flags him as a high-risk case. The AI-powered system sends a signal to the nearest available operational unit, selected on the basis of distance as well as equipment, skills, and experience. A police car approaches him a few minutes later, and the police officer asks him to follow them to the nearest police station. Federico objects but complies with the request. He is soon released, as his case was a false positive. Police officers enter the new data into the system, and Federico’s score is updated.

Scenario 2

AI investigator. Case closed

Crime investigation

A woman is found dead in her house. Mary was 36 years old, single, and a lawyer. A friend of hers called the police after Mary didn’t show up at their meeting and wouldn’t answer her phone. The friend went to Mary’s house and, even though Mary’s car was in its parking space, no one opened the door. The police arrive and secure the scene. The investigative police officers collect evidence. They are equipped with advanced AI-assisted technology: body-worn cameras that scan the space and digitise evidence, which can be analysed in real time and compared with relevant local, national, and European databases to evaluate the reliability of the evidence and suggest potential patterns.

The police system gains access to her phone and analyses the extracted data, such as her journey home, who she contacted in her last hours, and the locations of relevant activities and communications. The system also gets access to her messages. All collected evidence is stored in a digital archive. AI systems run through databases of similar crimes looking for patterns. The system suggests further investigative steps to the police officers and flags potential suspects. It then scans the potential suspects’ digital archives and provides a ranking to the police officers. LEAs do not always have complete data; however, the scan runs through diverse databases, including those of private data providers, and the algorithm is adjusted to minimise false positives.
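The ranking step could, for instance, work by accumulating weighted match signals per suspect across the available databases. The sketch below assumes exactly that; the database names, source weights, and match strengths are invented for illustration.

```python
# Illustrative sketch only: a hypothetical suspect-ranking step of the kind
# the scenario describes. Database names, weights, and values are invented.

from dataclasses import dataclass

@dataclass
class Match:
    suspect_id: str
    source: str       # e.g. "national_db", "private_provider"
    strength: float   # similarity of the evidence match, in [0, 1]

SOURCE_WEIGHTS = {"national_db": 1.0, "european_db": 0.9, "private_provider": 0.6}

def rank_suspects(matches: list[Match]) -> list[tuple[str, float]]:
    """Aggregate weighted match strengths per suspect and sort descending."""
    scores: dict[str, float] = {}
    for m in matches:
        scores[m.suspect_id] = scores.get(m.suspect_id, 0.0) + SOURCE_WEIGHTS[m.source] * m.strength
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

ranking = rank_suspects([
    Match("suspect_A", "national_db", 0.7),
    Match("suspect_B", "private_provider", 0.9),
    Match("suspect_A", "european_db", 0.4),
])
print(ranking)  # suspect_A outranks suspect_B: corroboration across databases accumulates
```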

Police investigators use the insights of explainable AI to understand the criteria behind the suspect rankings and to assess the evidence. They upload their assessments to the system, and the ranking is adjusted further. Additional evidence is collected from CCTV cameras in the main suspects’ areas, their mobile phones, and the smart devices in their homes. The system produces a report for each main suspect, highlighting the evidence that flags them and suggesting interrogation questions that could complete the data.

Police officers send the case files of the suspects for prosecution. During the interrogation, an AI-based CCTV camera analyses the emotions and facial expressions of the suspects, informing the system in real time and assisting the process. The perpetrator confesses and is arrested.

Scenario 3

Don’t shoot the artist

Cyber Operations

Crimes of child pornography and exploitation have been rising with the increased use of the internet and the widespread use of the dark web. At the same time, the number of human operators experiencing post-traumatic stress disorder and other mental health issues due to daily exposure to child pornography is rising dramatically. LEAs have therefore been using an AI system that crawls the web, including social media sites, for images of child sexual abuse. The system allows automated processing, assessment, and prioritisation of child sexual abuse material (CSAM). In addition, once such material is flagged, the system records the ‘journey’ of the material and identifies all internet users, including those on the dark web and on peer-to-peer file-sharing networks, who interacted with it by posting, reposting, downloading, saving, processing, and so on.
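One plausible way to implement the flagging of known material is fingerprint matching against a reference database. The sketch below assumes a plain cryptographic hash for simplicity; real systems rely on perceptual hashing, which tolerates resizing and re-encoding, and every value shown here is a placeholder.

```python
# Illustrative sketch only: matching crawled images against a reference list
# of known-material fingerprints. This example uses plain SHA-256 for
# simplicity; the fingerprint set below is a placeholder, not real data.

import hashlib

KNOWN_MATERIAL_HASHES = {
    "9f2c...",  # placeholder fingerprints supplied by a reference database
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a content fingerprint for a crawled image."""
    return hashlib.sha256(image_bytes).hexdigest()

def flag_if_known(image_bytes: bytes, source_url: str) -> bool:
    """Return True (and record the source) when the image matches known material."""
    if fingerprint(image_bytes) in KNOWN_MATERIAL_HASHES:
        print(f"flagged: {source_url}")  # the 'journey' of the material starts here
        return True
    return False

flag_if_known(b"example image bytes", "http://example.org/img.jpg")  # no match: False
```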

The system then automatically crawls online sources for complementary investigative information, in compliance with national legal requirements, and produces a score flagging the users who represent a high risk. The scoring algorithm is based on each user’s history, online activity, and other factors such as demographics and network connections. The criteria used by the algorithm are not public. LEAs have access to private databases for the flagged users.

The use of the system has proved efficient in many cases, and human operators now have to assess a much smaller volume of child abuse material, mainly in cases where the automated results are contested or further investigation is required. A huge volume of such material has been removed from the internet, and many abusers have been jailed.

John is a 42-year-old Englishman who moved to Greece after Brexit. John works as a photographer. He is homosexual, and last year he and his partner adopted a two-year-old child. John mainly promotes his work through social media such as Instagram, TikTok, and YouTube, where he shares photos as well as “behind the scenes” snapshots with photography tips. John is inspired by the seaside, which is why he chose to live on a small Greek island. Since he became a father, though, his main inspiration has been children and their relationships with adults, with the environment, and so on. In this context, he shares pictures online depicting young children in swimsuits with adults nearby. Recently, he joined online communities for parents and children. He is preparing an exhibition on the empowerment of children through photography and is conducting some research for it.

The automated system falsely identifies some of his photos as CSAM, as an algorithm embedded in the web crawler proved unfairly biased against specific characteristics (sexual orientation, age, background, etc.). All of his photos are removed, and his accounts are suspended. A police officer appears at John’s house and takes him to Athens for further investigation. He is falsely accused, and the accusations have terrible effects on his work and life. Even though he is discharged, the whole situation has ruined both his professional reputation and his relationships on the small island. He and his family decide to move elsewhere, and he slowly starts working again under a nickname. Together with other photographers, cartoonists, and artists, he forms a campaign group to make the algorithm fairer.

Scenario 4

Crossing the invisible borders

Migration, Asylum, and Border control

Brussels Airport has installed an AI-based intelligent video surveillance system that monitors travellers’ entire trip from check-in to boarding, using solely their face as a form of identification. The system applies facial recognition to footage from CCTV cameras installed throughout the airport; biometric templates created from the camera footage are compared with the travellers’ passports. In addition, the system monitors behaviour within the border control areas in order to produce warnings about potential anomalies and suspicious events. It also analyses a combination of behaviour and appearance risk indicators, both negative and positive, which contribute to an aggregated risk calculation. In cases where the system is triggered, the biometric templates are also compared against datasets of criminals and crime suspects.
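A minimal sketch of such an aggregated risk calculation, assuming additive positive and negative indicators, is shown below. The indicator names, weights, and trigger threshold are invented for illustration and do not describe any real airport system.

```python
# Illustrative sketch only: aggregating positive and negative risk indicators
# into a single traveller score, as the scenario describes. All names,
# weights, and the trigger threshold are invented.

INDICATORS = {
    # negative indicators raise the aggregated risk
    "entered_restricted_area": +0.5,
    "loitering_in_control_zone": +0.3,
    # positive indicators lower it
    "frequent_flyer_verified": -0.2,
    "documents_match_biometrics": -0.3,
}

def aggregated_risk(observed: set[str]) -> float:
    """Sum the weights of all observed indicators."""
    return sum(weight for name, weight in INDICATORS.items() if name in observed)

risk = aggregated_risk({"entered_restricted_area", "documents_match_biometrics"})
if risk >= 0.2:  # arbitrary trigger threshold for the example
    print(f"alert raised (risk={risk:+.2f}); comparing templates against watchlists")
```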

Joe enters Brussels Airport to catch his return flight home after a business trip. He is a journalist based in the Netherlands, and in Brussels he covered a special European Council meeting on EU migration and asylum policy. Joe is himself a migrant from Syria; his family managed to migrate when he was just two years old. Even though he only briefly lived in Syria, he was often treated differently because of his ethnicity. He nevertheless managed to study, and for the last three years he has been working as a freelance journalist.

Joe arrives early at the airport and, instead of proceeding to the security check, wanders around the arrivals area while talking on the phone with a colleague. Without realising it, and in search of a quiet place, he walks just inside a restricted area of the airport as he makes phone calls and checks his messages and emails.

The AI-based video surveillance analysis raises an alert based on a combination of risk indicators triggered by his appearance, behaviour, and current location. The alert activates automated analysis across multiple datasets. Joe’s full history comes up, including his passport information, articles he has published, public posts on his social media, and CCTV footage from the demonstrations outside the European Parliament, where the special meeting took place.

Joe walks towards his gate, where he attempts to scan his ticket. The attempt fails, and in the meantime a security officer appears and asks him to follow them. Joe is not surprised, as he is aware of the AI-based intelligent video surveillance system installed in the airport. Putting on his journalist hat, he asks for a report on the algorithmic decision. The officer cannot disclose the AI explainability report they received, as the indicators are classified on grounds of public safety. However, the officer displays the EU certification attesting that the AI system has been assessed and validated as operating in a responsible and trustworthy manner. Joe files an official request for full disclosure before continuing his journey.

Scenario 5

Guilty till proven innocent

Administration of justice

AI systems have been gradually employed in the courts of the European Member States. Indeed, the use of AI to support decision-making at every stage of the criminal justice system is encouraged, given the large number of cases to be judged. In this line, algorithmic tools have been assisting the decision on whether a prosecuted person should be released immediately as innocent, should receive a financial penalty, or should have their case assessed in court. At this stage, the AI system is built on data from diverse sources, including the prosecuted person’s history in the police and national databases, as well as all the evidence collected throughout the investigation. Depending on the seriousness of the crime, the data can be complemented with material from social media and the web.

If the case goes to court, the system is further fed, in real time, with the evidence presented. At the end of the hearing, the system makes its calculations based on all the data, looking for patterns and comparing the case with similar past ones. Finally, it suggests to the judge the individual’s risk of reoffending within the following five years, classified as low, medium, or high. The score is accompanied by a report indicating the data and the criteria on which it is based.
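As an illustration, the banding step could look like the sketch below, which maps a reoffending probability onto the three bands and attaches the main criteria to the report. The thresholds and criteria names are hypothetical, not taken from any real judicial tool.

```python
# Illustrative sketch only: turning a recidivism-risk probability into the
# low/medium/high bands the scenario mentions, with the driving criteria
# attached. Thresholds and criteria names are invented for illustration.

def risk_band(probability: float) -> str:
    """Map a 5-year reoffending probability in [0, 1] onto three bands."""
    if probability < 0.33:
        return "low"
    if probability < 0.66:
        return "medium"
    return "high"

def report(probability: float, top_criteria: list[str]) -> str:
    """Produce the short report that accompanies the score."""
    return (
        f"Risk of reoffending within 5 years: {risk_band(probability)} "
        f"({probability:.0%}). Main criteria: {', '.join(top_criteria)}."
    )

print(report(0.72, ["prior police record", "similarity to past cases"]))
```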

Nadia is approaching a jewellery store when a man passes by, bumping into her in his rush. She ignores the incident and enters the store to buy a present for her mother’s birthday. As soon as she enters, the security door closes behind her, and a police officer arrests her. Nadia is totally confused. She tries to protest, but everything happens very quickly. In her bag, they find a stolen ring with a diamond. She knows she did not steal the ring, but she cannot prove it. Nadia was raised in a rather troubled household: her father was an alcoholic with a history of intimate partner violence, and the family was often in trouble with the police. She knows she has a police record even though she was the victim. Similarly, as a teenager she ended up at the police station following a fight with some girls who were bullying her at school.

The AI system assigns her a high-risk score, suggesting two years in prison. Nadia objects, and her lawyer asks for the CCTV footage of the area. The footage captures the scene where the man bumps into her while exiting the jewellery store. The system runs a facial recognition check, comparing the man’s face against other databases. The person is identified, but the system gives him a low score: he is a middle-class businessman with no police record.