Since the late 20th century, digitisation has hit policing with full force. From predictive policing methods through recidivism prediction to automated biometric identification at the border, more and more areas of police work rely on automation.
For example, since 2018 the EU's PNR Directive (Directive (EU) 2016/681) has made it compulsory for airlines to transmit Passenger Name Records (PNR) of all passengers to police authorities. The result is a very large database (over 12 million entries in Austria alone) that is algorithmically sifted to detect “suspicious” people and behaviour. Due to its predictive and explorative nature, the PNR system constitutes an example of the problematic technique of predictive policing.
Automatically generated suspicion is legally contested, as the new data protection regime included in the Data Protection Directive for Police and Criminal Justice Authorities (Directive (EU) 2016/680, Art 11) explicitly restricts automated decisions. The PNR Directive in particular has been claimed to violate fundamental rights, and cases against it are being litigated in Austria, Belgium and Germany, two of which are currently pending before the European Court of Justice. Moreover, automation is also an issue from a workers' rights perspective, since it raises questions of autonomy and personal responsibility for the people ultimately tasked with enforcing the law.
In our contribution, we will consider automation in policing (in particular, the PNR system) from the viewpoint of fundamental rights and data protection law as well as workers' rights. We will analyse the legal situation and address accountability and transparency in predictive policing systems.