New technologies in the judicial system are a “threat to a fair trial”

The House of Lords Justice and Home Affairs Committee said it had been “baffled by the proliferation of artificial intelligence tools potentially being used without proper oversight, particularly by police forces across the country” in a concerned report on the use of new technologies in the justice system, which also warns that “an individual’s right to a fair trial could be compromised by algorithmically manipulated evidence”.

“Facial recognition may be the best known of these new technologies, but in reality there are many others already in use, and more are being developed all the time. Algorithms are being used to improve crime detection, assist in the security categorisation of prisoners, streamline entry clearance processes at our borders, and generate new information that feeds the entire criminal justice pipeline,” the Committee said on March 30.

The market, peers warned, is a “Wild West”: public agencies and the 43 police forces are free to individually commission the tools they like or buy them from companies entering a booming AI market, yet public buyers often “don’t know much” about the systems they are buying because there are “no minimum scientific or ethical standards that an AI tool should respect before it can be used…”

See also: National police computer faces more delays

Among the tools being rolled out across the UK is one that uses data analytics and machine learning to “provide a prediction of the likelihood of an individual committing a violent or non-violent offence over the course of the next two years”. (This was described in oral evidence before the Committee by Kit Malthouse MP, Minister of State for Crime and Policing, as “a modern phenomenon of the policeman knowing who the bad guys were in the community.”)

As with the use of live facial recognition technologies, ministers have been reluctant to legislate, preferring to let deployments be challenged in court. Existing rules on human rights, data protection, discrimination and public administration make it difficult for law enforcement and their technology partners to get clarity on compliance. NCC Group told peers that “there are very few laws and regulations overseeing the safe and secure deployment of [Artificial Intelligence] and [Machine Learning] technologies” and the Bar Council agreed: “For some technologies, such as AI, it is not clear that there is an effective legal framework.”

The report concludes that “a stronger legal framework is needed to avoid damaging the rule of law” and recommends that ministers introduce primary legislation to establish one.

(In written evidence, the Met Police called for a “code of practice [that] would provide a framework for ethical decision-making when considering whether to use a new technology. Ideally, to ensure consistency, effective oversight, longevity and predictability, it would focus on types of technology rather than seek to regulate a specific tool or deployment methodology. Areas that would be useful to cover include artificial intelligence, advanced data analytics, sensor data, automation and biometrics, but in a way that could apply to anything from drone applications and ANPR to a personnel database or property management system. The approach would be based on the policing goals pursued, individuals’ expectations of privacy informed by community engagement, alternatives to intrusion, and ways in which effectiveness and demographic performance can be evaluated…”)

A vicious circle?


A contributor also told the Committee that algorithmic technologies could be used without the knowledge of the court and that, in some cases, evidence could have been “tampered with”. David Spreadborough, a forensic analyst, gave the example of algorithmic error correction built into CCTV systems, meaning that “parts of a vehicle, or a person, could be artificially constructed” without the court being aware of it.

Another contributor suggested to the Committee during oral evidence that the judiciary might feel compelled to defer to algorithmic suggestions, making judges “the long arm of the algorithm”, with peers noting that “a solid understanding of where advanced technologies may appear, how they work, their weaknesses and how their validity will be determined is therefore an essential safeguard of the right to a fair trial”.

Oversight of this rapidly evolving “Wild West”, where it exists, is sprawling (more than 30 government agencies, initiatives and programmes play a role*), the report warns, noting that “the number of entities and public bodies that have a role in the governance of these technologies indicates duplication and a lack of cohesion”.

With regard to the rapid rise of new technologies in the justice system, “there appears in practice to be a considerable disconnect within government, exemplified by confusing and duplicative institutional oversight arrangements and resulting in a lack of coordination. Recent attempts at harmonisation have, if anything, further complicated an already cluttered institutional landscape. A thorough review across all departments is urgently required,” the House of Lords Justice and Home Affairs Committee concluded in its report.

Ultimately, it suggested, “the government should establish a single national body to govern the use of new technologies for law enforcement. The new national body should be independent, established on a statutory basis and have its own budget.” The full report is here.

*These include:

  • Her Majesty’s Inspectorate of Constabulary and Fire & Rescue Services
  • The AI Council
  • The Association of Police and Crime Commissioners (APCC) and its various working groups and initiatives, including the APCC Working Group on Biometrics and Data Ethics
  • The Biometrics and Forensic Ethics Group
  • The Biometrics and Surveillance Camera Commissioner
  • The Centre for Data Ethics and Innovation
  • The College of Policing
  • The Data Analytics Community of Practice
  • The Equality and Human Rights Commission
  • The Forensic Science Regulator
  • The Home Office Digital, Data and Technology function
  • The Independent Office for Police Conduct
  • The Information Commissioner’s Office
  • The National Crime Agency and its TRACER program
  • The National Data Analytics Solution
  • The National Guidance Group on Digital and Data Ethics
  • The National Digital Exploitation Centre
  • The National Police Chiefs’ Council, and its eleven coordinating committees, each responsible for a specific aspect of new technologies
  • The National Police Ethics Group
  • The National Police Chief Scientific Adviser
  • The Office for Artificial Intelligence
  • The Police Digital Service, its Data Office and its Chief Data Officer
  • The Police Rewired Initiative
  • The Police Science, Technology, Analysis and Research (STAR) Fund
  • The Police Science and Technology Investment Board
  • The Royal Statistical Society
  • The Scientific Advisory Council to the National Police Chief Scientific Adviser
  • The Senior Data Governance Panel within the Ministry of Justice
  • The specialized and general ethics committees of certain police forces
  • The Tackling Organised Exploitation programme