AI is already being tested in Malaysian courts, but here's why it might not be a good idea yet

In January, it was reported that the Chief Justice of Sabah and Sarawak, Tan Sri Abang Iskandar Abang Hashim, said that the courts of Sabah and Sarawak had implemented artificial intelligence (AI) as part of a pilot project. But as AI is now being tested and used in court, some fear there has been “no proper consultation on the use of the technology”, and that it is “not contemplated in the country’s criminal code”.

“Our Criminal Procedure Code does not provide for the use of AI in the courts… I think it is unconstitutional,” lawyer Hamid Ismail said.

Hamid said he was “uncomfortable” that the technology was being used before lawyers, judges and the public fully understood it. The AI tool, which was being tested as part of the pilot project, was used in the sentencing of a man he was defending.

The AI software was developed by Sarawak Information Systems, a state government company. According to authorities, AI-based systems make sentencing “more consistent and can eliminate case backlogs quickly and inexpensively”. The technology also “helps all parties to legal proceedings to avoid long, costly and stressful litigation”.

However, critics have warned of the risks of AI bias against minorities and marginalized groups, arguing that the technology lacks a judge’s ability to assess individual circumstances or adapt to changing social mores. For example, cases of AI favoring men over women or penalizing members of ethnic minorities have been reported.

This is not a new finding, as newly developed AI models often exhibit bias quite quickly, such as when a low-resolution photo of Barack Obama was uploaded to an AI face depixelizer and the result was a white man.

“During sentencing, judges not only consider the facts of the case, they also consider mitigating factors and use their discretion. But the AI cannot exercise discretion,” Hamid noted.

To try to prevent its AI software from making biased judgments, Sarawak Information Systems said it removed the “race” variable from the algorithm. It was also noted that the company used only a five-year dataset of cases from 2014 to 2019 to train the algorithm, “which seems somewhat limited compared to the large databases used in global efforts”. However, there is no information on whether the company has since expanded its dataset.
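Removing a protected attribute in this way is sometimes called “fairness through unawareness”. Sarawak Information Systems has not published its code or data schema, so the sketch below is purely illustrative, using hypothetical column names and toy data, of what dropping a “race” variable before training might look like in Python:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical case records -- the company has not published its schema,
# so these columns and values are illustrative only.
cases = pd.DataFrame({
    "offence_code":   [1, 2, 1, 3, 2, 1],
    "prior_offences": [0, 1, 2, 0, 3, 1],
    "race":           ["A", "B", "A", "C", "B", "A"],  # protected attribute
    "custodial":      [0, 1, 1, 0, 1, 0],              # outcome being predicted
})

# "Fairness through unawareness": exclude the protected attribute
# (along with the outcome label) from the training features.
X = cases.drop(columns=["race", "custodial"])
y = cases["custodial"]

model = LogisticRegression().fit(X, y)
print(model.predict(X))
```

A known limitation of this approach is that dropping the column does not remove bias on its own: other features that correlate with race, such as location or prior-offence history, can still act as proxies for it, which is part of why critics remain wary.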

Analysis by the Khazanah Research Institute showed that judges followed the AI’s sentencing recommendation in a third of cases, all of which involved rape or drug possession, under the terms of the two states’ pilot project. For the other two-thirds, some judges reduced the suggested sentences, while others toughened them on the grounds that they would not have a “strong enough deterrent effect”.

“Many decisions could properly be left to machines. (But) a judge should not outsource discretion to an opaque algorithm,” said Simon Chesterman, senior director at AI Singapore.

[ SOURCE, IMAGE SOURCE ]
