"System 206" is a Chinese Artificial Intelligence program that takes the place of a prosecutor and this is not at all like every dystopian movie you've ever seen so stop thinking that right now.
· Jan 1, 2022 · NottheBee.com

Leave it to the enterprising Chinese Communist Party to fully automate oppression.

The name, "System 206" is as chilling and bloodless sounding as it is antiseptically anodyne.

It would also make a great name for a punk band.

Designed by the state to relieve their prosecutors of the relentless burden of punishing thought crimes, System 206 is expected to become self-aware at 2:14 AM Eastern Time on August 29th, 1997…

Wait, sorry, lost myself there for a moment.

The machine was "trained" using more than 17,000 cases from 2015 to 2020. So far, it can identify and press charges for Shanghai's eight most common crimes.

They are credit card fraud, running a gambling operation, dangerous driving, intentional injury, obstructing official duties, theft, fraud...

Well, you know, those all seem pretty reasonable crimes to...

...and "picking quarrels and provoking trouble" – a catch-all charge often used to stifle dissent.

A prosecutor in the southern city of Guangzhou said he had some concerns about the use of AI in filing charges.

"The accuracy of 97 per cent may be high from a technological point of view, but there will always be a chance of a mistake," said the prosecutor, who requested not to be named because of the sensitivity of the issue.

Not to mention the sensitivity of never seeing his family again.

"Who will take responsibility when it happens? The prosecutor, the machine or the designer of the algorithm?"

I have a guess!
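He's not wrong to worry about that "chance of a mistake," either. A quick back-of-the-envelope calculation with the article's own numbers — treating 97 per cent as a flat accuracy rate over a caseload the size of the training set, which is my assumption, not the SCMP's:

```python
# Rough arithmetic: even at 97% accuracy, errors pile up at scale.
cases = 17_000          # size of the training set cited above
accuracy = 0.97

wrong = round(cases * (1 - accuracy))
print(f"Expected mistaken charging decisions: {wrong}")  # about 510
```

Five hundred or so bad calls, and that's before anyone settles who signs off on them.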

One clarification: the name "System 206" actually refers to an earlier AI program, and this new prosecutor AI appears to be a progression of it.

There are things that could definitely benefit from "streamlining." Getting your In-N-Out order completed on a timely basis. Renewing your driver's license. Assembling furniture you bought from Ikea and are the drawers supposed to be upside down because I don't remember that from the picture.

Justice, on the other hand, maybe we don't want to streamline that too much...

[The System 206] AI system captures the elements of a case in the electronic files, such as time, place, people, behavior and consequences, Ma, who is doing research on the system, told the Global Times on Tuesday.

Besides, the AI system is capable of identifying defective evidence to avoid wrongful convictions, the paper reported.

Some Chinese cities have used machines to monitor government employees' social circles and activities to detect corruption, according to researchers involved.

It's okay.

In the U.S., we're a long way off from the so-called idealized future promised by AI; we're still working out the bugs in forensic algorithms. A good example is a 2017 District of Columbia court case, in which an anonymous defendant nearly suffered the fallout from faulty programming presented as evidence in court.

Ah well, there's no sense standing in the way of modern progress. I'm sure it will all turn out just fine.

Just fine.

