The formatted article with media is available on our Substack.

We talk about "the justice system" like it's marble columns and human judgment. But justice mostly runs on text, and that text works like code. Laws are the rules. Courts interpret and apply those rules. And the real output: court decisions. The judicial system produces a text codebase, and like any codebase, it carries bugs, both cosmetic and fatal. Most remain invisible because no one scans the whole system.

AI as Justice Scanner

We asked AI to read 222 verdicts under Article 301, Ukraine's criminal provision on adult content. Just to check citations. Not to argue.

The Results:
– 27% of verdicts reference a 1923 anti-pornography convention Ukraine has never signed.
– 10% rely on a law repealed in 2023, sometimes years after its repeal.
– Some cite the Law on Media even when no media or platforms were involved (we have not yet managed to extract a reliable percentage with AI).

This shows that courts run on stale templates: legal copy-paste that outlives the law itself. Judges and clerks, drowning in caseloads, reuse old text without checking whether the law still exists.

The Scale Problem

A lawyer can notice one bad citation. Maybe a dozen. But reviewing 222 verdicts, at roughly an hour each, is over 200 hours of work. Courts generate hundreds of thousands of such documents. Unchecked patterns hide in plain sight for decades.

AI Changes the Equation

We built a simple pipeline: feed verdicts into a model, extract references, flag anomalies. A minimal sketch of this pipeline appears at the end of the article.

AI cost: under $7. Time: hours, not months. Accuracy: high enough.

Why It Matters

When we let outdated or irrelevant norms circulate unchecked, we weaken the very fabric of justice.

The Environment Is Editable

We published the results as a public dashboard. Anyone can click, read the original verdict, and see the pattern. We also built:
– Telegram channels that summarize new verdicts daily with AI.
– An interactive AI chat that answers questions like: "What's the pattern in Kyiv? What's the median sentence?"

All of this runs on open data. All of this was built without donor money, mostly off-hours. The method can scale to any legal area or geography: system-level visibility in weeks, not years.

Result of analysis (dashboard and methodology)

The Choice

Justice is part of our policy environment. If we don't maintain it, bugs accumulate until someone exploits them. We now have the tools to audit the codebase in real time. The question is whether enough of us will run the scan.

P.S. When AI models decide what topics we can or cannot research, as happened when Google AI refused to process our "prngraphy" dataset, this creates a new challenge: censorship by AI, even in justice analysis.
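Appendix: the pipeline, sketched

Below is a minimal Python sketch of the pipeline described above: read verdict texts, ask a model to extract legal citations, and flag matches against a known-stale list. It is illustrative, not our production code. The model name, the prompt, the file layout, and the two stale-reference entries (the 1923 convention, and the repealed law, which we take here to be the Law "On Protection of Public Morality") are assumptions for the sake of the example.

```python
# Minimal sketch: scan a folder of verdict texts for stale legal citations.
# Assumes verdicts are plain-text files in ./verdicts and that the
# OPENAI_API_KEY environment variable is set.
import json
from pathlib import Path

from openai import OpenAI

client = OpenAI()

# Known-stale references drawn from the article's findings.
# Entries are illustrative; a real run needs the exact legal names.
STALE_REFERENCES = {
    "1923": "anti-pornography convention Ukraine has never signed",
    "On Protection of Public Morality": "repealed in 2023",
}

PROMPT = (
    "Extract every legal reference (laws, conventions, codes, articles) "
    "cited in the court verdict below. Reply with a JSON array of strings "
    "and nothing else.\n\n"
)


def extract_references(verdict_text: str) -> list[str]:
    """Ask the model to list every legal citation in one verdict."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any cheap, capable model works
        messages=[{"role": "user", "content": PROMPT + verdict_text}],
        temperature=0,
    )
    content = response.choices[0].message.content.strip()
    # Models sometimes wrap JSON in a markdown fence; strip it defensively.
    if content.startswith("```"):
        content = content.strip("`").removeprefix("json").strip()
    return json.loads(content)


def flag_anomalies(references: list[str]) -> list[tuple[str, str]]:
    """Match extracted citations against the known-stale list."""
    flags = []
    for ref in references:
        for stale, reason in STALE_REFERENCES.items():
            if stale.lower() in ref.lower():
                flags.append((ref, reason))
    return flags


for path in Path("verdicts").glob("*.txt"):
    refs = extract_references(path.read_text(encoding="utf-8"))
    for ref, reason in flag_anomalies(refs):
        print(f"{path.name}: '{ref}' ({reason})")
```

The substring matching is deliberately crude; in practice the flagged verdicts still need a human pass, which is exactly what the public dashboard is for.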
Tags: Justice
Date: 14/08/2025
Type: Article