A Wake-Up Call for Every Criminal Lawyer
In a moment that has already gone viral internationally, Christchurch District Court Judge Tom Gilbert has put lawyers on notice: if your client’s apology letter reads like it was drafted by ChatGPT, it probably was. And the court is not impressed.
The case involved Michae Ngaire Win, 37, who pleaded guilty to arson, burglary, common assault and resisting police. The arson alone caused more than $500,000 in damage to her Christchurch rental property. During sentencing in February 2026, defence counsel Cindy Lee advised the court that Win had written letters of apology – one to the judge and one for the victims.
Judge Gilbert wasn’t convinced. The letters were, in his words, “nicely written” – perhaps a little too nicely written. So he did what any inquisitive judge in 2026 would do and ran the prompt “draft me a letter for a judge expressing remorse for my offending” through two AI tools.
The outputs matched Win’s letters almost verbatim, with only minor tweaks around the edges.
“It is clear these letters were generated by AI,” Judge Gilbert said in court. “The issue of remorse is interesting… But certainly when one is considering the genuineness of an individual’s remorse, simply producing a computer-generated letter does not really take me anywhere as far as I am concerned.”
Win later admitted using AI to “help” write the letters. Her lawyer argued that, with the advent of such technology, people shouldn’t be blamed for using it. The judge was unmoved: using the tool “undermines the sentiments” of the apology.
The result? Win received only a 5% discount for remorse (far less than sought). A starting point of three years and nine months, minus discounts for the guilty plea (15%), personal circumstances (20%) and remorse (5%), produced a final sentence of 27 months’ imprisonment plus $3,000 reparation. Home detention was declined despite the pre-sentence report.
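For readers checking the figures, the reported numbers do add up, assuming the three discounts are summed and applied once to the starting point (a common approach in New Zealand sentencing). The sketch below is an illustration of that arithmetic, not a statement of the court’s actual method:

```python
# Sentencing arithmetic from the article, assuming the discounts are
# summed and applied once to the starting point. The figures come from
# the reported judgment; the calculation method is an assumption.
starting_point_months = 3 * 12 + 9  # three years nine months = 45 months
discounts = {
    "guilty plea": 0.15,
    "personal circumstances": 0.20,
    "remorse": 0.05,  # the reduced discount Judge Gilbert allowed
}
total_discount = sum(discounts.values())  # 0.40
final_months = starting_point_months * (1 - total_discount)
print(final_months)  # 27.0 – matching the reported 27-month sentence
```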
The story has now been picked up globally – including a prominent feature in The New York Times on 17 February 2026.
In the piece, the judge’s transcript is laid bare, and the international press is framing it as a real-world test of whether AI can ever convey authentic human remorse.
Why this matters for lawyers
Remorse remains a statutory mitigating factor under the Sentencing Act 2002. But as Judge Gilbert has now made crystal clear, courts expect genuine remorse – not polished prose that could have been generated in 30 seconds by anyone with a free ChatGPT account.
For criminal lawyers the implications are immediate and practical:
- Disclosure and candour: Do you now have an ethical obligation to disclose AI assistance when submitting client statements or letters to the court? The Rules of Conduct and Client Care don’t explicitly say so yet – but credibility is everything in sentencing. Getting caught out (as happened here) risks exactly the outcome Win received.
- Verification is now part of your job: Many lawyers already ask clients to write remorse letters in their own handwriting. That practice just became essential risk management. Or at the very least, have the client sit with you and explain, in their own words, why the letter reflects their true feelings.
- Client advice: The next time a client asks you to “help with the apology letter”, your standard response should probably include a clear warning: “The judge will be looking for authenticity. AI can help structure thoughts, but it can’t create genuine remorse.”
Judge Gilbert was careful not to demonise the technology itself. He described AI as “a good tool” and called the situation “tricky”. Crown prosecutor Jade Lancaster noted it was the first AI-generated apology letter she had seen but warned it may not be the last.
This is exactly the kind of case that accelerates the conversation New Zealand lawyers have been having in boardrooms and chambers for the past 18 months: how do we responsibly integrate AI into practice without undermining the human elements the courts still demand?
The AI remorse era has officially arrived in New Zealand courts. And Judge Gilbert has just drawn a line in the sand.