Sullivan & Cromwell told a US federal bankruptcy court that a major filing it made in a high-profile case contained multiple “hallucinations” produced by AI software.
Andrew Dietderich, the head of S&C’s restructuring practice, apologised in a letter to New York federal judge Martin Glenn on Saturday for errors that included misquoting the US bankruptcy code and citing cases incorrectly in a court filing made on April 9.
“We deeply regret that this has occurred,” he said in the letter.
Dietderich said the firm’s policies on the use of AI had not been followed when the document was prepared, and that it was considering whether it needed to make “further enhancements” to its internal training and review processes.
The letter did not say which lawyers prepared the documents or whether they were still at the firm. S&C declined to comment.
The errors are the latest example of a professional services firm grappling with the use of cutting-edge technology to speed up laborious research and cut down on staffing while also trying to maintain quality standards.
The case in question revolves around S&C’s representation of liquidators appointed by legal authorities in the British Virgin Islands who are pursuing actions against Prince Group and its owner Chen Zhi.
US federal prosecutors last year charged Zhi with wire fraud and money laundering, accusing him of “directing Prince Group’s operation of forced-labour scam compounds across Cambodia . . . that stole billions of dollars from victims in the United States and around the world”.
In a separate action, US prosecutors also filed a civil forfeiture complaint seeking to seize nearly $9bn worth of bitcoin that the US government said represented the proceeds of the Prince Group crimes. Zhi was arrested earlier this year in Cambodia and extradited to China after a request from Beijing.
Prince Group is incorporated in the British Virgin Islands, and the Chapter 15 proceeding in the US court system is designed to have the US courts formally recognise the powers of the BVI liquidators to represent creditors and victims in US legal proceedings, the liquidators told the court.
In multiple instances, S&C in the April 9 filing erroneously summarised the conclusions reached in other cases, according to a list of strike-through corrections the firm submitted to the judge.
S&C has an enterprise licence for ChatGPT, according to several people familiar with the firm’s operations. According to S&C’s website, at least five senior partners have been assigned to the Prince Group bankruptcy case.
The firm’s partners typically charge more than $2,000 per hour in bankruptcy cases. The firm earned several hundred million dollars in fees representing crypto exchange FTX in its bankruptcy liquidation.
Boies Schiller Flexner, the law firm representing Prince and Zhi, spotted the errors in S&C’s filing. In a document filed last week, BSF said words that S&C had quoted in its motion “do not appear in chapter 15 of the US Bankruptcy Code” and pointed to “multiple cited decisions” that were “misquoted or misidentified”.
It said a case cited by S&C in the motion “is not a case” and that the reference was to “a different decision in a different circuit”.
S&C told the court that the firm maintained “rigorous” standards when using AI tools and that it “instructs lawyers to ‘trust nothing and verify everything’”. Failure to verify AI-generated output “constitutes a violation of firm policy”, it said.
It is the latest in a series of errors by law firms using AI tools. Last year, Latham & Watkins admitted that one of its lawyers had used Anthropic’s Claude model to help draft a filing that contained a fabricated title and author for a journal article.
In another instance, a federal appeals court in New Orleans ordered a $2,500 sanction against a lawyer who had submitted a brief with 21 errors or fabrications that had been inserted by AI.
Separately, in September John Kucera, then a partner at BSF, said in a case against Amazon that a document for which he was responsible, prepared using AI tools, contained “material citation errors” due to his “failure to verify” details. “I am embarrassed by and very much regret these errors”, he said in the filing. BSF did not respond to a request for comment.
S&C told the judge overseeing the Prince Group case that its document review also revealed “non-substantive and/or clerical errors in other filings in this matter”. The firm said those errors were made by humans, not AI.