
New York lawyers sanctioned for using fake ChatGPT cases in legal brief

The judge found the lawyers acted in bad faith and made 'acts of conscious avoidance and false and misleading statements to the court.'

Reuters
The logo of OpenAI is displayed near a response by its AI chatbot ChatGPT on its website, in this illustration picture taken Feb 9. Photo: Reuters

A US judge on Thursday imposed sanctions on two New York lawyers who submitted a legal brief that included six fictitious case citations generated by an artificial intelligence chatbot, ChatGPT.

US District Judge P Kevin Castel in Manhattan ordered lawyers Steven Schwartz, Peter LoDuca and their law firm Levidow, Levidow & Oberman to pay a US$5,000 (about RM23,300) fine in total.

The judge found the lawyers acted in bad faith and made "acts of conscious avoidance and false and misleading statements to the court."

Levidow, Levidow & Oberman said in a statement on Thursday that its lawyers "respectfully" disagreed with the court that they acted in bad faith.

"We made a good faith mistake in failing to believe that a piece of technology could be making up cases out of whole cloth," the firm's statement said.

Lawyers for Schwartz said he declined to comment. LoDuca did not immediately reply to a request for comment, and his lawyer said they were reviewing the decision.

Schwartz admitted in May that he had used ChatGPT to help research the brief in a client's personal injury case against Colombian airline Avianca and unknowingly included the false citations. LoDuca's name was the only one on the brief that Schwartz prepared.

Lawyers for Avianca first alerted the court in March that they could not locate some cases cited in the brief.

Bart Banino, a lawyer for Avianca, said on Thursday that irrespective of the lawyers' use of ChatGPT, the court reached the "right conclusion" by dismissing the personal injury case. The judge in a separate order granted Avianca's motion to dismiss the case because it was filed too late.

The judge wrote in Thursday's sanctions order that there is nothing "inherently improper" in lawyers using AI "for assistance," but he said lawyer ethics rules "impose a gatekeeping role on attorneys to ensure the accuracy of their filings."

The judge also said that the lawyers "continued to stand by the fake opinions" after the court and the airline questioned whether they existed. His order further required the lawyers to notify each of the real judges falsely identified as authors of the fabricated cases about the sanction.