- By Kathryn Armstrong
- BBC News
A New York lawyer is facing a court hearing of his own after his firm used AI tool ChatGPT for legal research.
A judge said the court was faced with a “unique circumstance” after a submission was found to refer to examples of legal cases that did not exist.
The lawyer who used the tool told the court he was “unaware that the content could be false”.
ChatGPT produces original text on request, but comes with warnings that it may “produce inaccurate information”.
The original case involved a man suing an airline for an alleged personal injury. His legal team filed a brief citing several previous court cases in an attempt to prove, using precedent, why the case should go forward.
But the airline’s lawyers later wrote to the judge to say they could not find any of the cases referenced in the brief.
“Six of the submitted cases appear to be false judicial decisions with false citations and false internal citations,” Judge Castel wrote in an order requiring the man’s legal team to explain.
In the course of several filings, it emerged that the research was not prepared by Peter LoDuca, the attorney for the plaintiff, but by a colleague of his at the same law firm. Steven A Schwartz, who has been a lawyer for more than 30 years, used ChatGPT to look for similar past cases.
In his written statement, Schwartz clarified that LoDuca had not been part of the research and had no knowledge of how it had been conducted.
Schwartz added that he “profoundly regrets” relying on the chatbot, which he said he had never used for legal research before and was “unaware that the content could be false”.
He has vowed never to use AI to “supplement” his legal research in the future “without absolute verification of its authenticity”.
Screenshots attached to the filing appear to show a conversation between Schwartz and ChatGPT.
“Is varghese a real case,” says a message referring to Varghese v. China Southern Airlines Co Ltd, one of the cases no other lawyer could find.
ChatGPT replies that yes, it is – prompting “S” to ask: “What’s your source”.
After “double checking”, ChatGPT replies again that the case is real and can be found on legal reference databases such as LexisNexis and Westlaw.
It says the other cases it has given to Schwartz are also real.
Both attorneys, who work for the firm Levidow, Levidow & Oberman, have been ordered to explain why they should not be disciplined at a June 8 hearing.
Millions of people have used ChatGPT since it was launched in November 2022.
It can answer questions in natural, human-like language, and it can also mimic other writing styles. Its answers are based on the internet as it was in 2021.
There have been concerns over the potential risks of artificial intelligence (AI), including the potential spread of misinformation and bias.