
St. Louis attorney cautions lawyers about using AI for research, but says it could still be used effectively

MADISON - ST. CLAIR RECORD

Thursday, November 28, 2024


Tungate | Downey Law Group

ST. LOUIS – Lawyers who trusted the artificial intelligence tool ChatGPT for research cited fake cases, and a judge fined them $5,000, Paige Tungate of Downey Law Group told asbestos lawyers at HarrisMartin's annual Midwest Asbestos Litigation Conference on Sept. 20.

“Never assume AI is accurate,” Tungate said.

“Don’t trust everything it gives you.”

“It gives the answer you want.”

Tungate told lawyers in attendance at the conference that they're liable for research performed by artificial intelligence just as they are for research completed by non-lawyers.

She said artificial intelligence is only as good as the lawyer using it.

She drew laughter from the group by displaying text under a heading of "hallucination," noting that attendees likely thought hallucinations were human experiences.

Tungate explained that someone asked the AI for the world record for crossing the English Channel on foot, and ChatGPT provided a name, a time, and a date.

All three facts would have correctly answered a query about swimming the Channel.

She said ChatGPT invents facts to overcome uncertainty.

“It fabricates information, but it behaves as if it was spouting facts,” she said.

Tungate told the attorneys that for ethical purposes, they should keep abreast of relevant technology.

She told them not to worry that artificial intelligence will put them out of business but to worry that lawyers who use it effectively will put them out of business.

She said firms using artificial intelligence could spend fewer hours on research.

She added that it wouldn't be fair to charge clients for the time they would have spent without it.

Tungate said the concerns are accuracy, security, client expectations, competition, and transparency.

She said clients might ask them why their advice differs from ChatGPT's.

In the case of the $5,000 fine, Peter LoDuca of New York City represented Roberto Mata as counsel of record in U.S. district court, and his colleague Steven Schwartz represented Mata off the docket.

Mata sued Avianca airline last year, claiming a metal cart injured his knee on a flight from El Salvador to New York.

Avianca moved to dismiss, claiming international law applied and the international statute of limitations ran out.

LoDuca filed a response full of citations, and Avianca counsel Bartholomew Banino of New York City couldn’t locate them.

District Judge Kevin Castel ordered LoDuca to file the cases, and LoDuca filed eight in April.

Banino advised Castel he still couldn’t locate them, and Castel ordered an explanation in May.

Schwartz admitted failure and took the blame by affidavit, stating LoDuca had no role in the research and no knowledge of how it was conducted.

At a hearing in June he said, "I heard about this new site which I assumed, I falsely assumed, was like a super search engine called ChatGPT."

He said he asked it about the international statute of limitations, “and then I asked it to provide case law, and it did.”

Castel said, “You were asking them to produce cases that support the proposition you wanted to argue, right?”

Schwartz said that was right, and Castel said, “The computer complied. It provided you with a case. It wrote a case. It followed your command.”

Schwartz said, "I just never could imagine that ChatGPT would produce fabricated cases."

He said he assumed it searched sources he didn’t have access to.

He said he was embarrassed, humiliated, and extremely remorseful.

HarrisMartin Publishing presented the conference at the Four Seasons Hotel in St. Louis.
