Judge calls out 'expert witness' for using AI chatbot

This isn't the first time using an AI chatbot to do legal work got someone in trouble.
By Matt Binder
An expert witness in a court case used Microsoft's AI chatbot Copilot to assess damages and was reprimanded by the judge. Credit: CFOTO/Future Publishing via Getty Images

If you find yourself needing an expert witness in a courtroom case, make sure they're not using an AI chatbot for their supposed expertise.

Last week, a New York judge reprimanded an expert witness in a real estate dispute case for using Microsoft's AI chatbot Copilot. 

The expert witness, Charles Ranson, used Copilot to generate an assessment of the damages that should be awarded to the plaintiff. The case was first reported by Ars Technica.

Copilot in court – a bad idea

The case at the center of this story involved a dispute over a $485,000 rental property in the Bahamas. The man who owned the property had died, and it was placed in a trust for his son. The deceased man's sister was responsible for executing the trust, but she was accused of breaching her fiduciary duty by delaying the sale of the property while using it for her own personal purposes.

A major part of winning the case for the son was proving that he had suffered damages as a result of his aunt's actions.

Ranson was brought on as an expert witness and tasked with assessing those damages.

While Ranson has a background in trust and estate litigation, according to Judge Jonathan Schopf, he had "no relevant real estate expertise." So Ranson turned to Microsoft's AI chatbot, Copilot.

Ranson apparently revealed his Copilot use in his testimony. When questioned about it, Ranson was unable to recall what prompts he used to assess the damages or what sources Copilot cited to arrive at its estimate. Ranson was also unable to explain how Copilot works.


The court then decided to use Copilot to see if it could arrive at the same estimate that Ranson provided. The court asked Copilot "Can you calculate the value of $250,000 invested in the Vanguard Balanced Index Fund from December 31, 2004 through January 31, 2021?"

Copilot produced a different answer on each of three attempts, and none of them matched Ranson's own Copilot-generated figure.

The court then asked Copilot whether it was a reliable source of information, to which Copilot replied that its outputs should always be verified by experts.

According to the judge, Ranson was adamant that AI tools like Copilot are in standard use in his industry; however, he was unable to cite a single source showing this to be true.

Ranson's AI chatbot use wasn't his only mistake, but the Copilot situation certainly damaged the expert witness's credibility. The judge found that the evidence showed the delay in the sale of the property not only caused no loss but actually generated additional profit for the son, and ruled that the aunt had not breached her fiduciary duty.

Not the first time, and probably not the last time

Ranson's use of Copilot as a supposed expert source is certainly not the first time an AI chatbot has landed someone in trouble in the courtroom.

Readers may recall lawyer Steven Schwartz, who last year relied on ChatGPT for legal research in a case involving an airline passenger injured during a flight. Schwartz was reprimanded after submitting filings that cited completely nonexistent cases. ChatGPT had simply invented prior cases, which Schwartz then included in his filings.

As a result, Schwartz and another lawyer at the firm he worked for were fined $5,000 by the court for "acting in bad faith."

The same scenario played out again earlier this year with another lawyer, Jae Lee, who used ChatGPT in her filings. Once again, ChatGPT hallucinated cases that did not exist.

In the Bahamas real estate case, Judge Schopf made a point of blaming not the AI chatbot but the user who cited it. Still, AI chatbots continue to proliferate online, and major tech companies like Google and Microsoft are ramping up their promotion of the technology.

