The matter before Johannesburg Regional Court Magistrate Arvin Chaitram was an attempt by a woman to sue the body corporate of the complex she lived in. However, the attempt backfired when her lawyers relied on an artificial intelligence programme to find case law in her favour, which they then shared with their opponents.
The magistrate, in his ruling, said: “The names and citations are fictitious, the facts are fictitious, the decisions are fictitious.” He cautioned lawyers to use proper legal research – “good old-fashioned independent reading” – rather than modern technology.
In the matter before the magistrate, Michelle Parker was intent on suing the trustees of the body corporate for making defamatory statements about her in an email. She was claiming R600 000 in damages.
A key issue that arose during the hearing was whether a body corporate could be sued. Parker’s lawyer suggested that the question had already been decided in case law, saying there were “several authorities”, but that he had not been able to access them before the hearing.
Lawyers for the body corporate indicated that they were not aware of any such authority.
“As the question appeared to be a novel one that could be dispositive of the entire action, the court requested that both parties make a concerted effort to source these authorities,” Magistrate Chaitram said.
The matter was adjourned for this purpose. During that time, there was correspondence between the parties. At some point, Parker’s attorneys forwarded a list of eight cases to the body corporate’s attorneys – each apparently showing that a body corporate can be sued for defamation and can, in turn, sue for defamation.
However, the body corporate’s lawyers could not access any of these cases and Parker’s lawyers could not furnish them with copies.
When the trial resumed, Parker’s advocate conceded that they had not found any cases. He explained that his attorney had sourced the cases sent to the opposition through ChatGPT.
“The attorneys used this medium to conduct legal research and accepted the results that it generated without satisfying themselves as to the accuracy,” the magistrate said. “As it turned out, the cases do not exist.
“Courts expect lawyers to bring a legally independent and questioning mind to bear, especially in novel legal matters, and certainly not to merely repeat in parrot-fashion the unverified research of a chatbot.”
He ordered Parker to pay punitive costs for the adjournment, saying it was “simply appropriate”, although the embarrassment associated with the incident was probably sufficient punishment for her attorneys.
It was reported in May this year that a lawyer in America had used ChatGPT – on the advice of his teenage children – to prepare for a personal injury case, resulting in him presenting fake cases to the court. His law firm was fined $5 000 after the court found that the lawyers’ brief included false citations which had been made up by ChatGPT.