Increasingly, chatbots, search engines and artificial intelligence-driven engagement platforms have been in the news, often for the amazing capabilities they demonstrate, but sometimes also for the dangers they pose. This raises the question of how far lawyers can go in relying on these AI interfaces to help them find answers and enhance their work.
Although AI and AI capability have been around for some time, the ChatGPT phenomenon that hit the world in 2022 has thrown a spotlight on the rapid progression and ‘unworldly’ capabilities of AI systems, and on how effective they are at assimilating huge amounts of data and giving uncannily accurate answers. Surely, then, the legal profession, which deals with large volumes of information and resource sifting, could also benefit from this enhanced technology?
As with most new inventions, however, the development of AI technology has not been without controversy. Relatively easy access to the technology has resulted in an increase in cheating and plagiarism allegations against students at leading universities in the UK and the USA.
AI is also one of the reasons for the strike by around 11 500 members of the Writers Guild of America, who, inter alia, demanded assurances that AI would not take their place as screenwriters in the future.
The legal fraternity has also not been immune to AI controversies. Our media recently reported on a case in the Johannesburg Regional Court in which lawyers argued a case using references to fake case law generated by ChatGPT. The case concerned whether a body corporate could be sued for defamation, and the legal representative for the plaintiff argued that previous judgments had dealt with this legal issue. When the parties attempted to track down these judgments, they found that, although ChatGPT had referred to examples of cases, the citations given related to different cases, some of which were not even relevant to defamation suits. The presiding magistrate noted in his judgment that the legal representatives for the plaintiff, by citing false case law found on ChatGPT, “had not intended to mislead the court, they were simultaneously simply overzealous and careless”. A punitive costs order was granted against the plaintiff.
The Gauteng Local Division of the High Court in Johannesburg also recently dealt with a bail appeal in which the use of Google by a presiding officer was scrutinised. In the matter of The State v James Junior Aliyu (Case number: A12/2023), Acting Judge P Johnson set aside an order of the Randburg Magistrates’ Court granting bail. In this matter, the presiding officer had used Google to find evidence of the existence of an extradition treaty between the USA and Nigeria. The High Court found that it is irregular for a court to search for evidence on Google, which had not been proven to be a reliable source of information, in order to contradict the arguments of one party or the other. By using Google, the magistrate in effect acted as a witness in the case, and it is not permissible for an independent judicial officer to give evidence from the bench.
In the USA, lawyers who submitted a court brief containing false citations generated by ChatGPT were fined $5 000, with US District Judge P. Kevin Castel noting that they had “abandoned their responsibilities” by using fake citations created by the artificial intelligence tool.
But does all of this mean that search engines and AI are off-limits for the legal profession? What do South Africa’s rules for legal practitioners have to say about this?
In South Africa, the Code of Conduct for all Legal Practitioners, Candidate Legal Practitioners and Juristic Entities, read with s 36(2) of the Legal Practice Act 28 of 2014, sets out the standards of conduct expected of legal practitioners.
The following provisions of the Code of Conduct for all Legal Practitioners are particularly relevant to the examples mentioned above:
• Paragraph 3.1 of Part II of the code requires a legal practitioner to maintain the highest standards of honesty and integrity;
• Paragraph 3.3 of Part II of the code requires a legal practitioner to treat the interests of their clients as paramount, provided that their conduct remains subject to their duty towards the court, the interests of justice, the observance of the law, and the maintenance of the ethical standards of the profession;
• Paragraph 18.15.1 of Part III of the code requires a legal practitioner not to represent anything that they may know to be untrue;
• Paragraph 57.1 of Part VI of the code states that a legal practitioner shall take all reasonable steps to avoid, directly or indirectly, misleading a court or a tribunal on any matter of fact or question of law.
Although these standards do not refer specifically to the use of AI, they require a legal practitioner to ensure that the information they use is accurate and reliable. Nor do they outlaw the use of AI and technology in performing legal services; as in other industries, such use is becoming more prevalent and even accepted.
However, when using search engines or AI in preparing legal arguments, a legal practitioner must still undertake the necessary due diligence to confirm that the information provided is indeed correct and reliable. Failing to do so may not only cause embarrassment, but may also amount to a breach of the practitioner’s duty towards the court and a contravention of the Code of Conduct for Legal Practitioners.
As much as technology and even AI may be here to stay, one can nevertheless take to heart the comment of Magistrate Arvin Chaitram of the Johannesburg Regional Court and remember that, when it comes to legal research, the efficiency of modern technology still needs to be infused with a dose of good old-fashioned independent reading.
Disclaimer: This article is the personal opinion/view of the author(s) and is not necessarily that of the firm. The content is provided for information only and should not be seen as an exact or complete exposition of the law. Accordingly, no reliance should be placed on the content for any reason whatsoever and no action should be taken on the basis thereof unless its application and accuracy have been confirmed by a legal advisor. The firm and author(s) cannot be held liable for any prejudice or damage resulting from action taken on the basis of this content without further written confirmation by the author(s).