Bertus Preller is a family and divorce law attorney and author of two books, with 35 years of experience.
In a viral essay titled "Something Big Is Happening" published on 9 February 2026, AI entrepreneur Matt Shumer draws a stark parallel between the early days of the COVID-19 pandemic and the current trajectory of artificial intelligence.
Shumer, co-founder and CEO of OthersideAI (the company behind HyperWrite, an advanced AI writing tool) and an investor in cutting-edge AI ventures such as Groq and Etched, has spent over six years building and funding AI startups.
With a background that includes founding tech companies while still in high school and studying at Syracuse University, Shumer positions himself as an insider witnessing seismic shifts.
He warns that AI is no longer a mere assistant but a force capable of autonomous, judgment-like decision-making, already displacing roles in tech and poised to do the same across professions, including law.
His essay, which has garnered over 80 million views on X (formerly Twitter), emphasises that the technology's exponential progress could eliminate 50% of entry-level white-collar jobs within one to five years, urging professionals to adapt urgently.
Shumer's insights resonate particularly in the legal field, where he recounts conversations with a managing partner at a major firm in the US who now spends hours daily using AI as a virtual team of associates. This reflects a broader disruption: AI is evolving from handling rote tasks to performing complex analyses, drafting briefs, and even simulating strategic thinking.
Yet, as Shumer notes, many lawyers remain sceptical or underutilise the technology, treating it like a basic search tool rather than a collaborator. This hesitation mirrors the legal sector's historical sluggishness in adopting innovations, from electronic filing to cloud-based systems, often due to concerns over reliability, ethics, and tradition.
In South Africa, this slow adoption is even more pronounced, compounded by resource constraints and a conservative professional culture. While global firms race ahead, South African practitioners have only recently begun experimenting with AI, starting around 2018 for document automation and accelerating during COVID-19 lockdowns.
Today, leading firms like Bowmans, ENS, Webber Wentzel, and Cliffe Dekker Hofmeyr are integrating tools such as Harvey AI for predictive analytics and contract review, reducing document processing time by up to 70%.
Initiatives like the University of Cape Town's Legal AI Clinic are training future lawyers, and companies such as Legal Interact and Legal & Tax have launched South Africa's first AI-powered legal bot to democratise access to justice.
Regulators, including the Financial Sector Conduct Authority (FSCA) and the South African Reserve Bank (SARB), are studying AI's impact through reports like their 2025 joint analysis, which highlights that banks and fintechs are leading adoption and plan AI investments exceeding R30 million in 2026.
The Department of Communications and Digital Technologies' National AI Policy Framework, released in late 2024, promotes sector-specific strategies to harness AI in healthcare, education, and finance, emphasising ethical innovation.
However, AI's influence brings significant risks, chief among them "hallucinations": fabricated yet plausible outputs that can mislead users.
This issue has surfaced in South African courts, underscoring the need for caution.
In the 2025 case of Mavundla v MEC: Department of Co-Operative Government and Traditional Affairs KwaZulu-Natal and Others [2025] ZAKZPHC 2, a legal team submitted heads of argument citing nine authorities, only two of which were real; the rest were ChatGPT-generated fictions. The High Court deemed this "irresponsible and unprofessional," referring the matter to the Legal Practice Council for investigation.
Similarly, in Northbound Processing (Pty) Ltd v South African Diamond and Precious Metals Regulator and Others (2025/072038) [2025] ZAGPJHC 661, counsel relied on a subscription-based AI tool called Legal Genius, which produced fictitious citations that went unverified under time pressure. The court condemned this as a breach of duty, emphasising that even negligent use of AI constitutes misconduct.
These cases highlight how early AI tools, prone to inventing case law or misstating principles, can erode judicial trust and bring the administration of justice into disrepute. Despite these pitfalls, advancements are addressing them. For instance, Legal Genius evolved to version 4.0 by incorporating Retrieval-Augmented Generation (RAG), which grounds responses in verified sources such as the South African Legal Information Institute (SAFLII) and LAW LIBRARY databases, sharply reducing hallucinations.
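To make the RAG idea concrete, here is a minimal, hypothetical sketch of the grounding discipline described above: the system may only cite material that a retriever has actually pulled from a verified corpus, and it declines to answer when nothing relevant is found. This is not Legal Genius's actual implementation; the tiny in-memory corpus, the keyword-overlap retriever, and all entry texts are illustrative assumptions standing in for a real database such as SAFLII.

```python
# Minimal sketch of Retrieval-Augmented Generation (RAG) grounding.
# The corpus entries are illustrative stand-ins for a verified legal
# database (e.g. SAFLII); they are NOT real retrieval results.

from dataclasses import dataclass


@dataclass
class Source:
    citation: str   # verified case citation
    summary: str    # verified summary text


# Tiny stand-in "verified" corpus (illustrative entries only).
CORPUS = [
    Source("Mavundla v MEC: COGTA KZN [2025] ZAKZPHC 2",
           "court referred fabricated AI citations to the Legal Practice Council"),
    Source("Northbound Processing v SA Diamond Regulator [2025] ZAGPJHC 661",
           "negligent reliance on AI-generated fictitious citations held to be misconduct"),
]


def retrieve(query: str, corpus=CORPUS, top_k: int = 1):
    """Rank sources by crude keyword overlap with the query (a stand-in
    for real vector or full-text retrieval)."""
    q = set(query.lower().split())
    scored = [(len(q & set((s.citation + " " + s.summary).lower().split())), s)
              for s in corpus]
    scored = [(score, s) for score, s in scored if score > 0]
    scored.sort(key=lambda t: t[0], reverse=True)
    return [s for _, s in scored[:top_k]]


def grounded_answer(query: str) -> str:
    """Answer only from retrieved, verified sources; refuse otherwise.

    This is the core RAG discipline: nothing may be cited that the
    retriever did not return from the verified corpus.
    """
    sources = retrieve(query)
    if not sources:
        return "No verified authority found; declining to answer."
    s = sources[0]
    return f"{s.summary} (per {s.citation})"
```

The refusal branch is the key design choice: a grounded system prefers "no answer" over an invented citation, which is exactly the failure mode the Mavundla and Northbound courts condemned.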
This shift from general models to specialised, fact-checked systems demonstrates the technology's potential when responsibly refined. Broader reforms are emerging: the Law Society of South Africa is reviewing proposed Ethics Guidelines for Generative AI, submitted in 2024, which stress transparency, verification, and accountability.
These align with international precedents, such as the English High Court's warnings in Ayinde v The London Borough of Haringey; Al-Haroun v Qatar National Bank, endorsed in South African judgments.
The disruptive nature of AI demands proactive engagement from lawyers. As Shumer advises, professionals should move beyond casual use: subscribe to advanced models, integrate them into workflows for tasks like summarising judgments or drafting pleadings, and iterate with clear prompts.
In South Africa, this could enhance access to justice in underserved areas, making services faster and more affordable under frameworks like the Protection of Personal Information Act (POPIA). Yet enthusiasm must not override ethics: the buck stops with the lawyer or the judge. Outputs must be rigorously checked against original sources, statutes, and precedents to avoid biases or errors.
Blind reliance has led to sanctions; diligent verification upholds professional integrity.
In conclusion, AI's integration into law, as forewarned by figures like Shumer, promises revolutionary efficiency but requires balanced reforms. South Africa's legal fraternity must accelerate adoption while prioritising ethical guidelines to navigate this transformation. By doing so, practitioners can harness AI's power to advance justice rather than risk being left behind in an increasingly automated world.
As a South African attorney with many years of hands-on experience in information technology and coding, I believe the legal profession here and elsewhere has been noticeably slow to adopt emerging technologies, largely because of a combination of unfamiliarity, caution, and a longstanding preference for tried-and-tested manual processes.
Artificial intelligence now offers genuine opportunities to make legal research, document preparation, contract analysis, and even aspects of case strategy significantly more efficient, especially as tools become better tailored to South African law, statutes, and reported judgments.
That said, AI remains far from perfect: it is still capable of producing “hallucinations”, confident-sounding but completely invented facts, case citations, or legal principles that can cause serious harm if placed before a court or relied upon in advice to clients. For that reason, no matter how advanced the system becomes, the attorney, advocate, or judge cannot delegate final responsibility. Every AI-generated output must be carefully checked against the original primary sources: the statutes themselves, the reported judgments on SAFLII or other official repositories, textbooks, and the actual reasoning of the courts.
The ethical and professional duty remains squarely with the lawyer or judge; thorough human verification is non-negotiable if we are to preserve the integrity of the judicial process and protect our clients and the general public.
In short, AI should be treated as a powerful assistant, not a replacement for professional judgment. I am also convinced that artificial intelligence will disrupt the legal fraternity to an unprecedented and largely unknown extent within the next few years.
While AI tools are already automating routine tasks like legal research, contract drafting, and preliminary case analysis, rapid advancements, such as models capable of near-autonomous reasoning and decision-making, will soon challenge even the core functions of experienced lawyers, from strategic advocacy to courtroom preparation.
In South Africa, where access to justice is often hampered by resource shortages, this could democratise legal services for the masses, but it could simultaneously erode traditional roles, forcing firms to rethink billing models, ethical guidelines, and professional training. The scale of this transformation is unpredictable: AI's exponential growth could render many current practices obsolete, potentially leading to widespread job displacement, regulatory upheaval, and a fundamental shift in how justice is administered, unless the profession adapts swiftly, with robust oversight, to harness its benefits while mitigating risks like inherent biases and inaccuracies.
In my opinion, the traditional billable hour model, long the cornerstone of legal billing in South Africa and elsewhere, faces existential disruption from AI within the next few years. As artificial intelligence rapidly automates research, drafting, document review, and even initial strategy formulation (tasks that once reliably generated hours of chargeable time), clients will increasingly demand fixed fees, value-based pricing, or outcome-driven arrangements, rendering the old hourly model unsustainable for many routine and mid-level matters.
The profession will therefore need to pivot swiftly toward hybrid or entirely new billing structures that reward efficiency, expertise, and results rather than time spent, or risk losing relevance in a market where technology delivers faster, cheaper, and often superior work product.
**** The views expressed do not necessarily reflect the views of IOL or Independent Media.