AI as a Legal Advisor: Helpful or Harmful?
Posted on behalf of: Shadeh Evans, Law Student
Last updated: Monday, 20 April 2026
Illustration credit: Alghozy - Unsplash
When you find yourself with a legal problem, where do you turn for advice first? Ten years ago, most people would have said friends and family, or a legal professional. In 2026, many people would say ChatGPT. Reliance on AI for legal advice is rapidly increasing, but is this ultimately helpful or harmful?
Most clients come to the Clinic wanting us to explain their legal rights. Sharron (not her real name) was different: she already knew her rights in a very niche, complicated area of family law. It was clear that she had used AI for advice about those rights, and she now wanted the Clinic's perspective on the strength of her case in court. Her case connects to another trend we've observed: a shift in the nature of the enquiries we receive. Where most questions used to concern basic family law (such as the divorce process or sorting out parenting time), we now see more complex issues that often fall outside our scope. Additionally, despite a 3% rise in family law cases in 2025 compared with 2024, we've noticed a significant drop in the overall number of enquiries. Taken together, these observations suggest an emerging correlation between increased use of AI and reduced reliance on our Clinic.
This reliance on AI could be problematic because what AI advises often differs from what a solicitor would advise, especially in family law. Returning to Sharron's case: because she knew she had a right to use the court, she was set on going to court as her only option for resolving her issue. When we asked whether she had thought about mediation, she said no; she had not even considered it for her type of issue. Our supervising solicitor told us that most solicitors advise clients to settle outside of court and to treat court as a last resort. AI-generated advice, by contrast, can sometimes suggest court as the first (or best) option. Another problem with relying on AI instead of a legal professional is the risk of false information. AI has repeatedly invented facts, cases, and even laws, a phenomenon known as "AI hallucinations." Because publicly accessible legal information is limited, particularly in family law, the risk of these hallucinations is greater.
But why do so many people rely on AI to help with their legal issues? The first and most obvious reason is convenience. You can ask AI anything, at any time, anywhere, and get an instant answer. It is also more convenient than a simple Google search, since you can ask follow-up questions and have it build on its answers. The deeper, less obvious reason is the cuts to legal aid. The cuts made by the Legal Aid, Sentencing and Punishment of Offenders Act 2012 (LASPO) were drastic and had a huge impact, especially in family law: since then, only cases involving domestic abuse or child protection issues have been eligible for legal aid support. This has made access to justice and legal resources even harder for the general public than it already was. Before LASPO 2012, 4 in 5 people were eligible for civil legal aid; afterwards, that dropped to 1 in 4.
In this context, it is not surprising that people are turning to AI as an alternative source of legal guidance. But while AI may improve accessibility, it cannot replace the nuanced, strategic, human-centred advice of legal professionals. As Sharron's case shows, knowing one's rights is only part of the picture; understanding how to act on those rights, including when to avoid court altogether, is equally important. While AI offers people legal information they might not otherwise have access to, it also risks misleading them or pushing them towards less effective legal routes. The rise of AI reliance for legal issues is not just a technological shift, but a reflection of a system in which many people feel they have nowhere else to turn.