By Serena Wellen, Vice President of Product Management at LexisNexis Legal and Professional
May 15, 2025
In a legal industry increasingly shaped by AI, the rise of “AI hallucinations” — fake legal citations generated by large language models — has made accuracy and citation integrity a top concern.
This week, The Verge reported on a troubling case involving AI-generated legal research. A California judge sanctioned attorneys for submitting a legal brief containing fake citations and quotes, originally generated using AI tools.
The judge’s response was clear: “I read their brief, was persuaded (or at least intrigued) by the authorities that they cited, and looked up the decisions to learn more about them – only to find that they didn’t exist. That’s scary. It almost led to the scarier outcome (from my perspective) of including those bogus materials in a judicial order.”
This incident highlights a growing concern in the legal profession – the inclusion of fabricated or unverifiable legal citations by generative AI tools – and raises a critical question: Can you trust your legal AI to get it right?
At LexisNexis, our answer is: yes, and here’s why.
In the legal field, a “hallucination” occurs when an AI system fabricates a legal citation, case, or fact that sounds plausible but isn’t real. This is more than a technical glitch – it's a serious professional risk.
Whether you're drafting a brief, advising a client, or preparing for trial, relying on made-up case law or misquoted authorities can damage your credibility and your case. As shown in the recent case reported by The Verge, it can even result in court sanctions.
While some AI tools have generated misleading or fabricated legal citations – including non-existent cases or misquoted authorities – Lexis+ AI is built differently. We designed it to deliver what legal professionals need most: trustworthy responses grounded in real legal sources that users can verify.
Responses are drawn exclusively from the industry’s most trusted and expansive legal content repository, including Shepard’s®-reviewed case law, statutes, and Practical Guidance.
Now, with Document Management System (DMS) connectivity, Lexis+ AI can also ground responses in your firm’s internal knowledge, including clauses from agreements, securely and in context. This enhances the relevance of responses by aligning them with how your firm structures and negotiates contracts.
For broader internal knowledge coverage, such as precedent briefs, Protégé Vault allows you to securely upload a wider range of document types. This ensures that answers are not only legally sound but also aligned with your firm’s practice and voice.
Whether you're drafting a motion or reviewing strategy, Lexis+ AI delivers responses that are both externally authoritative and internally informed. Each citation links directly to the full source document, giving legal professionals the transparency and confidence they need to rely on what they cite.
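For readers curious about the general pattern behind grounded, citation-checked responses, the sketch below illustrates one common approach: retrieve passages from a trusted source index, build the answer only from those passages, and refuse any citation that cannot be matched back to a real source. This is a minimal, hypothetical illustration of the general technique, not LexisNexis's implementation; all names, data, and functions here are invented for the example.

```python
# Minimal sketch of retrieval-grounded answering with citation verification.
# Hypothetical example only; it does not reflect how Lexis+ AI is built.

from dataclasses import dataclass

@dataclass
class Source:
    citation: str   # e.g. a reporter citation
    text: str       # an excerpt of the authority

# A toy "trusted repository": in practice this would be a vetted legal corpus.
TRUSTED_SOURCES = {
    "123 F.3d 456": Source("123 F.3d 456", "Example holding about contract formation."),
    "789 U.S. 101": Source("789 U.S. 101", "Example holding about standing."),
}

def retrieve(query: str, k: int = 2) -> list[Source]:
    """Naive keyword retrieval over the trusted repository."""
    terms = query.lower().split()
    scored = [
        (sum(term in source.text.lower() for term in terms), source)
        for source in TRUSTED_SOURCES.values()
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [source for score, source in scored[:k] if score > 0]

def verify_citations(cited: list[str]) -> tuple[list[str], list[str]]:
    """Split cited authorities into verified (found in the repository) and unverified."""
    verified = [c for c in cited if c in TRUSTED_SOURCES]
    unverified = [c for c in cited if c not in TRUSTED_SOURCES]
    return verified, unverified

def answer(query: str) -> str:
    """Answer only from retrieved sources; refuse any citation that cannot be verified."""
    sources = retrieve(query)
    if not sources:
        return "No supporting authority found in the trusted repository."
    verified, unverified = verify_citations([s.citation for s in sources])
    if unverified:
        return f"Refusing to cite unverifiable authorities: {unverified}"
    lines = [f"{s.citation}: {s.text}" for s in sources if s.citation in verified]
    return "Grounded response based on:\n" + "\n".join(lines)

if __name__ == "__main__":
    print(answer("What is the holding about standing?"))
```

The key design choice in this pattern is that the answer is assembled only from material that already exists in the trusted index, so a fabricated citation can never survive the verification step.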
Read more about how Lexis+ AI delivers AI checks and citation integrity.
Lexis+ AI isn’t just an overlay on an existing large language model (LLM). It’s a purpose-built platform for legal professionals, with safeguards baked in at every level.
As recent headlines have shown, AI hallucinations in legal work are a real threat. But they are not inevitable.
Lexis+ AI was built to empower legal professionals with responses grounded in trusted, authoritative legal content, backed by our customers’ own internal sources, and ready to support real-world decisions.
We don’t believe in shortcuts. We believe in accountability, transparency, and the kind of innovation that earns trust, not headlines.
To explore how Protégé in Lexis+ AI delivers fast, trusted responses with verifiable linked legal citations, visit Lexis+ AI.