
Are Litigants in Person Using AI Wrong?
5 Things Courts Actually Need to See


You're representing yourself in court. There is a lot on the line. And somewhere along the way, you discovered that AI can draft legal documents in seconds.


The question isn't whether you should use it. The question is whether you're using it right.


Here's what most litigants in person don't realise: judges can spot AI-generated submissions instantly. Not because the technology is inherently wrong, but because most people use it without understanding what courts actually need to see.


The truth? AI can be your most powerful ally or your biggest liability. The difference comes down to five critical elements that separate empowered litigants from those who undermine their own cases before they even walk into the courtroom.

 

The Real Problem Isn't AI: It's Verification

Let me be direct: AI hallucinates legal authorities.


It will cite cases that don't exist. It will reference legislation that was never passed. It will present arguments based on legal principles that sound authoritative but have no basis in actual law.


This isn't a flaw you can work around by using a "better" AI. This is how the technology fundamentally operates. Large language models generate text based on patterns, not truth. They don't fact-check. They don't verify. They create plausible-sounding content.


And courts? They have zero tolerance for fictional case law.


Several judges have already issued sanctions against litigants who submitted AI-generated documents containing fabricated citations. The responsibility for accuracy doesn't shift with the technology: it stays with you. Always.


What courts need to see: Every legal authority verified through official sources. Every case citation checked against actual law reports. Every statutory reference confirmed through legislation.gov.uk or equivalent databases.


This is where most litigants using AI fail. They treat the output as finished work rather than a first draft requiring rigorous verification.

1. Legitimate Legal Authority (Not AI Fantasies)

Your witness statement might be beautifully formatted. Your Particulars of Claim might read like they were drafted by a Silk. But if you cite Smith v Jones [2023] EWHC 789 and that case doesn't exist, you've just destroyed your credibility.


Courts operate on precedent. On established law. On verifiable authorities that can be checked, referenced, and relied upon.


When you submit documents containing hallucinated cases, you're not just making a mistake. You're demonstrating to the judge that you either don't understand the legal system or you're willing to mislead the court. Neither interpretation helps your case.


The solution: Use AI to help you understand legal concepts and structure arguments. Never use it as your primary source for case law or statutory references. Verify everything through LexisNexis, Westlaw, BAILII, or official legislation sites.


2. Structured Documentation That Follows Court Rules

Here's where AI actually shines and where you should be using it properly.


Courts have specific requirements for how documents are formatted, bundled and presented. Practice Directions. Civil Procedure Rules. Court-specific protocols. Getting these right isn't optional; it's what separates prepared litigants from those who waste the court's time.


AI can help you:

  • Structure witness statements with proper headings and numbered paragraphs

  • Create chronological indices for your trial bundle

  • Format applications according to CPR requirements

  • Draft skeleton arguments that follow logical progression
     

This is powerful. This is where technology genuinely narrows the access-to-justice gap.


What courts need to see: Documents that demonstrate you understand procedure. Bundles organised logically. Submissions that respect the court's time by being clear, concise, and properly indexed.

You're not expected to match a £XXX-per-hour barrister's work product. But you are expected to present your case in a way that allows the judge to follow your arguments without hunting through disorganised paperwork.

3. Your Authentic Voice in Witness Statements

Judges read hundreds of witness statements. They know what genuine testimony sounds like, and they can immediately identify when someone has fed their story into AI and submitted the polished output.


The problem isn't that it sounds too good. The problem is that it sounds like everyone else's AI-generated statement: same phrases, same structure, same artificial tone that bears no resemblance to how actual human beings describe their experiences.


What courts need to see: Your authentic voice. Your actual recollection of events. The specific details that only you would know and remember.


Use AI to help structure your statement logically. Use it to ensure you're including all relevant evidence. Use it to check that you've followed the proper format.

But the words describing what you saw, heard, and experienced? Those need to be yours. Authentically, demonstrably yours.


4. Evidence of Critical Thinking (Not Copy-Paste Arguments)

Here's what happens when you rely entirely on AI for legal arguments: you end up submitting generic points that don't engage with the specific facts of your case.

AI can give you boilerplate arguments. It can explain general legal principles. It can even draft sample responses to common scenarios.


What it cannot do is apply nuanced legal reasoning to the unique circumstances of your dispute. That requires understanding. Context. The ability to distinguish your situation from superficially similar cases.


Courts need to see that you've thought about why the law applies to your facts. Not just that you've copied impressive-sounding legal arguments and hoped they're relevant.

This is where the Lawpreneur® 3-step process becomes essential:

  • Prevention: Understanding the legal framework before disputes arise

  • Damage Control: Knowing how to respond when conflicts emerge

  • Damage Limitation: Presenting your case strategically to minimise adverse outcomes


Each step requires critical thinking that AI cannot replicate. It requires understanding not just what the law says, but how it applies to your specific circumstances.

5. Transparent Use of Technology

Courts are beginning to require disclosure when AI has been used to generate legal documents. This isn't because using AI is wrong; it's because judges need to assess the reliability of submissions.


When you're transparent about your process, you demonstrate integrity. When you try to pass off AI-generated work as entirely your own, you create suspicion about everything you submit.
 

What courts need to see: Honesty about your process. Acknowledgement that you've used technology as a tool, combined with verification that you've checked its output. Evidence that you understand your case beyond what an algorithm generated.


This is about accountability. Courts have stated clearly that while AI technologies may expand access to justice, they "will be vigilant against AI technologies that jeopardise due process, equal protection, or access to justice."


Your job as a litigant in person is to use these tools in ways that enhance justice, not undermine it.


The Right Way to Use AI as a Litigant in Person

Using AI isn't wrong. Using it carelessly is catastrophic.


The distinction comes down to understanding what courts actually need from you:

  • Verified legal authorities that can withstand scrutiny

  • Properly structured documentation that respects procedure

  • Authentic testimony that reflects your genuine experience

  • Critical thinking that applies law to your specific facts

  • Transparent practice that builds rather than destroys credibility
     

This is exactly what you get a taste of in the Empowered Litigants Masterclass: how to leverage technology without becoming dependent on it. How to prepare cases that judges take seriously. How to navigate the court system with both competence and confidence.
 

You're not expected to become a lawyer overnight, but you are expected to present your case properly, to verify your authorities, and to understand the difference between helpful tools and dangerous shortcuts.
 

AI can help you get there, but only if you use it right.
 

The courts aren't asking whether you used AI. They're asking whether you've done the work to ensure your case is built on solid legal ground, properly presented and worthy of their time.
 

That's not a technology question. That's a preparation question.
 

And preparation? That's where empowered litigants separate themselves from those who simply hoped the algorithm would do the work for them.

 

Ready to learn the right way to represent yourself? Discover how Lawpreneur® equips you with the knowledge and frameworks that AI can never replace.
 

This content is provided for general informational and educational purposes only and does not constitute legal advice. No solicitor-client relationship is formed by your use of this information. While I strive for accuracy, the law changes frequently; you should always consult a qualified legal professional regarding your specific circumstances. Lawpreneur® and its contributors accept no liability for actions taken based on this content.
 

Information provided by Lawpreneur® AI assistants or social media posts is for guidance only. It is not a substitute for professional legal advice. Always verify important legal steps with a qualified practitioner. Use of this information is at your own risk.

 

 

 

Disclaimer:

The views, information and opinions expressed are solely the personal views of Cynthia McFarlane, acting in her personal capacity. These are her own views and do not reflect the views of any chambers, regulatory body or institution with which the author is affiliated.


©2026 Cynthia McFarlane 
