Comment by JumpCrisscross

21 hours ago

> they will ask an AI to outline relevant doctrines, show how their facts map to prior cases, and suggest the right records to pull

This is correct usage. Letting it draft notes and letters is not. (Procedural emails, why not.) Essentially, ChatGPT Pro lets one do e-discovery and preliminary drafting to a degree that’s good enough for anything less than a few million dollars.

I’ve worked with startups in San Francisco, where lawyers readily take cases on contingency because they’re so easy to win. The only times I’ve urged companies to fight back have been recently, because the emails and notes the employee sent were clearly LLM-generated and, in one instance, materially false. In the one case they insisted on pursuing, that let the entire corpus of claims be put in doubt and dismissed. Again, in San Francisco, a notoriously employee-friendly jurisdiction.

I’ve invested in legal AI efforts. I’d be thrilled if their current crop of AIs were my adversary in any case. (I’d also sooner take the bet of ignoring an LLM-drafted complaint than a hand-written one, lawyer or not.)

No, I think the big unlock is that a bunch of people who would never file lawsuits can at least approach it. You obviously can’t copy-paste its email output, but you can definitely verify what the legal terms are and how to position certain phrases.

  • > the big unlock is a bunch of people that would never file lawsuits can at least approach it

    Totally agree again. LLMs are great at collating and helping you decide if you have a case and, if so, convincing either a lawyer to take it or your adversary to settle.

    Where they backfire is when people use them to send chats or demand letters. You suggested this, and this is the part where I’m pointing out that I’m personally familiar with multiple cases where doing so took a case the person could have won on contingency and turned it into one they couldn’t win irrespective of which lawyers they retained.

  • The legal system is extremely biased in favor of those who can afford an attorney. Moreover, the more expensive the attorney, the more biased it is in their favor.

    It is in effect not a legal system, but a system to keep lawyers and judges in business with intentionally vaguely worded laws and variable interpretations.

    • Exactly. And it’s comical that the person I was debating with doesn’t understand this. A proclaimed investor in legal tech misses the biggest use case of AI in legal: providing access to people who can’t afford it or otherwise wouldn’t know to work with a lawyer.
