Joshua Browder promised the world its first AI lawyer — a robot that would argue a real speeding ticket in real court, feeding arguments through an earpiece. Then state bars in California and New York threatened him with criminal prosecution. The robot lawyer never made it to the courtroom.
Joshua Browder was 18 when he built the first version of DoNotPay: a simple website that automatically generated letters to contest parking tickets. By the time he was 26, DoNotPay had expanded into subscription cancellations, small claims court filings, and consumer rights disputes. Its marketing tagline was bold: "The World's First Robot Lawyer."
Browder was magnetic and combative — he positioned DoNotPay as a weapon for ordinary people against the institutions that exploited them. The company had helped users file over 160,000 successful parking ticket appeals, claim airline compensation, and navigate bureaucratic processes that typically required expensive professional help. Now he was ready for a bigger stage.
In January 2023, Browder announced DoNotPay's most audacious experiment: a volunteer defendant with a real speeding ticket case would wear wireless earbuds in court. An AI would listen to the hearing through the defendant's phone microphone, process the judge's questions and the prosecution's arguments in real time, and tell the defendant exactly what to say. The defendant would repeat the AI's words to the judge.
The case was scheduled in California in February 2023. Browder said he had a willing participant. He announced it publicly. Legal Twitter noticed immediately.
Within days of Browder's announcement, state bars in California and New York had taken notice. Legal scholars, practicing attorneys, and bar association officials began pointing out the problem: what Browder was describing — an AI system providing real-time legal strategy and argument to a person in active court proceedings — was, by any reasonable definition, the practice of law.
In most US states, practicing law without a license is a criminal offense: a misdemeanor punishable by jail time in some, and in others, or for repeat offenses, a felony. The bar associations made clear to Browder that both he and the volunteer defendant could face criminal prosecution if the earpiece experiment went forward.
The speeding ticket case was scheduled. The volunteer was ready. The AI was ready. The earbuds were probably charged. And then Browder pulled the plug.
Rather than go quietly, Browder made one more move. He announced a challenge: DoNotPay would pay $1 million to any licensed attorney willing to argue a case before the United States Supreme Court using DoNotPay's AI system through an earpiece.
No lawyer took him up on it. The reasons are obvious: the Supreme Court bars electronic devices from its courtroom, and arguing before the justices on the advice of an untested AI would put an attorney's professional conduct obligations, and possibly their license, on the line. That is not a risk most attorneys would take for any amount of money. The challenge was more performance than offer.
The episode raised a question the legal profession had been quietly dreading: what is the practice of law in the age of AI? Legal research, brief writing, document drafting — AI was already doing these things at scale. DoNotPay and its competitors were helping millions of people with legal tasks that had previously required attorneys.
The line between "legal information" and "legal advice" had always been blurry. A website telling you "here's how to contest a parking ticket" is providing legal information. An AI listening to your court case and telling you what to say is providing legal advice — and arguably doing so in a way that looks, sounds, and functions exactly like having a lawyer.
Browder's earpiece had tried to cross that line in the most visible, provable, and public way possible. The bar associations weren't going to let that happen in open court. The question of where the line actually sits remains deeply unresolved.
The robot lawyer never argued its case. The traffic ticket got paid the normal way. But the question Browder's stunt forced into the open — who gets to provide legal help, and at what cost, and on whose terms — is still being argued today in courts, bar associations, law school journals, and legislative hearings.
If an AI can write a better legal brief than most attorneys, at a fraction of the cost, who benefits? The low-income Americans, roughly 80% of whose civil legal needs go unmet, or the legal profession that's structured around making legal help expensive? The robot lawyer blinked. The question didn't.