The chatbots are coming! The chatbots are coming!  And, apparently, they want to practice law.

A New York attorney recently discovered the significant risk of relying on ChatGPT to do one’s legal research.  In the case, the plaintiff, Roberto Mata, claimed his knee was injured when it was struck by a serving cart during a flight, and he sued the airline, Avianca.  Avianca’s attorneys filed a motion to throw out the case, arguing that the statute of limitations had expired.  In response, Mata’s attorney, Steven Schwartz, submitted a 10-page brief chock-full of legal citations explaining why the airline’s motion should be denied.  Unfortunately, at least six of the cases were bogus – completely fabricated.

When Avianca’s lawyers pointed out that they could not find any of those cases, Schwartz admitted that he had used ChatGPT to perform his legal research.  Introduced in November 2022, ChatGPT is an artificial intelligence chatbot that uses natural language processing to create humanlike conversational dialogue. In response to a user’s queries, the program can answer questions and compose various kinds of written content, including articles, essays, and – or so Schwartz thought – legal briefs.

In a recent hearing on possible sanctions against Schwartz and another lawyer, Schwartz apologized to the court, conceding he had “failed miserably” at vetting the authorities which the bot provided.  He told an angry U.S. District Judge P. Kevin Castel that he was “operating under a misconception … that this website was obtaining these cases from some source I did not have access to.”

At the very least, can you trust ChatGPT to tell you if it is making stuff up? Apparently not. In a copy of the exchange submitted to the judge, Schwartz asked ChatGPT: “Is varghese a real case” and “Are the other cases you provided fake.”  The bot replied: “No, the other cases I provided are real and can be found in reputable legal databases.”

The takeaway for lawyers is clear:  do not rely on ChatGPT for your legal research. More broadly, the case underscores that ChatGPT is still limited and unreliable; it can fabricate information and present it as fact.  Thus, anyone doing online research that they intend to use in business or academia should not blindly quote from or rely on the results of a chatbot query – or a Google search, for that matter.  Verify your sources!

The judge has not yet ruled on whether Schwartz will be sanctioned.
