Legal chatbots in 2025

We have in the past dedicated articles to legal chatbots, in 2016 and 2019. It is time for an update. In this article, we discuss trends and adoption of legal chatbots, as well as existing regulation. Then we look at legal chatbots for consumers and legal chatbots for law firms. We will do so for the US (because it is still the market leader), the UK, and the EU.

Trends and adoption

The US has seen rapid growth of bots and AI agents in law firms: AI adoption in US law firms surged from 19% in 2023 to 79% in 2024, with chatbots playing a central role. This market expansion is still ongoing: the US legal tech market is projected to reach $32.54 billion by 2026, with chatbots as one of the main drivers.

In the UK, adoption is most advanced in large, business-to-business (B2B) law firms, where chatbots are integrated with legal analytics, project management, and contract management systems. In contrast, the business-to-consumer (B2C) market is slower to adopt. Legal chatbots are most popular in firms with large-scale, commoditized services. Adoption is further slowed by a lack of awareness and uncertainty about the role of AI: over one-third of UK legal professionals remain uncertain about how generative AI and chatbots apply to legal work.

In the EU, on the other hand, we are witnessing increasing adoption. There is a steady rise in chatbot use for routine legal tasks, especially among consumers and SMEs. Chatbots are also seen as tools to improve access to justice, particularly for underserved populations and in cross-border matters. At the same time, there are ongoing ethical and legal debates, with concerns about accuracy, liability, and bias in AI-generated legal advice.

Regulation

In recent years, there has been a move towards regulating the use of AI, which also affects the use of legal chatbots. There is a need for transnational regulation, but thus far each region does its own thing.

In the US, we are confronted with fragmented regulation. The US lacks a comprehensive federal AI law, so regulation is piecemeal: we are dealing with a) state-level initiatives and b) professional (ethical) conduct rules that guide how lawyers may use AI. When it comes to legal chatbots specifically, professional oversight is required: chatbots cannot independently practice law, and human supervision is needed to avoid unauthorized practice and to ensure accuracy. And of course, law firms must consider privacy and security when using legal bots; compliance with privacy laws is essential, especially when handling sensitive client data.

It is worth noting that the FTC (Federal Trade Commission) has made clear that bots cannot market themselves as “robot lawyers” or a substitute for licensed counsel without substantiation. Its 2024 enforcement against DoNotPay (a consumer rights bot we discussed in previous articles) resulted in a $193,000 penalty and strict advertising restrictions. This FTC ruling is widely cited as the line in the sand for consumer legal AI claims.

Furthermore, the American Bar Association’s first formal opinion on generative AI (Formal Opinion 512, 2024) says lawyers must a) understand the capabilities and limits of AI, b) protect confidentiality, c) supervise outputs, and d) be candid with courts and clients. They do not need to be “AI experts,” but they can’t delegate professional judgment to a bot. Several bar associations and courts have issued similar guidance.

The UK relies on flexible, sector-specific laws and regulation, with a focus on transparency, explainability, and data protection (UK GDPR). In addition, legal professionals must ensure that chatbots comply with professional ethical standards, including confidentiality and competence.

In the EU, we find regulation at both the EU level and the national level. At the EU level, the GDPR and the EU AI Act are the most important regulations. The GDPR has strict data privacy requirements, which also apply to chatbot operations, especially with sensitive legal data. The EU AI Act introduces risk-based regulation, with high-risk applications (like legal advice) facing stricter requirements for transparency, accuracy, and human oversight.

Apart from the EU regulations, some national bar associations have issued their own rules. As a result, in some countries only licensed lawyers may provide legal advice, which effectively limits the scope of chatbots and/or requires professional supervision.

Legal chatbots for consumers

In previous articles on legal chatbots (links in the introduction), we mainly discussed legal chatbots for consumers. What they all have in common is that they facilitate access to legal information: they democratize legal knowledge, making it more accessible to the public. As noted above, chatbot use for routine legal tasks continues to rise steadily, especially among consumers and SMEs.

Legal chatbots for law firms

Apart from chatbots for consumers, in recent years we have also witnessed an increase in the number of legal chatbots for law firms. What are they used for?

  • Automation of routine tasks: chatbots automate legal research, contract review, and administrative work.
  • Document automation: bots assist lawyers with the creation and review of standard legal documents.
  • Legal research: AI chatbots can scan and summarize large volumes of legal documents and precedents rapidly.
  • Client engagement and intake: chatbots handle initial queries, provide information, and schedule appointments, and they can direct clients to the appropriate services or professionals (see the sketch after this list).
  • A better consumer experience: some law firms use their own legal chatbots to offer consumer services, enhancing accessibility in areas like small claims, tenancy issues, and basic legal advice.
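
To make the intake use case concrete, below is a minimal, rule-based sketch in Python of how such an intake flow can be structured: classify the client's query, then either suggest an appointment or escalate to a human. All names in it (the practice areas, the keywords, the route_query function) are hypothetical and chosen purely for illustration; real products typically rely on a trained classifier or a large language model rather than keyword matching, and must respect the oversight and privacy requirements discussed above.

# Minimal, illustrative client-intake sketch. The practice areas, keywords,
# and escalation rule below are hypothetical placeholders, not the workings
# of any product mentioned in this article.

from dataclasses import dataclass

# Hypothetical mapping from practice area to intake keywords.
PRACTICE_AREAS = {
    "tenancy": ["landlord", "tenant", "eviction", "deposit", "rent"],
    "small_claims": ["refund", "invoice", "unpaid", "faulty", "claim"],
    "employment": ["dismissal", "redundancy", "wages", "unfair"],
}

@dataclass
class IntakeResult:
    practice_area: str | None   # best-matching practice area, if any
    needs_human: bool           # True when the bot should hand off to a person
    message: str                # text shown to the client

def route_query(query: str) -> IntakeResult:
    """Classify a client query by simple keyword matching and decide on escalation."""
    text = query.lower()
    # Score each practice area by the number of matching keywords.
    scores = {
        area: sum(keyword in text for keyword in keywords)
        for area, keywords in PRACTICE_AREAS.items()
    }
    best_area, best_score = max(scores.items(), key=lambda item: item[1])

    if best_score == 0:
        # No confident match: never guess, hand the client off to a person.
        return IntakeResult(
            practice_area=None,
            needs_human=True,
            message="I could not match your question to one of our services; "
                    "a colleague will contact you shortly.",
        )

    return IntakeResult(
        practice_area=best_area,
        needs_human=False,
        message=f"This looks like a {best_area.replace('_', ' ')} matter. "
                "Would you like to book an appointment with the relevant team?",
    )

if __name__ == "__main__":
    # Example: a typical tenancy question routed to the tenancy team.
    print(route_query("My landlord refuses to return my deposit"))

Note that the sketch falls back to a human whenever it cannot classify the query: that hand-off mirrors the professional-oversight requirements discussed in the regulation section above.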

Conclusion

Legal chatbots have become an essential part of legal services in the US, the UK, and the EU. Big law firms and routine legal services have been the quickest to adopt these technologies, but we are now also seeing more tools that help everyday people access legal assistance.

Regulatory frameworks are evolving rapidly, with the EU leading in comprehensive risk-based regulation, the UK favouring sector-specific guidance, and the US maintaining a fragmented, state-driven approach. Across all regions, the focus is on balancing innovation with ethical, professional, and data privacy safeguards.

At present, the US is still leading the way when it comes to legal chatbots. Most research and drafting bots originate in the US (Thomson Reuters, Lexis, Harvey, Bloomberg). The UK, on the other hand, is presenting itself as a contract-review hub: tools like Luminance and Robin AI grew out of the UK's startup ecosystem. Continental European firms use a mix of US and UK platforms under GDPR controls, but also homegrown tools like ClauseBase and Legito for contract and document automation.

 

Sources: