Legal tech AI firm Legora has raised $150 million in a Series C led by Bessemer Venture Partners, valuing the Stockholm-based company at $1.8 billion.
The fundraise comes amid sharp demand: Legora’s customer base has risen from 350 to over 550 in six months, with operations now spanning more than 50 countries. Clients include Linklaters, Goodwin and Cyril Amarchand Mangaldas (CAM), which reports an AI adoption rate above 80%. “We want fewer, more powerful platforms where a lawyer can work end-to-end without switching tools,” said Komal Gupta, CAM’s chief innovation officer.
Legora’s customers in India today are mainly law firms and in-house legal teams, said Amit Kothiyal, the company’s India head, adding that it eventually aims to reach individual lawyers as well. AI, he noted, could help firms offer services at lower cost and scale beyond the limits of the billable hour.
Legora is not alone in drawing big cheques.
Harvey—one of the sector’s most closely watched players—raised $150 million in October 2025 in a round led by Andreessen Horowitz, valuing it at $8 billion. It was the company’s third major fundraise of the year. Harvey has also set up an engineering and operations hub in Bengaluru to support its expansion across Asian markets.
Another US-based player, August, raised $7 million in August 2025 in a round led by NEA and Pear VC. The company has already partnered with India’s Economic Laws Practice (ELP), where its AI tools have reportedly cut due diligence time by up to 60%.
Indian legal tech startups
India, too, is seeing a quickening pace of legal tech funding. Bengaluru-based Nyayanidhi has raised $2 million in seed funding led by 3one4 Capital. Co-founder Adithya L.H.S. told Mint that a significant share of its expenditure goes into data cleaning and annotation, with the company operating on a pay-per-use model, though pricing details were not disclosed.
Nyayanidhi plans to use the new capital to expand its lawyer network and deepen government integrations as it scales to more states. Its pilots with High Courts and enterprise clients have already processed thousands of cases, and the platform currently operates within the Karnataka High Court, supporting multilingual drafting and filings, Adithya said.
Lucio, too, has raised $5 million from DeVC and a group of high-net-worth individuals (HNWIs).
“Our primary customers are law firms and corporate legal teams, though the platform also serves individual litigants and smaller practices,” said co-founder Vasu Aggarwal.
However, the company says AI cannot yet replace human judgment. “We’re a legal tech company, not a law firm. The platform is decision-support only, never a substitute for professional judgment,” he said, adding that this is made explicit in its product messaging and contracts.
Lucio runs a subscription model, either per user or bundled, though pricing details were not disclosed. As at Nyayanidhi, a major share of its engineering effort goes into making unreadable PDFs usable.
In India, building a production-grade pipeline to clean court PDFs can cost ₹4–10 crore over two years, said Ranjeeth Bellary, partner, EY Forensic and Integrity Services.
Industry executives peg data cleaning as the bulk of legal tech’s workload, with 70-80% of the effort spent cleaning and structuring messy PDFs and only the remaining 20-30% going into actual AI development.
Judiciary flags risks of AI misuse
The traditional job profile for junior lawyers is also shifting as automation takes over routine tasks. “Legal tech tools can augment the work of junior legal staff who would otherwise spend significant time on research or drafting,” said Devroop Dhar, co-founder of Primus Partners, a management consulting firm.
This push toward AI in legal services comes even as India’s judiciary formalises its own stance.
The Supreme Court’s Centre for Research and Planning released a White Paper on Artificial Intelligence and the Judiciary late last month, outlining how courts are experimenting with AI while emphasising its risks.
The paper strikes a cautionary tone, warning of fabricated citations, algorithmic bias, confidentiality breaches involving sensitive court records, and the broader danger of over-reliance on opaque systems that could weaken judicial accountability.
These concerns stem from real incidents, including a Karnataka trial court order and an Income Tax Appellate Tribunal ruling that relied on AI-generated fictitious precedents, as well as the Delhi High Court’s rejection of pleadings containing fabricated judgments.
This is precisely why, experts say, liability still sits largely with lawyers.
Platforms may absorb the commercial or reputational fallout, Bellary said. But the practitioner’s judgment remains paramount, and the legal responsibility ultimately rests with the lawyer, added Dhiren Sonigra, principal, AI Solutions, at Praxis Global Alliance, a management consulting firm.
The paper situates AI adoption within the ₹7,210-crore e-Courts Project Phase III, initiated by the Supreme Court, which is driving digitisation through tools for research support, translation, transcription, filing scrutiny, and summarisation. Yet it sharply limits the role of AI to non-adjudicatory, administrative functions such as scheduling and registry workflows. It emphasises the importance of in-house, tightly controlled systems and continuous human oversight, underscoring that AI should not influence substantive judicial decision-making.


