From: redpointai

Max Unistron, CEO and co-founder of Lorra, a company at the forefront of applying AI to the legal industry, discussed the intricacies of product development and prioritization in the rapidly evolving AI startup landscape. Lorra has raised over $100 million and works with many top law firms globally, making it one of the fastest-growing AI applications [00:00:31].

When Lorra began in 2020, AI models like Google’s early BERT models were “horrendously bad” in languages other than English [00:01:46]. The arrival of GPT-3.5 marked a paradigm shift, moving the industry from pure experimentation to shipping solutions that handle end-to-end work deliverables [00:01:52]. For example, due diligence processes that once required physical data-room visits now involve simply uploading documents to Lorra, finding information, and generating reports based on the findings [00:02:08]. The focus has shifted from simple queries to defining processes for Large Language Models (LLMs) to follow, using agents that access tools to plan and execute tasks [00:02:29].

The continuous improvement of models, coupled with surrounding frameworks like function calling and tool calling, is enabling a consolidation of previously fragmented legal software tools (e.g., separate tools for translations, document comparisons, searching, and reviewing) [00:02:53]. While simple tasks like data extraction are already largely automated, AI is steadily moving up the complexity scale to tackle tasks like drafting share purchase agreements [00:03:39]. This evolution forces legal professionals to determine where their specific expertise adds the most value versus where off-the-shelf LLMs suffice [00:03:50].

Why Law is Uniquely Suited for AI

The legal industry has historically seen limited software development, with templating systems considered cutting-edge [00:04:34]. A split exists between law firms, which handle complex, one-off projects, and in-house counsel, who repeatedly work on similar tasks like NDA reviews [00:04:55]. Legal work can broadly be categorized as reviewing, reading, drafting, writing, or researching [00:05:20]. While earlier software focused on niche problems within these categories, AI’s cross-functional capabilities have enabled platforms like Lorra to emerge, offering wall-to-wall solutions rather than point solutions [00:05:32].

Law firms are increasingly pressured by clients to adopt AI, as clients themselves are integrating such tools internally [00:06:09]. The shift to an “AI-first” mindset among CEOs means new headcounts require proof of efficiency [00:06:19]. While hourly billing traditionally disincentivized efficiency, client demand and increasing price pressure (e.g., due diligence becoming less profitable) are forcing firms to adapt [00:06:50]. The competitive landscape means if one firm adopts AI for efficiency, others must follow to avoid being seen as inefficient [00:08:00].

Product Strategy and Competition

Starting in the Nordics allowed Lorra to grow from a “small fish” to a dominant player in a smaller market before expanding globally [00:08:58]. This approach provided the advantage of being a “fast second mover,” allowing them to observe what worked for others [00:09:40]. Initially, many AI companies focused on training their own LLMs, but Lorra, with significantly less initial funding ($50,000 angel round), chose to focus on the application layer, building a product people were excited to use [00:09:50]. Coming from a non-legal background also fostered humility and attentiveness to client needs and the evolving client-law firm relationship [00:10:19].

This strategy enabled Lorra to be more ambitious in its product scope, servicing the “totality of the needs of the market” rather than a narrow niche [00:11:00]. By serving enterprise clients in their local market first, they became “enterprise ready” when entering new markets, leveraging the interconnectedness of large law firms for referrals [00:11:52].

User Adoption and Education

Teaching lawyers to use AI tools takes “way more work” than anticipated [00:12:44]. While traditional software rollouts might aim for 5-10% adoption, Lorra is seeing 70-80% adoption rates, partly because lawyers are actively seeking out these tools [00:12:57]. Initial reliability and infrastructure problems had to be solved early, because a poor first experience means users “don’t come back” [00:29:38]. As Lorra moved from serving European firms to the largest firms globally, expectations for end-to-end deliverables became incredibly high, prompting immediate responses when issues arise [00:30:10]. The strategy evolved from “ship it as soon as we have something” to working with design partners to ensure users are “thrilled about” a product before its proper launch [00:30:50].

Product Development and Prioritization

When considering product development and prioritization, a key tension for AI startups is balancing building value for customers today with the understanding that underlying model capabilities are constantly improving [00:13:30]. Max’s philosophy is: if AI labs (like OpenAI, Anthropic, Google) are likely to build and provide a feature as part of their platform, then Lorra should not build it [00:14:24]. This allows them to “build like boats” so that when the “tide rises” (models improve), their functionality gets better without significant deprecation [00:14:38]. For example, if LLMs become capable of directly interpreting playbooks from documents for negotiation, a feature built today might become unnecessary in five years [00:15:08].

The hardest part of building AI products is prioritizing among a hundred high-value features and ensuring they integrate cohesively into a platform [00:17:31]. It’s easy to build a “Frankenstein monster” by chasing every new hot thing [00:17:49]. Planning the platform’s structure and data architecture is crucial, especially as law firms themselves are still figuring out how they want to apply the technology [00:18:08].

Pricing and Value

Lorra currently uses a seat-based pricing model, which is easy to buy and predict [00:19:23]. However, high LLM costs (e.g., a single user racking up $10,000 in LLM costs in a week) suggest a future shift to a platform fee with a usage element [00:19:32]. While early expectations were that LLM prices would continuously decrease, newer, better models (like OpenAI’s o3 for legal work) are also more expensive [00:19:51]. Lorra increasingly runs classification algorithms and “model pickers” to select the most appropriate model for a task, balancing capability against cost [00:20:29].
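A minimal sketch of such a model picker, assuming a keyword-based complexity classifier; the model names, keywords, and prices are illustrative stand-ins, not Lorra's actual routing logic or rates:

```python
# Cost-aware "model picker": classify task complexity, then route easy tasks
# to a cheap model and hard tasks to a more capable, more expensive one.

MODELS = {
    "small": {"cost_per_1k_tokens": 0.0005},  # illustrative pricing
    "large": {"cost_per_1k_tokens": 0.0300},
}

def classify_complexity(task: str) -> str:
    """Stub classifier; in practice a small model or trained heuristic."""
    hard_keywords = ("draft", "negotiate", "share purchase agreement")
    return "hard" if any(k in task.lower() for k in hard_keywords) else "easy"

def pick_model(task: str) -> str:
    """Route the task to the cheapest model that can likely handle it."""
    return "large" if classify_complexity(task) == "hard" else "small"
```

A usage element in pricing then falls out naturally: metering tokens per call against `MODELS[name]["cost_per_1k_tokens"]` gives a per-task cost estimate.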

Infrastructure and Tooling

A key area of excitement is the Model Context Protocol (MCP) and tool calling, where LLMs can access outside tools [00:20:57]. Lorra can provide tools (e.g., for redlining documents), but the possibilities expand significantly when clients can provide their own tools (e.g., a client-specific CRM, knowledge databases, or notification systems) [00:21:13].
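A rough sketch of this idea: a tool registry where the platform ships built-in tools and a client can plug in its own. The tool names and behaviors below are hypothetical stubs, not an actual MCP implementation:

```python
# Client-extensible tool registry in the spirit of MCP-style tool calling:
# the platform registers built-in tools, and clients add their own.
from typing import Callable

class ToolRegistry:
    def __init__(self) -> None:
        self._tools: dict[str, Callable[..., str]] = {}

    def register(self, name: str, fn: Callable[..., str]) -> None:
        """Add a tool; clients call this with their own integrations."""
        self._tools[name] = fn

    def call(self, name: str, **kwargs: str) -> str:
        """Invoke a tool by name, as an LLM's tool call would."""
        if name not in self._tools:
            raise KeyError(f"Unknown tool: {name}")
        return self._tools[name](**kwargs)

registry = ToolRegistry()
# Platform-provided tool:
registry.register("redline", lambda text: f"[redlined] {text}")
# Client-provided tool, e.g. a lookup against their own CRM:
registry.register("crm_lookup", lambda client_id: f"Matter history for {client_id}")

out = registry.call("crm_lookup", client_id="ACME-001")
```

In a real MCP setup the client-side tools live in a separate server process exposing a schema, but the contract is the same: the model sees a uniform catalog of callable tools regardless of who provided them.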

The “Moat” in AI Applications

The competitive advantage or “moat” in AI applications is evolving. While some initially believed it lay in fine-tuning models, Max believes there’s “no moat at all there” [00:23:18]. Instead, the moat lies in becoming a “system of record” and a collaborative platform, similar to Figma, where designers, PMs, marketers, and clients all collaborate [00:21:49]. The ability to integrate with customer data, external databases (case law, legislation), and existing systems (like Microsoft Word, Outlook, and document management systems) is crucial [00:22:12].

To build a flexible and future-proof system, Lorra uses tools like Braintrust to run thousands of evaluations, making it easy to test new models as they become available [00:23:44]. Every model used undergoes security, privacy, and legal review to comply with data processing agreements, which is critical when working with law firms [00:23:57]. Multi-step workflows have evaluations for each step and for the complete end-to-end product, testing permutations of different models in different places [00:24:14].
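The permutation testing described here can be sketched as follows, with canned per-step scores standing in for real evaluation results and end-to-end quality approximated as the product of step scores (all names and numbers are illustrative):

```python
# Evaluate every (extract_model, draft_model) assignment for a two-step
# workflow and keep the best-scoring permutation.
from itertools import product

STEP_MODELS = {
    "extract": ["fast-model", "strong-model"],
    "draft":   ["fast-model", "strong-model"],
}

# Canned per-step accuracies standing in for real eval-harness results.
CANNED_SCORES = {
    ("extract", "fast-model"): 0.80, ("extract", "strong-model"): 0.95,
    ("draft", "fast-model"):   0.70, ("draft", "strong-model"):   0.90,
}

def end_to_end_score(assignment: dict[str, str]) -> float:
    """Approximate end-to-end success as the product of step scores."""
    score = 1.0
    for step, model in assignment.items():
        score *= CANNED_SCORES[(step, model)]
    return score

best = max(
    (dict(zip(STEP_MODELS, combo)) for combo in product(*STEP_MODELS.values())),
    key=end_to_end_score,
)
```

In practice the per-step scores come from running each candidate model over the eval set, and cost would enter the objective alongside accuracy, but the search over permutations has this shape.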

Company Building at “Hyper Speed”

Max described the Y Combinator experience during the post-ChatGPT craze as extremely fast-paced, with a clear need to move quickly [00:26:38]. He even took a loan against their YC investment to hire engineers immediately [00:26:24]. While many YC companies focused on building, Lorra was intensely commercially focused from the beginning, launching as soon as it had a minimum viable product: a compliant, ChatGPT-style assistant with better Retrieval-Augmented Generation (RAG) over specific documents and Swedish legislation [00:27:44].
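The retrieval step of such a RAG setup can be sketched with simple keyword overlap in place of real embeddings; the documents, query, and scoring below are illustrative only:

```python
# Toy retrieval for a RAG pipeline: rank document chunks by keyword overlap
# with the query, then prepend the top chunks to the model's prompt.

def overlap_score(query: str, chunk: str) -> int:
    """Count shared words between query and chunk (stand-in for embeddings)."""
    return len(set(query.lower().split()) & set(chunk.lower().split()))

def retrieve(query: str, chunks: list[str], k: int = 1) -> list[str]:
    """Return the k chunks most relevant to the query."""
    return sorted(chunks, key=lambda c: overlap_score(query, c), reverse=True)[:k]

chunks = [
    "Swedish legislation on data protection and retention.",
    "Board meeting minutes from 2019.",
]
top = retrieve("What does Swedish legislation say about data retention?", chunks)
prompt = "Answer using the context:\n" + "\n".join(top)
```

A production system would use embedding similarity and chunk-level metadata rather than word overlap, but the pipeline shape is the same: retrieve, then ground the generation in what was retrieved.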

“You get one chance… if an attorney comes in and they do a query and it doesn’t work out, then we see them fall off. They don’t come back.” [00:29:10]

The company deliberately made a “mature decision” early on to pause sales for four to five months to refine the product after a significant investment [00:28:29]. This was to ensure reliability and meet high expectations from large firms [00:29:45].

Lorra maintains its fast pace by fostering a culture of urgency, attracting employees who “love winning and hate losing” [00:31:19]. This urgency trickles down from the founders, with Max’s co-founder coding 14 hours a day [00:32:20]. When recruiting, they are upfront that it’s “not a 9 to 5 job” and that they are “here to build for the future,” not maintain [00:32:42].

The evolution of AI startup strategies means new expectations for how quickly a software company can grow [00:35:44]. AI companies are creating entirely new categories, rather than just replacing existing software [00:35:54]. The fastest builder with the highest velocity, best product, and best service will dominate [00:36:09]. Lorra started with enterprise clients on “day one,” spending half their initial angel funding on crucial SOC and ISO certifications, with Max foregoing a salary to afford it [00:36:18]. This contrasted with the traditional path of serving startups, then SMBs, before moving to enterprise [00:36:36]. This rapid growth, from 10 to 100 people in a year, highlights the unique pace of high-growth AI startups [00:34:53].

Future Outlook and Surprises

One surprising aspect of building AI features has been managing user expectations [00:37:48]. Attorneys sometimes expect a simple query like “write me an SPA” (share purchase agreement) to work perfectly, which isn’t always the case [00:37:57]. This necessitates extensive training and onboarding [00:38:13]. Max also changed his mind on the product interface, moving from a button- and workflow-based system to a chat-based interface that could leverage functions and tools, even though it meant deleting 95% of their original source code [00:38:52].

Looking ahead, Max sees significant disruption opportunities beyond legal, such as Clinical Research Organizations (CROs) in pharma, which are manual, data-rich, and have structured workflows [00:40:17]. For Lorra, multimodal use cases like working with voice and audio transcripts (e.g., transcribing depositions) are exciting [00:41:00].

For aspiring lawyers, the skills that will matter in the future are changing. Instead of “underconfident overachievers” good at following instructions, law firms will need “entrepreneurial creative people” who challenge existing ways and are fluent in working with AI [00:42:30]. Everyone will need to augment their work with AI [00:43:21].

To learn more about Lorra, visit leguora.com [00:43:53].