From: redpointai

The integration of artificial intelligence (AI) into many industries, particularly the legal sector, is rapidly transforming traditional workflows and creating new paradigms for product development [00:00:27]. This article explores the strategies and challenges of companies like Lorra, which applies AI to the legal industry, in developing and deploying successful AI-powered solutions.

Initially, developing AI solutions for the legal space was challenging; early models like Google’s BERT were “horrendously bad” in languages other than English [00:01:46]. The arrival of GPT-3.5 marked a paradigm shift, moving the focus from experimentation to practical implementation [00:01:52].

Today, AI applications are capable of handling end-to-end work deliverables, such as automating due diligence processes where documents are fed into a system like Lorra to find specific information and generate reports [00:02:06]. This evolution is driven not only by better models but also by surrounding frameworks like function calling and tool calling [00:03:00].
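The source does not describe Lorra’s internals, but the pattern it names (documents in, targeted extraction, report out) can be sketched roughly. In the following illustration, the function and type names are hypothetical, and `ask_model` is a stub standing in for a real LLM call.

```python
from dataclasses import dataclass


@dataclass
class Finding:
    document: str
    question: str
    answer: str


def ask_model(question: str, text: str) -> str:
    # Stub: a real system would send the document text and the
    # diligence question to an LLM and return its answer.
    return f"answer about '{question}' from {len(text)} chars"


def run_due_diligence(documents: dict[str, str], questions: list[str]) -> list[Finding]:
    """Ask each diligence question of each document and collect findings."""
    findings = []
    for name, text in documents.items():
        for q in questions:
            findings.append(Finding(name, q, ask_model(q, text)))
    return findings


def render_report(findings: list[Finding]) -> str:
    """Flatten findings into a simple line-per-finding text report."""
    return "\n".join(f"[{f.document}] {f.question}: {f.answer}" for f in findings)
```

A production pipeline would add chunking, citation of source passages, and human review, but the core loop is this cross-product of documents and questions.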

Challenges and Opportunities in AI Integration

The legal software space has historically been highly fragmented, with separate tools for translations, document comparisons, searching, and reviewing [00:03:13]. AI is now consolidating these functions, leading to platforms that can address a wide range of needs [00:03:22].

Market Dynamics and Industry Incentives

The legal industry traditionally saw little software development, with templating systems considered cutting-edge [00:04:49]. Law firms often handle one-off, complex projects, while in-house legal teams deal with repetitive tasks like NDA reviews [00:05:00]. AI’s broad capability across reviewing, reading, drafting, and researching makes it uniquely suited to serve this entire spectrum [00:05:23].

Law firms are increasingly motivated to adopt AI due to client pressure. Clients, who are themselves going “AI first”, expect their legal partners to leverage these tools for efficiency [00:06:11]. This pressure helps overcome the traditional “hourly billing problem,” where efficiency might seem to reduce billable hours [00:07:03]. Furthermore, there’s significant price pressure and write-offs on routine tasks like due diligence, making automation essential for profitability [00:07:28]. The competitive landscape means if one firm adopts AI for efficiency, others must follow to avoid being seen as inefficient [00:08:00].

Product Development Strategy

Lorra’s product development strategy highlights several key tenets for building a successful AI product:

  • Geographic Expansion: Starting in a smaller, fragmented market like the Nordics allowed Lorra to become a dominant player (“crocodile in a smaller pond”) before expanding globally [00:09:00]. This enabled them to be “enterprise ready” upon entering new markets like the US [00:11:57].
  • “Fast Second Mover” Advantage: By starting slightly later than some competitors, Lorra could observe what was working and what wasn’t [00:09:43]. This led to a strategic decision to focus on the “application layer” and usability, rather than expending resources on training their own large language models (LLMs) [00:10:10].
  • Customer-Centricity: Coming from a non-legal background, Lorra’s founders maintained a humble and attentive approach to client feedback, leading to a platform that addresses broad “wall-to-wall” needs rather than niche point solutions [00:10:24]. This focus on user experience has led to high adoption rates, with 70-80% adoption in deployments, a significant improvement over traditional software [00:13:15].
  • Strategic Product Release: Despite the fast-moving AI landscape, Lorra adopted a methodical approach to product release. They chose to delay sales for several months post-investment to refine the product, understanding that they “get one chance” with attorneys; a poor initial experience could lead to users not returning [00:28:28]. This contrasts with some AI companies making big promises ahead of product readiness [00:28:20].

AI labs like OpenAI, Anthropic (Claude), and Google (Gemini) are increasingly functioning as product companies, pushing innovation not only in models but also in how they interact with other software [00:14:03]. Lorra’s strategy is to avoid building features that AI labs are likely to make available as core capabilities in the future. Instead, they focus on building high-value scaffolding and workflows (like “playbooks” for negotiation rules or multi-step instructions) that leverage and enhance the underlying models [00:14:50]. This approach ensures that as models improve, the application’s functionality “just gets better” [00:14:40].
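The source gives no detail on how such “playbooks” are represented; one plausible shape is an ordered list of negotiation rules applied to each clause under review. In this sketch every name is hypothetical, and `review_clause` uses a naive keyword check as a stand-in for the model’s judgment.

```python
from dataclasses import dataclass, field


@dataclass
class Rule:
    name: str
    instruction: str   # e.g. "Liability must be capped at 12 months of fees"
    keyword: str       # naive stand-in for the model's compliance check


@dataclass
class Playbook:
    """Ordered negotiation rules applied to each clause under review."""
    name: str
    rules: list[Rule] = field(default_factory=list)


def review_clause(clause: str, rule: Rule) -> dict:
    # Stand-in for the LLM call: a real system would ask the model whether
    # the clause satisfies rule.instruction and propose a redline if not.
    return {"rule": rule.name, "flag": rule.keyword.lower() not in clause.lower()}


def run_playbook(playbook: Playbook, clauses: list[str]) -> list[dict]:
    """Run every rule against every clause; flagged items need attention."""
    return [
        {"clause": clause, **review_clause(clause, rule)}
        for clause in clauses
        for rule in playbook.rules
    ]
```

The point of the scaffolding is that the rules and workflow survive model upgrades: swapping in a better model behind `review_clause` improves results without changing the playbook.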

Challenges in Building AI Products

A significant challenge in building AI products is prioritizing among numerous high-value features while ensuring they form a cohesive platform, rather than a “Frankenstein monster” of disconnected functionalities [00:17:40]. This requires careful architectural planning, especially regarding data structures and how different functionalities interact [00:18:08]. Companies are also co-discovering the best ways to apply the technology alongside their clients [00:18:36].

Pricing and Cost Structures

Pricing models for AI applications are evolving. While seat-based models are easy to predict, the fluctuating costs of LLM usage (with better models often being more expensive) suggest a future shift towards a combination of platform fees and usage elements [00:19:37]. Companies are implementing “model pickers” to dynamically choose the most cost-effective model for a given task, optimizing between performance and expense [00:20:29].
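A “model picker” of the kind described can be reduced to a small routing rule: among the models capable enough for the task, choose the cheapest. The catalogue below is entirely hypothetical; real model names, capability scores, and prices would come from the provider.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Model:
    name: str
    quality: int          # rough capability score (higher is better)
    cost_per_mtok: float  # illustrative price per million tokens


# Hypothetical catalogue for illustration only.
CATALOGUE = [
    Model("small", quality=1, cost_per_mtok=0.15),
    Model("medium", quality=2, cost_per_mtok=1.00),
    Model("frontier", quality=3, cost_per_mtok=8.00),
]


def pick_model(required_quality: int) -> Model:
    """Return the cheapest model whose capability meets the requirement."""
    candidates = [m for m in CATALOGUE if m.quality >= required_quality]
    if not candidates:
        raise ValueError("no model meets the requirement")
    return min(candidates, key=lambda m: m.cost_per_mtok)
```

For example, a routine NDA review might route to `small`, while a complex drafting task requiring the top capability tier routes to `frontier`, keeping average cost per task low.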

AI Infrastructure and Developer Tools

The ability to give AI models access to external tools, via function calling or the Model Context Protocol (MCP), is a crucial aspect of AI infrastructure [00:20:58]. This allows models to interact with existing systems, such as redlining documents or accessing a client’s CRM, knowledge databases, or templates, vastly expanding the universe of what’s possible [00:21:13].
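The mechanics are the same in MCP and in provider function-calling APIs: the model emits a structured request naming a tool and its arguments, the application executes it, and the result is returned to the model. A minimal dispatcher, with hypothetical tool names, might look like this:

```python
import json


# Hypothetical tools: each maps a name the model may request
# to a local function that performs the action.
def redline_document(doc_id: str, change: str) -> str:
    return f"redline applied to {doc_id}: {change}"


def lookup_crm(client: str) -> str:
    return f"CRM record for {client}"


TOOLS = {"redline_document": redline_document, "lookup_crm": lookup_crm}


def dispatch(tool_call_json: str) -> str:
    """Execute one tool call of the form {"name": ..., "arguments": {...}}.

    The model emits a request like this; the application runs the tool
    and feeds the result back to the model as the next message.
    """
    call = json.loads(tool_call_json)
    fn = TOOLS.get(call["name"])
    if fn is None:
        return f"error: unknown tool {call['name']}"
    return fn(**call["arguments"])
```

Real deployments add the complexities the source alludes to later, notably authentication and authorization for each tool, which is part of why MCP is discussed more than it is shipped.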

AI Application Moats

While some early AI companies believed fine-tuning models would be a moat, the consensus has shifted towards building the “best system” that can work with various models [00:23:18]. The real moat lies in becoming a “system of record” and a central collaboration platform, much like Figma in design [00:21:50]. This involves deep integration with customer data, outside databases (case law, legislation), and seamlessly fitting into existing workflows, even if that means shifting lawyers away from traditional tools like Microsoft Word and Outlook [00:22:15].

Organizational and Cultural Aspects

Rapid growth in the AI era demands a high-velocity environment [00:26:40]. Lorra fostered this by:

  • Being commercially focused from day one [00:27:46].
  • Recruiting individuals with a strong sense of urgency and a drive to succeed [00:32:16].
  • Maintaining a fully in-office work environment to build momentum and foster a culture where “momentum breeds momentum” and people “love winning and hate losing” [00:34:07].
  • Setting clear expectations during recruitment that it’s “not a 9 to 5 job” but a mission to build for the future [00:32:48].

Future of AI Applications

Overhyped vs. Underhyped

Function/tool calling (MCP) is considered both underhyped, for its fundamental capability to create universal applications, and overhyped, because it is widely discussed but rarely implemented in full production owing to complexities like authentication [00:37:10].

Surprises in Building AI Features

A key surprise has been the wide range of user expectations, from savvy associates who know how to build workflows to those who expect the AI to “write me an SPA” (Share Purchase Agreement) with a single query [00:37:48]. This necessitates extensive training and onboarding [00:38:13].

Design Evolution

Lorra initially started with a button-and-workflow-based product but pivoted to a chat-based interface, which offered a more flexible platform for adding functions and tools that the chat could utilize [00:39:03]. This involved deleting 95% of their initial source code, a “contrarian move” that ultimately paid off [00:39:46].

Promising AI Use Cases

Beyond legal tech, contract research organizations (CROs) in pharma are identified as a massive disruption opportunity due to their manual processes, large data sets, and structured workflows [00:40:17]. The first fully AI-powered CRO could be highly successful [00:40:42].

Multimodal AI

While currently text-based, Lorra is excited about multimodal AI, particularly working with voice and audio transcripts [00:40:57]. This could allow for voice instructions and processing of audio files from depositions, transforming them into searchable documents [00:41:03].
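The “searchable documents” step can be illustrated independently of the speech-to-text stage. Assuming the deposition audio has already been transcribed into timestamped segments (the function names here are hypothetical), a simple inverted index makes the recording searchable by word:

```python
from collections import defaultdict


def index_transcript(segments: list[tuple[float, str]]) -> dict[str, list[float]]:
    """Build a word -> timestamps index from (start_time, text) segments.

    A real pipeline would first run speech-to-text on the audio; here the
    segments are assumed to be already transcribed.
    """
    index = defaultdict(list)
    for start, text in segments:
        for word in text.lower().split():
            index[word.strip(".,;:\"'")].append(start)
    return index


def search(index: dict[str, list[float]], term: str) -> list[float]:
    """Return the timestamps at which the term was spoken."""
    return index.get(term.lower(), [])
```

In practice one would use semantic rather than exact-word search, but the shape is the same: audio becomes timestamped text, and text becomes an index that jumps back to the moment in the recording.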

Future Skills for Lawyers

The skills required for lawyers are shifting dramatically. Instead of “underconfident overachievers” who are good at following instructions, the future demands “entrepreneurial, creative people” who challenge existing methods and are fluent in working with AI [00:42:30]. Lawyers will become “managers of AI agents,” requiring a different skill set than traditional diligence [00:43:06].