From: redpointai

Redpoint Ventures partners Scott Rainey, Patrick Chase, Alex Bard, and Jacob Efron discussed their approach to investing in the AI landscape at their annual meeting with limited partners [00:00:09]. The discussion covered where value is accruing, the dynamic between startups and incumbents, the impact of massive hardware investments, and the firm’s overall strategy in AI [00:00:12].

Market Overview and Economic Impact

The projected revenue for NVIDIA's data center division, which powers AI training and inference, is $177 billion [02:06]. To earn a return on that $177 billion in capital expenditure on Nvidia chips, the companies spending this money must anticipate revenues of approximately $1.5 trillion by 2032 [02:06]. That projected revenue exceeds the entire software market, which generates roughly $1.1 trillion and was built over 50 years [02:47].
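As a rough illustration of the arithmetic above, the required return can be expressed as a revenue-to-capex multiple (the multiple itself is derived here for illustration, not a figure from the discussion):

```python
# Back-of-the-envelope: revenue needed to justify AI capex.
# Inputs are the figures quoted in the discussion; the implied
# multiple is derived, not a number stated by the speakers.

capex_billions = 177               # projected Nvidia data center spend
required_revenue_billions = 1_500  # revenue buyers must generate by 2032

implied_multiple = required_revenue_billions / capex_billions
print(f"Implied revenue-to-capex multiple: {implied_multiple:.1f}x")
```

In other words, every dollar of chip spend must eventually support roughly $8.50 of downstream revenue under these assumptions.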

Alex Bard believes this projection might be understated, highlighting that making these investments is a strategic imperative for companies, regardless of immediate ROI, to avoid being left behind [04:11]. AI’s shift from making humans incrementally more efficient to “service as software” means it is doing the job of a human [04:36]. Labor budgets are often an order of magnitude larger than traditional software budgets, presenting significantly larger addressable markets for AI [04:46]. For example, spending on customer service labor is about $450 billion, far larger than the customer service software market [04:58]. AI is also expanding into markets historically underserved by software due to size or user sophistication [05:22].

Investment Focus Across AI Layers

Redpoint’s investment strategy considers three main layers of the AI landscape:

1. Model Layer

The value of model companies is increasingly tied to the products built on top of them [06:49]. Building a cutting-edge foundation model enables the creation of unique products that require state-of-the-art models [07:03]. While the cost to enter the state-of-the-art LLM game is prohibitively expensive, adjacent model categories, such as robotics, biology, and material sciences, are of interest due to different data requirements [07:22].

The DeepSeek announcement highlighted that models are becoming cheaper, which benefits application companies by improving margin structures, as inference and training costs drop approximately 10x per year [08:13]. This also showed that scale is not an enduring moat for model companies; they will likely build moats through distribution (moving up the stack with apps and agents) or specialization [08:36]. Switching costs between models are very low, as companies can switch quickly, for example, from Anthropic to DeepSeek, achieving 80-90% cost reductions [09:36].
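The compounding effect of the roughly 10x-per-year cost decline can be sketched as follows (the starting price is hypothetical; the 10x rate is the discussion's approximation, not a guarantee):

```python
# If inference costs fall ~10x per year, a workload costing
# $1.00 per 1K requests today gets dramatically cheaper.
cost_today = 1.00  # dollars per 1K requests (hypothetical)

for year in range(4):
    print(f"Year {year}: ${cost_today / 10**year:.4f} per 1K requests")

# A one-time model switch with an 80-90% cost reduction is roughly
# equivalent to jumping 0.7-1.0 years ahead on this curve
# (a 90% cut is exactly one 10x step).
```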

2. Infrastructure Layer

Historically, Redpoint expected significant opportunities in the infrastructure layer, similar to the cloud wave, which provided compute, storage, and networking infrastructure as a service, along with crucial tooling [10:06]. However, this area has been slow for two main reasons:

  • The model layer is changing rapidly, causing builder patterns to shift at the same speed [10:59].
  • Early AI adopters focused on use case discovery, leveraging the most powerful and recognized models like OpenAI and Anthropic [11:10]. Opportunities are emerging as agents come forth, which could lead to common patterns in how agents access the web and use tools [11:37].

3. Application Tier

The advent of AI has caused an explosion in companies building applications to deliver differentiated experiences [12:19]. A key differentiator for AI applications is the shift in business model from charging for a seat to charging for work done [13:34]. This creates a moment of disruption similar to the early days of SaaS [13:42].
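A toy comparison of the two pricing models (all figures hypothetical) shows why charging for work done can tap larger budgets than charging per seat:

```python
# Per-seat vs per-work pricing (all numbers hypothetical).
# Seat model: charge for access, regardless of output.
seats, price_per_seat = 50, 100              # 50 agents at $100/seat/month
seat_revenue = seats * price_per_seat

# Work model: charge for outcomes the AI delivers.
tasks_resolved, price_per_task = 20_000, 0.50  # $0.50 per resolved task
work_revenue = tasks_resolved * price_per_task

print(seat_revenue, work_revenue)  # 5000 10000.0
```

The work-based model scales with volume handled rather than headcount, which is what lets it address labor budgets instead of software budgets.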

Opportunities exist in:

  • Horizontal Applications: AI-native solutions competing with established players like HubSpot or Salesforce, benefiting from the new business model [13:52].
  • Vertical Markets: A “Cambrian explosion” of hundreds of new vertical AI SaaS businesses is emerging in markets previously underserved by compelling SaaS solutions [15:15].

Competing with Incumbents

Incumbents like Salesforce possess significant advantages including customer relationships, existing workflows, and proprietary data for fine-tuning models [29:47]. However, startups can leverage speed as an attack vector because large companies often have more lawyers than engineers, hindering their ability to rapidly adopt fast-moving underlying models and embrace new business models [31:30]. Incumbents struggle to integrate AI into old databases, infrastructure, and UX, making them vulnerable where AI fundamentally changes workflows [31:51]. For example, AI can eliminate complex logic trees in systems like customer service routing engines by using more data for better decisions [32:51]. The greater the change in workflow, the better the opportunity for startups [33:30]. Markets where the workflow does not change significantly (e.g., AI-native Notion or PowerPoint) are less attractive for startups as incumbents can deliver these features [34:16].
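The routing example might look like this in practice; `classify_intent` is a hypothetical stand-in for any model or classifier API, not a real library call:

```python
# Legacy approach: a brittle, hand-maintained routing tree that
# encodes every case explicitly and grows with each new rule.
def route_ticket_rules(ticket: dict) -> str:
    if ticket["channel"] == "phone":
        if ticket.get("vip"):
            return "priority_queue"
        return "phone_queue"
    if "refund" in ticket["subject"].lower():
        return "billing_queue"
    return "general_queue"

# AI-native approach: a single classifier call replaces the tree.
# classify_intent is a hypothetical stand-in for a model API.
def route_ticket_ai(ticket: dict, classify_intent) -> str:
    intent = classify_intent(ticket["subject"] + " " + ticket["body"])
    return f"{intent}_queue"
```

The rules version must be edited for every new case; the model version improves as the classifier sees more data, which is the workflow change that favors startups.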

The primary losers in this shift are traditional BPOs and legacy services businesses, as both startups and incumbents are consuming their market share [34:55].

AI valuations are high, with Series A rounds seeing incrementally larger sizes and substantially higher pre-money valuations [35:30]. This is partly due to the belief that AI markets will be larger, accessing labor markets and new verticals [35:58]. AI companies are also growing at an unbelievable pace and can operate much more efficiently, potentially reaching hundreds of millions in revenue with only 20-40 employees [36:15]. This capital efficiency means they might need to raise less future capital, reducing dilution for early investors [36:53].

However, high valuations come with significant challenges:

  • False Positives: Companies can spike quickly, making it hard to distinguish enduring businesses from short-term successes [37:25].
  • Increased Risk: The rapid pace of market change adds inherent risk to these investments [37:46].
  • Fund Construction: Higher prices and faster follow-on rounds impact fund size and ownership [39:26].
  • Corporate Maturity vs. Revenue Maturity: Companies can achieve significant revenue quickly (e.g., $8 million ARR in 8 months with 10 people) but lack the corporate maturity, systems, and processes of a traditional business at that revenue scale [40:11]. This means revenue, traditionally a lagging indicator, can be a leading or even misleading indicator in AI [41:12].
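The example figures above make the gap between revenue maturity and corporate maturity concrete:

```python
# Example from the discussion: $8M ARR reached in 8 months by 10 people.
arr = 8_000_000
employees = 10
months = 8

print(f"ARR per employee: ${arr / employees:,.0f}")  # $800,000
print(f"ARR added per month: ${arr / months:,.0f}")  # $1,000,000
```

A traditional business at $800K of revenue per employee would typically have years of accumulated systems and process behind it; here, almost none exists yet.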

Investment Discipline

To navigate these challenges, Redpoint maintains discipline:

  • Focus on “Tail Opportunity”: Prioritizing end markets with massive potential if successful, rather than smaller, crowded verticals [38:00].
  • Rigorous Due Diligence: Not being swayed by large numbers alone, but focusing on core fundamentals and first-principle diligence [42:42].
  • Co-processing Deals: The firm is increasingly co-processing deals between its early-stage and growth funds, as many companies raising larger first institutional rounds blur traditional stage definitions [40:50].

Key Investment Considerations

  • First Mover Advantage: Speed in achieving market prominence is crucial, as being synonymous with a category can quickly attract customers, partnerships, and capital [23:42].
  • Velocity: The ability to move and innovate quickly is paramount [24:20].
  • Moats in AI Apps: While grand data asset moats are rare, competitive advantages often derive from “a thousand little things” like user experience, product breadth, and overall polish, similar to traditional SaaS [24:38].
  • Founder Market Fit: At early stages, prioritizing founders with deep experience and unique insight into the long-term market problem over just “great builders” who move fast but lack market experience [22:31].
  • Product Depth: Looking for solutions with significant product depth that are challenging to replicate, often characterized by a higher ratio of workflow to model (e.g., 80% workflow, 20% model) [23:01].
  • Domain Expertise vs. AI Expertise: While being technical enough to understand model trends is important, deep domain expertise is increasingly valued for understanding end-user problems [25:56].
  • Durable Revenue: Distinguishing between experimental budget (companies willing to try new AI products) and securing long-term business line budgets [27:58]. User engagement and usage are key metrics for identifying durable revenue [28:50].
  • Quality Matters: Focusing on industries and use cases where quality is paramount (e.g., regulated industries like healthcare or law), as these markets are less susceptible to a “race to the bottom” on price for 80%-good solutions [17:01].

In conclusion, the AI landscape presents massive opportunities and challenges, requiring venture capitalists to be highly disciplined, analytical, and focused on long-term business fundamentals amidst rapid growth and high valuations [42:55].