From: redpointai
Redpoint Ventures, in a discussion featuring partners Scott Raney, Patrick Chase, Alex Bard, and Jacob Effron, explored the evolving AI landscape, focusing on where value is accruing, the interplay between startups and incumbents, and the firm’s investment strategy in AI [00:00:09]. The conversation highlighted significant challenges and advances in AI model development, particularly at the foundational “model layer” [00:06:08].
The Model Layer: Brains of AI Applications
The model layer consists of the large language models (LLMs) and other models that serve as the “brains” powering AI applications [00:06:10]. The true value of these model companies is realized through the products built on top of them [00:06:49]. Developing a cutting-edge foundation model enables a select few companies to build unique products, especially if their model offers a significant advantage over the best open-source alternatives [00:07:03].
High Barrier to Entry for General LLMs
Entering the state-of-the-art LLM game is prohibitively expensive [00:07:20]. This high cost limits the number of new general LLM companies that venture capital firms like Redpoint actively consider for investment [00:07:25].
Adjacent Model Categories
Despite the high barriers for general LLMs, there is significant interest in adjacent model categories that require different types of data [00:07:29]. Examples include:
- Robotics: Requires unique data to enable models to interact with the real world [00:07:35].
- Biology and Material Sciences: Anticipated to be interesting areas for model development in the coming years [00:07:46].
Commoditization of AI Models
A significant trend in AI model training and deployment is the increasing commoditization of models, highlighted most recently by developments like DeepSeek’s announcement [00:08:03].
Declining Costs
Models are becoming cheaper to run. Costs for inference (using a trained model) and training (creating a model) are dropping approximately 10x per year [00:08:21]. This reduction in cost is beneficial for application companies building on top of these models, as it leads to better margin structures [00:08:27].
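A roughly 10x annual decline compounds quickly. As a back-of-the-envelope sketch (using the 10x figure cited above as an assumption, not a precise forecast):

```python
def projected_cost(cost_today: float, years: float, annual_drop: float = 10.0) -> float:
    """Project a per-query inference cost, assuming costs fall ~10x per year."""
    return cost_today / (annual_drop ** years)

# A query costing $1.00 today would cost roughly:
print(projected_cost(1.00, 1))  # 0.1  -- about $0.10 after one year
print(projected_cost(1.00, 2))  # 0.01 -- about $0.01 after two years
```

At that rate, a workload that is marginal today can have comfortable margins within a product cycle, which is why application companies benefit so directly.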
For example, many Redpoint portfolio companies switched from Anthropic to DeepSeek, experiencing 80% to 90% cost reductions for inference within weeks of DeepSeek’s release [00:09:21].
Scale is Not an Enduring Moat
The emergence of highly performant, cheaper models demonstrates that having the biggest GPU cluster or raw scale does not guarantee an enduring competitive advantage in the model layer [00:08:36]. Model companies are now looking to build moats through:
- Distribution: Moving “up the stack” by launching their own applications and agentic tools, as seen with OpenAI’s approach [00:08:53].
- Specialization: Focusing on specific domains or use cases where specialized data and models can create unique value, such as in robotics [00:09:02].
Low Switching Costs
A key challenge in AI deployment for model companies is the remarkably low switching cost for applications [00:09:36]. If a new model emerges with similar or better performance, applications can quickly plug it in, often within days, without needing to rewrite significant code [00:09:39]. This contrasts with cloud migrations, where moving code is a complex undertaking [00:09:43].
Despite the significant investments and apparent barriers to entry in building advanced models, low switching costs mean that model companies must continually innovate to avoid commoditization and maintain their position [00:09:47]. This underscores a fundamental challenge in AI model development: the need for ongoing, rapid improvement and adaptation.
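The low switching cost is visible in how applications are wired up: most hosted models expose near-identical chat-style request shapes, so “switching” often amounts to changing an endpoint and a model name in configuration. A minimal sketch of that pattern (provider names, URLs, and model names below are illustrative placeholders, not real vendor values):

```python
# Illustrative config: endpoints and model names are placeholders, not exact vendor values.
PROVIDERS = {
    "provider_a": {"base_url": "https://api.provider-a.example/v1", "model": "model-a-large"},
    "provider_b": {"base_url": "https://api.provider-b.example/v1", "model": "model-b-chat"},
}

def build_request(prompt: str, provider: str) -> dict:
    """Build a chat-completion-style request payload for whichever provider is configured."""
    cfg = PROVIDERS[provider]
    return {
        "url": cfg["base_url"] + "/chat/completions",
        "json": {
            "model": cfg["model"],
            "messages": [{"role": "user", "content": prompt}],
        },
    }

# Switching providers is a one-word configuration change -- no application code is rewritten:
req_a = build_request("Summarize this contract.", "provider_a")
req_b = build_request("Summarize this contract.", "provider_b")
```

Because the payload shape is shared, only the endpoint and model identifier differ between the two requests; the rest of the application is untouched, which is why a swap can happen in days rather than the months a cloud migration takes.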