From: gregisenberg
The concept of the Model Context Protocol (MCP) has garnered significant attention, prompting discussions about what it means and the startup opportunities it presents [00:00:00]. Professor Ross Mike explains MCP as a crucial evolution for making Large Language Models (LLMs) more capable [00:01:18].
Understanding MCP
In programming, standards are vital for engineers to build systems that communicate effectively [00:01:52]. A well-known example is REST APIs, a convention companies follow when structuring their services so they can be connected easily [00:02:09].
Limitations of LLMs Alone
LLMs, by themselves, are incapable of performing meaningful actions like sending an email [00:02:42]. Their core function is predicting the next token of text, such as completing “My Big Fat Greek” with “Wedding” based on patterns in their training data [00:03:25].
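As a toy illustration of next-token prediction (not how a real LLM works internally), the sketch below simply picks the most frequent continuation seen in a tiny stand-in “training corpus”:

```python
from collections import Counter

# Toy stand-in for training data (illustrative only).
corpus = [
    ("my big fat greek", "wedding"),
    ("my big fat greek", "wedding"),
    ("my big fat greek", "life"),
]

def predict_next(prompt: str) -> str:
    """Return the continuation most often seen after this prompt."""
    counts = Counter(nxt for ctx, nxt in corpus if ctx == prompt)
    return counts.most_common(1)[0][0]

print(predict_next("my big fat greek"))  # -> "wedding"
```

The point of the toy: prediction alone produces text, never an action like sending an email.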
Evolution: LLMs with Tools
The next phase involved developers combining LLMs with external tools, such as APIs, enabling capabilities like searching the internet (e.g., Perplexity) [00:03:44]. This allows LLMs to become more powerful by accessing outside services [00:04:21]. For example, an LLM could be set up to create a spreadsheet entry every time an email is received, using automation services like Zapier or n8n [00:04:42].
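The tool-calling pattern behind this phase can be sketched in a few lines: the LLM emits a structured request as text, and surrounding code parses it and dispatches to a real function. The tool name and arguments here are hypothetical:

```python
import json

# Hypothetical tool: a real system would call an email API here.
def send_email(to: str, subject: str) -> str:
    return f"email queued for {to}: {subject}"

TOOLS = {"send_email": send_email}

# Pretend the LLM produced this structured tool call as its output text.
llm_output = json.dumps({"tool": "send_email",
                         "args": {"to": "a@example.com", "subject": "hi"}})

# Glue code: parse the call and dispatch to the matching function.
call = json.loads(llm_output)
result = TOOLS[call["tool"]](**call["args"])
print(result)
```

The LLM still only predicts text; the dispatch code is what turns that text into an action.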
However, connecting multiple tools to an LLM to build a sophisticated assistant (like an “Iron Man level Jarvis”) becomes frustrating and cumbersome [00:05:09]. Stacking these tools cohesively is a significant challenge, making LLMs, despite their cool factor, “very, very dumb” without proper integration [00:05:38]. Different service providers construct their APIs differently, making it feel like “gluing a bunch of different things together,” which is difficult to scale [00:08:10]. Even minor updates to an API can break complex, step-by-step automations, causing significant engineering nightmares [00:10:18].
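A minimal sketch of why this glue is fragile, assuming two imaginary providers that return the same kind of data in different shapes:

```python
# Two imaginary providers expose the "same" capability with different shapes.
provider_a = {"status": "ok", "data": {"rows": [["alice", "a@x.com"]]}}
provider_b = {"result": [{"name": "bob", "email": "b@y.com"}]}

# Per-provider glue: each needs its own adapter, and a provider-side
# rename (say, "rows" becoming "records") silently breaks its adapter.
def from_a(resp):
    return [{"name": n, "email": e} for n, e in resp["data"]["rows"]]

def from_b(resp):
    return resp["result"]

contacts = from_a(provider_a) + from_b(provider_b)
print(len(contacts))
```

Every added service means another bespoke adapter, which is the scaling problem described above.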
MCP as the Solution
MCP introduces a translation layer between the LLM and various services/tools [00:08:38]. This layer unifies the “different languages” of various tools into a single, comprehensive language that the LLM understands, simplifying connection and access to external resources [00:08:45]. This eliminates much of the manual work and step-by-step planning required in the previous evolution, reducing edge cases where systems can fail [00:09:36]. MCP unifies the LLM and the service, creating an efficient communication layer [00:10:45].
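One way to picture the translation layer is a registry that exposes every tool behind one uniform shape, so the LLM only ever speaks a single “language.” This is an illustrative sketch, not the actual MCP wire format:

```python
# Every tool, whatever its native API, is wrapped in one uniform shape:
# a name, a description the LLM can read, and a single call convention.
class UnifiedTool:
    def __init__(self, name, description, fn):
        self.name, self.description, self.fn = name, description, fn

    def call(self, **kwargs):
        return self.fn(**kwargs)

registry = {
    t.name: t
    for t in [
        UnifiedTool("query_db", "Run a read-only query",
                    lambda sql: f"ran {sql}"),
        UnifiedTool("send_email", "Send an email",
                    lambda to: f"sent to {to}"),
    ]
}

# The client side: list what exists, then call by name -- same shape always.
listing = [(t.name, t.description) for t in registry.values()]
print(registry["send_email"].call(to="a@example.com"))
```

Because every tool presents the same interface, adding a new service no longer requires new bespoke glue on the LLM side.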
The MCP Ecosystem
The MCP ecosystem consists of four main components [00:11:02]:
- MCP Client: The LLM-facing side (e.g., Tempo, Windsurf, Cursor) [00:11:13].
- Protocol: The two-way connection between the client and the server [00:11:29].
- MCP Server: Translates the external service’s capabilities to the client [00:11:34].
- Service: The external tool or data source (e.g., database, API) [00:11:41].
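The client–server exchange over the protocol can be sketched as JSON-RPC-style messages. MCP does use JSON-RPC 2.0 and defines methods such as `tools/list`, but treat the exact fields below as illustrative:

```python
import json

# Client -> server: ask which tools the server can translate.
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}

# Server -> client: advertise the service's capabilities in a uniform shape.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"tools": [{"name": "query_db",
                          "description": "Run a read-only query",
                          "inputSchema": {"type": "object"}}]},
}

wire = json.dumps(request)  # what actually crosses the protocol
tools = response["result"]["tools"]
print(tools[0]["name"])
```

The two-way connection is just these message pairs flowing between client and server.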
A key aspect of this architecture, designed by Anthropic, is that the MCP server responsibility now falls on the service providers [00:11:51]. This means if a company builds a database and wants LLMs to access it, they are responsible for constructing the MCP server [00:12:03]. This has led many external service providers to build their own MCP servers and repositories [00:12:36]. Essentially, MCP is a standard that companies and engineers are adopting to ensure interoperability and scalability, allowing LLMs to become capable of “important stuff” [00:12:51].
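The provider's new responsibility can be pictured as a small dispatcher that wraps an existing service (here, a fake database) and answers tool requests. This is a stdlib-only sketch, not the official MCP SDK; the method names mirror the spec, but the handler and tool are hypothetical:

```python
# Stand-in for the company's existing service: a tiny "database".
FAKE_DB = {"users": 3}

def handle(method: str, params: dict):
    """Answer tool requests on behalf of the wrapped service."""
    if method == "tools/list":
        return [{"name": "count_rows",
                 "description": "Count rows in a table"}]
    if method == "tools/call" and params.get("name") == "count_rows":
        return FAKE_DB[params["arguments"]["table"]]
    raise ValueError(f"unknown method: {method}")

print(handle("tools/call",
             {"name": "count_rows", "arguments": {"table": "users"}}))
```

Because every provider exposes this same request shape, any MCP client can use the service without bespoke glue.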
Current Challenges
Despite its promise, there are technical challenges [00:13:50]. Setting up an MCP server can be annoying, involving many local steps like downloading and moving files [00:13:59]. Once these kinks are resolved, or the standard is updated and polished, LLMs will become even more capable [00:14:08].
Startup Opportunities with MCP
Historically, the popularization of protocols like HTTP and SMTP has opened new opportunities for businesses [00:15:51]. While it’s still early days for MCP, understanding it is crucial for future ventures [00:19:28].
For Technical Founders
- MCP App Store: An idea for a technical founder is to create an “MCP App Store” [00:16:38]. This platform would allow users to browse different MCP servers from repositories, click “install” or “deploy,” receive a specific URL, and then paste that URL into an MCP client [00:16:45].
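The browse-install-paste flow described above could be sketched as follows; every name and URL here is hypothetical:

```python
# Toy registry of MCP servers a store might let users browse.
REGISTRY = {
    "postgres-mcp": "https://repo.example.com/postgres-mcp",
    "email-mcp": "https://repo.example.com/email-mcp",
}

def deploy(server_name: str) -> str:
    """Pretend-deploy a server from the registry; return the hosted URL
    the user would paste into their MCP client."""
    if server_name not in REGISTRY:
        raise KeyError(server_name)
    return f"https://hosted.example.com/{server_name}"

url = deploy("postgres-mcp")
print(url)  # the user pastes this URL into an MCP client
```

The value of such a store is hiding the local setup steps (downloads, file moves) behind a single "deploy" click.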
For Non-Technical Founders
For non-technical individuals, the advice is to stay updated on platforms adopting MCP capability and observe the direction of the standards [00:17:28]. It’s too early for concrete business decisions, as the standard is not fully finalized and might be challenged or updated by entities like Anthropic or OpenAI [00:17:41].
- Strategic Observation: The current phase calls for observation and learning [00:19:11]. When the standard is finalized and service providers fully integrate MCP, seamless integration will become possible [00:17:57]. This will make it significantly easier to integrate LLMs with external tools, akin to stacking “Lego pieces” [00:18:40].
- Focus on User Experience: While building integrated LLM solutions is challenging (requiring significant engineering hours to ensure cohesion, speed, and limit hallucinations), MCP will simplify the integration aspect [00:18:12].
MCP, as a standard for LLMs, makes them more capable by simplifying their connection to external services [00:15:29]. While the technical challenges are being worked out, keeping a close watch on the evolution of this standard is key for future entrepreneurial opportunities [00:19:31].