From: redpointai
The advent of AI, particularly large language models (LLMs) like ChatGPT, has dramatically reshaped the landscape for businesses, pushing companies to rethink their strategies and product development. Intercom, a customer support platform, exemplifies this rapid adaptation, swiftly rolling out AI-powered products like Fin within months of ChatGPT’s release [00:00:10]. This shift highlights a critical turning point for how AI will integrate into and transform core business functions.
Rapid AI Adoption: Intercom’s Case Study
Intercom’s journey into aggressive AI adoption began immediately after ChatGPT’s launch [00:00:32]. The company’s AI/ML team in Dublin, Ireland, quickly recognized the profound implications of conversational AI, which excels at fact-finding, understanding, and summarizing information – core tasks of customer support representatives [00:02:16]. This led to a strategic decision to “rip up the entire AI/ML road map” and go “all in” on this new technology [00:01:46].
The timeline of Intercom’s initial AI product rollout was notably swift:
- Before Christmas (2022): Shipped initial AI features [00:01:58].
- January (2023): Had a “reasonable release” [00:02:02].
- March (2023): Launched Fin, their user-facing chatbot, initially [00:02:03].
- July (2023): Broadly launched Fin [00:02:06].
This rapid pace was driven by the understanding that if Intercom didn’t lead in integrating AI for customer support, another company would, potentially rendering traditional support models obsolete [00:02:40].
Incorporating AI into Intercom Products
Intercom’s approach to integrating AI followed a “crawl, walk, run” strategy [00:00:22]:
-
“Zero Downside” AI Features (Inbox AI): The first step involved building AI features into their inbox, such as summarizing conversations, translating messages, and expanding/collapsing text [00:03:55]. These features, initially powered by GPT 3.5 Turbo [00:03:43], provided real value for customer support tasks like ticket creation [00:04:20]. The key was to offer optional features where users could choose whether to engage, minimizing risk if the AI wasn’t perfect [00:04:40].
-
User-Facing Chatbot (Fin): The next significant release was Fin, a user-facing chatbot [00:05:34]. This became possible with access to GPT-4 beta, which offered better control over “hallucinations” – instances where the AI generates incorrect or irrelevant information [00:05:20]. Key concerns addressed included ensuring the bot was trustworthy, reliable, and stayed on topic, avoiding controversial or competitor-endorsing opinions [00:05:47]. Fin answers questions based on a high-confidence threshold [00:05:40].
-
Enhanced Inbox AI and Future Developments: Inbox AI was later expanded with features like adjusting response tone to match a company’s brand [00:06:14]. Future plans include further integration of Fin within the inbox and the ability for Fin to answer support emails [00:31:58].
Challenges and Strategies in Enterprise AI Deployment
Deploying AI at scale in enterprises involves several key challenges:
- Guardrails and Hallucination Prevention: The core strategy involves rigorous “torture tests” to identify misbehaviors and ensure desired outcomes [00:06:57]. This includes prompting the model to prioritize specific contexts and resolve conflicts in information [00:08:29]. There’s a trade-off: more constrained models may be less creative and occasionally miss correct answers [00:07:23]. Intercom tests various models (GPT 3.5, GPT-4, Anthropic’s Claude, Llama) against scenarios, evaluating them on trust, cost, reliability, stability, uptime, malleability, and speed [00:09:04].
- Cost Optimization vs. Exploration: While initial ideas like automatically summarizing all 500 million monthly conversations would be prohibitively expensive [00:05:00], Intercom remains primarily in “deep exploration mode” rather than cost optimization [00:11:02]. The focus is on building the best possible product, assuming that technology will generally become cheaper and faster [00:14:43]. Cost optimization will become a priority when models plateau [00:15:12].
- Latency: Speed is a significant challenge, as AI responses can feel slow, akin to “modem internet days” [00:12:27]. On-device LLMs (e.g., Apple, Google’s Gemini) are expected to usher in “instant AI” [00:12:48].
- Missing Tooling: While many companies are building their own solutions, there’s a recognized need for better tooling, particularly in prompt management (versioning, AB testing, etc.) [00:16:11]. Robust infrastructure for AI deployment (e.g., server location for EU compliance) is also crucial [00:17:00].
- Uncertainty in AI/ML Projects: Unlike traditional software development where uncertainty often lies in the design phase, AI/ML projects introduce a second wave of uncertainty: whether a feature is even technically possible [00:23:01]. This requires viewing AI initiatives as a “portfolio of bets” with varying probabilities of success [00:23:39].
Organizational Structure for AI
Intercom has a centralized AI/ML team of 17-20 people with deep domain expertise in building, running, and training models [00:19:36]. This central team enables approximately 150 “regular product engineers” who then build user-facing features on top of the AI team’s capabilities [00:20:20].
This centralized approach is recommended for companies that are either:
- AI-first startups: Working on the bleeding edge of AI and pushing new possibilities [00:21:08].
- Companies whose existence depends entirely on AI: Building new product categories enabled by LLMs [00:21:21].
For companies merely “applying AI” as a minor enhancement, product engineers with some Open AI familiarity might suffice [00:21:47].
Role of AI in Transforming Job Functions and Workflows
AI is poised to transform existing job functions and workflows in business operations:
- Pilot Mode to Broad Rollouts: Many companies begin AI adoption in “pilot mode,” perhaps using it for weekend support or with free users [00:26:33]. As value becomes apparent, companies shift from “AI curious to all in on AI” [00:26:57]. The ultimate goal is to make adoption a “dip your toe” experience rather than a “trustfall” [00:27:09].
- Automation Levels: The percentage of requests handled by AI will vary by industry and product complexity [00:33:17].
- High Automation (e.g., E-commerce): Simple product types with limited, common queries (e.g., “where is my order,” “refunds”) could see nearly 100% AI automation [00:33:42].
- Lower Automation (e.g., Complex Software): Products like Google Docs, with thousands of diverse questions, may reach 80-90% automation, but not 100% [00:34:02].
- Beyond Textual Answers (AI Actions): Future applications of AI in workplace automation extends beyond text answers to AI performing actions (e.g., logging into Stripe to issue a refund) [00:35:06]. This requires significant software development for authentication, monitoring, and data logging [00:38:05].
- Human-in-the-Loop: Some businesses may opt for a “human-in-the-loop” model where AI proposes actions, but a human manager approves or modifies them, similar to a self-service checkout needing ID verification [00:36:23]. This allows for human judgment on complex cases or customer history [00:36:43].
AI and the Future of Work: Shifting Product Landscapes
AI is fundamentally changing the product landscape, creating both threats and opportunities.
For Startups
Startups should target areas where the incumbent technology stack is “irrelevant” [00:39:00]. This means finding industries where:
- An incumbent, if starting today, would build their product “entirely differently” with AI at its core, making existing UI and features obsolete [00:41:43].
- The fundamental value proposition is so transformed by AI that incumbent advantages (e.g., email sending infrastructure) become secondary to the AI-driven core [00:40:33].
An example of a potentially disrupted product is an advertising optimization startup where AI could autonomously generate, deploy, measure, and optimize ads, leaving minimal human interaction [00:39:15].
For Incumbents
Larger companies need a systematic approach to AI integration [00:42:05]:
- Find Asymmetric Upside: Start with simple, low-risk AI features to build expertise (e.g., automatically generating project titles) [00:42:19].
- Workflow Analysis: Break down the entire product into individual workflows (e.g., creating a project, assigning tasks) [00:42:40].
- Automate and Delete: If AI can reliably perform a workflow above customer accuracy thresholds, AI must do it, and the old manual components should be removed [00:43:04]. This is what startups will do.
- Augment and Reduce: If AI cannot fully remove a workflow, it should augment it or reduce it to a simple decision set, massively increasing efficiency [00:43:32].
- Sprinkle AI: Lastly, add “salt and pepper” AI features for a complete offering [00:43:51].
- Educate Customers: Learn how to effectively sell and explain the value of these AI transformations to customers [00:44:12].
Overhyped and Underhyped Aspects of AI
- Overhyped: Productivity tools focused on generating content like emails or sales pitches [00:44:35]. The belief is that people will learn to detect AI-generated content, and filters will become more sophisticated, rendering this approach less effective long-term [00:44:42].
- Underhyped: The transformative changes coming to creativity [00:44:59]. Just as Instagram’s filters empowered everyone to feel like photographers [00:45:02], tools like Midjourney, Refusion (for sound), and Synthesia (for video) are enabling new forms of creativity that are yet to be fully understood [00:45:22].
Industry Adoption and Future Outlook
Some incumbents have impressed with their AI adoption:
- Adobe was quick to integrate AI, which was crucial for their position [00:46:11].
- Figma and Miro have successfully found useful and valuable AI use cases [00:46:21].
Conversely, Apple and Amazon have been surprisingly slow to fully adopt AI, especially considering their existing voice assistants like Siri and Alexa [00:46:48]. The hope for 2024 is a “leveling out” where widely adopted consumer products normalize advanced AI interactions, similar to how the iPhone normalized modern software design [00:47:43]. This will make AI adoption a competitive battleground, with consumers expecting AI-enriched software [00:49:53].
The strategic decision to go “all in” on AI, as demonstrated by Intercom, is seen as crucial for gaining a first-mover advantage [00:49:19]. This contrasts with companies that only hire one person to “think about” AI [00:49:13]. The shift from exploration to cost optimization in AI models is still ongoing [00:52:51], but the increasing power and decreasing cost of models are enabling previously impossible features [00:53:36]. As AI continues to evolve, understanding its capabilities and applying it iteratively to core business functions will be key for companies to thrive [00:53:57].