From: redpointai

Intercom, a company deeply rooted in customer support, offers a unique perspective on navigating the rapidly evolving AI landscape, balancing early exploration against eventual cost optimization. Des Traynor, co-founder and Chief Strategy Officer of Intercom, shared insights into their journey, highlighting the strategic decisions made since the launch of ChatGPT [00:00:10].

Intercom’s Immediate AI Shift

Upon ChatGPT’s release, Intercom recognized customer support as a “kill zone” for AI due to large language models’ inherent conversational, fact-finding, and summarization capabilities [00:02:17]. The company swiftly decided to “rip up the entire AI/ML roadmap” and go “all in” on the new technology, shipping their first AI product before Christmas and launching “Fin” in March [00:01:46][00:02:03].

Their initial strategy involved launching “zero downside” AI features into their inbox, such as conversation summarization, message translation, and text expansion/collapse [00:03:39][00:03:52]. These features were designed so that if users didn’t find them useful, they simply wouldn’t click the button [00:04:40].

Early Cost Realizations

This initial exploration quickly brought cost considerations to the forefront [00:04:53]. With Intercom handling 500 million conversations monthly, automatically summarizing all of them would have been prohibitively expensive, potentially bankrupting the company [00:04:56][00:05:05]. This immediately made them realize that certain features, while desirable, were not yet affordable by default [00:10:44].
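The scale problem above is easy to see with back-of-envelope arithmetic. The sketch below uses the 500 million conversations per month figure from the interview; the token count and per-token price are illustrative assumptions, not Intercom’s actual numbers.

```python
# Rough cost of summarizing every conversation automatically.
# Only CONVERSATIONS_PER_MONTH comes from the interview; the other
# figures are illustrative assumptions.
CONVERSATIONS_PER_MONTH = 500_000_000   # figure cited in the interview
TOKENS_PER_SUMMARY_CALL = 1_500         # assumed avg input + output tokens
PRICE_PER_1K_TOKENS = 0.03              # assumed GPT-4-era price (USD)

monthly_cost = (
    CONVERSATIONS_PER_MONTH * TOKENS_PER_SUMMARY_CALL / 1_000 * PRICE_PER_1K_TOKENS
)
print(f"${monthly_cost:,.0f} per month")  # → $22,500,000 per month
```

Even with generous assumptions, summarizing by default lands in the tens of millions of dollars per month, which is why such features were gated behind a button click instead.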

Current Stance: Deep Exploration Over Cost Optimization

Despite these early cost insights, Intercom remains in a “deep exploration mode” [00:11:03]. The primary reason is the abundance of opportunities to embed AI across their product, from reporting and human agent augmentation to messaging [00:11:32].

While acknowledging that some calls could be moved from more expensive models like GPT-4 to cheaper ones like GPT-3.5 or even open-source models like Llama, this isn’t their immediate focus [00:11:12]. Their strategy prioritizes building the “best possible product” first, assuming that technology will generally become cheaper and faster over time [00:13:53][00:14:43].

Latency as a Forcing Function

For Intercom, latency is a more pressing concern than raw cost. The current feel of AI interactions can be like “modem internet days,” and improving speed is a top priority [00:12:27][00:13:20]. The expectation is that AI, especially with on-device models from companies like Apple and Google, will become instantaneous [00:12:44][00:12:55].

When to Shift to Cost Optimization

Intercom anticipates a shift towards cost optimization and refining existing solutions once the underlying AI models begin to “plateau,” meaning incremental improvements become less significant (e.g., GPT-7 is only marginally better than GPT-6) [00:15:12]. This is when it will make sense to optimize what they have rather than constantly reimagining the solution [00:15:22].

Challenges and Future of AI Adoption

The “Portfolio of Bets” Approach

Developing with AI involves a unique challenge: the uncertainty of whether a feature is even possible [00:23:05]. Unlike traditional software, where uncertainty is largely in the design phase, AI projects can involve a “second wave” of uncertainty where the technical feasibility is unknown even after design [00:23:01]. Therefore, AI development needs to be treated as a “portfolio of bets,” with some having high probability (e.g., text expansion) and others being lower probability but high-impact (e.g., generating complex graphics) [00:23:39][00:23:51].

Phased Rollout and Customer Adoption

Intercom’s approach to rolling out AI features to customers emphasizes a “crawl, walk, run” strategy, allowing clients to “dip their toe in” rather than a “trust fall” [00:27:09]. They enable piloting features for specific user segments (e.g., free users, weekend support) to demonstrate value and build trust [00:27:21]. This often leads to customers rapidly expanding AI usage once they see the benefits, sometimes even realizing their free users are getting better support than paid ones due to instant answers [00:28:09][00:28:11].

The widespread adoption of AI by major consumer platforms like Apple and Google is expected to normalize the concept of “talking to software,” making customers more accepting of AI-enriched applications [00:30:29][00:30:35].

Finetuning vs. RAG

Currently, Intercom’s AI products, like Fin, primarily use Retrieval-Augmented Generation (RAG) rather than extensive finetuning per customer [00:31:32]. Adjustments for tone of voice are achieved through prompting rather than finetuning [00:31:37]; Fin naturally picks up the tone of voice by reading customer documentation and support conversations [00:32:50].
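The RAG pattern described above can be sketched in a few lines: retrieve the most relevant help-center passages, then ground the model’s answer in them via the prompt. The word-overlap scorer stands in for a real vector search, and the prompt wording is hypothetical, not Intercom’s implementation.

```python
# Minimal RAG sketch: keyword retrieval + prompt assembly.
# Scoring and prompt text are illustrative assumptions.

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank docs by naive word overlap with the query (stand-in for vector search)."""
    q = set(query.lower().split())
    return sorted(docs, key=lambda d: -len(q & set(d.lower().split())))[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    # Tone of voice is steered in the prompt itself, not by finetuned weights.
    return (
        "Answer in the company's friendly support tone, using only this context:\n"
        f"{context}\nQuestion: {query}"
    )

help_center = [
    "Refunds are processed within 5 business days.",
    "You can export reports as CSV from the dashboard.",
]
print(build_prompt("How long do refunds take?", help_center))
```

The design point is that per-customer behavior comes from what is retrieved and how the prompt is framed, so no model weights need to change per customer.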

Automation Levels and Future Capabilities

The percentage of requests handled by AI will vary significantly by industry vertical [00:33:25]. Simple e-commerce queries might see 100% automation, while more complex products like Google Docs might only reach 80-90% due to diverse query types [00:33:52][00:34:06].

A significant future development area for AI in customer support is enabling AI to take actions (e.g., issuing refunds, canceling orders) rather than just providing text answers [00:35:08][00:35:23]. This requires substantial software development, including authentication, monitoring, and data logging [00:38:05]. Some customers will embrace full automation, while others may prefer a human oversight model, where AI proposes actions for a manager’s approval [00:35:58][00:36:23].
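The human-oversight model described above can be sketched as a simple approval gate: the AI proposes an action, and policy decides whether it executes immediately or waits for a manager. The class names, threshold, and audit format are hypothetical, not Intercom’s design.

```python
# Sketch of AI-proposed actions with optional human approval.
# All names and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ProposedAction:
    kind: str          # e.g. "refund", "cancel_order"
    amount: float      # monetary value at stake
    approved: bool = False  # set True once a manager signs off

def execute(action: ProposedAction, auto_approve_limit: float, log: list[str]) -> bool:
    """Run the action only if it is small enough to auto-approve or a human approved it."""
    if action.amount <= auto_approve_limit or action.approved:
        log.append(f"executed {action.kind} for ${action.amount:.2f}")  # audit trail
        return True
    log.append(f"queued {action.kind} for manager approval")
    return False

audit: list[str] = []
execute(ProposedAction("refund", 15.0), auto_approve_limit=50.0, log=audit)   # runs
execute(ProposedAction("refund", 400.0), auto_approve_limit=50.0, log=audit)  # queued
```

Note how the monitoring and data-logging requirements mentioned above show up even in this toy version: every decision, executed or queued, leaves an audit entry.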

Strategic Lessons for Companies

For Startups

Startups entering the AI space should identify areas where the incumbent technology stack is largely irrelevant [00:41:34]. This means finding product areas where, if rebuilt today, the entire system—features, UI, and underlying architecture—would be designed entirely differently because of AI capabilities [00:41:46][00:42:01]. An example of an entire product category ripe for replacement is advertising optimization [00:39:34].

For Incumbents

Larger companies should adopt a structured approach to integrate AI [00:43:54]:

  1. Find Asymmetric Upside: Start with simple AI features that offer quick wins and help the organization learn about costs and latency [00:42:19].
  2. Workflow Analysis: Break down the entire product into individual workflows [00:42:38].
  3. Automate or Delete: If AI can reliably and accurately perform an entire workflow, let AI do it and remove the old process [00:43:04].
  4. Augment or Simplify: If AI cannot fully remove a workflow, but can augment it or reduce it to a simple decision, implement that [00:43:32].
  5. Sprinkle AI: For other areas, “sprinkle” AI elements to ensure the offering looks comprehensive [00:43:51].
  6. Sell the Value: Crucially, learn how to communicate and sell the value of these AI enhancements to customers [00:44:12].

Overhyped vs. Underhyped AI Areas

Des Traynor identified the following:

  • Overhyped: AI productivity tools focused on generating content like emails or sales pitches [00:44:35]. He believes people will learn to detect AI-generated content, and the focus on “good writing” will return [00:44:42].
  • Underhyped: The transformative impact of AI on creativity [00:44:56]. Referencing Instagram’s filters democratizing photography, he highlights tools like Kaiber, Riffusion (for sound), and Synthesia (for video) as ushering in new forms of creativity that are yet to be fully understood [00:45:01][00:45:22][00:45:27].

Impressive and Disappointing Incumbents

Beyond Microsoft and Google, Adobe, Figma, and Miro were cited as impressive incumbents for quickly integrating useful AI features [00:46:11][00:46:21]. On the other hand, Apple and Amazon were noted for their surprisingly slow adoption of advanced AI capabilities in products like Siri and Alexa, which currently feel primitive compared to tools like ChatGPT [00:46:48][00:46:50][00:47:30]. Des hopes for a “leveling out” in 2024 as these major players inevitably integrate more sophisticated AI [00:47:44].