From: redpointai
Alexandru Costin, the VP of Generative AI and Sensei at Adobe, discussed the company’s approach to integrating AI across its product suite, the development and impact of the Firefly models, and the future of AI in creative workflows [00:00:07]. Costin, who started his career as an entrepreneur and co-founded InterAKT (acquired by Adobe 18 years ago), has been instrumental in Adobe’s AI strategy and product development [00:01:10] [00:01:45].
Adobe’s AI Ecosystem
Adobe’s AI initiatives span its three main business units:
- Creative Cloud (e.g., Photoshop, Illustrator, Adobe Express) [00:02:53]
- Document Cloud (e.g., Acrobat) [00:02:59]
- Experience Cloud (e.g., content management, analytics, CRM data hosting) [00:02:59]
Within Creative Cloud, models are trained on the Sensei platform, Adobe’s internal infrastructure, enabling researchers to develop models that are then integrated into products [00:03:32] [00:03:35]. Examples of existing AI-driven tools include:
- Photoshop: Auto-masking, content-aware fill, object detection, and neural filters [00:03:51]. Neural filters, which allow intent-driven editing (e.g., making someone smile or changing their age), were an early step in reinventing the content authoring process using AI [00:12:15].
- Document Cloud (Acrobat): Liquid Mode, which uses AI to reflow documents for easier mobile reading [00:04:06].
- Experience Cloud: AI capabilities for anomaly detection in analytics and marketing content generation in Adobe Experience Manager (CMS) [00:04:29].
Firefly Suite: The Generative AI Leap
In 2022, Adobe decided to fully embrace larger models, including diffusion models and large language models (LLMs), leading to the development of the Firefly Suite [00:04:52] [00:05:07]. Launched in beta in March 2023, Firefly saw tremendous success immediately, with over a billion images generated across platforms [00:05:17] [00:05:26].
Key aspects of Firefly:
- Integration: Available on firefly.adobe.com, deeply integrated into Photoshop (Generative Fill), Adobe Express, and upcoming integrations in Illustrator [00:05:31].
- Purpose: Designed to make customers more productive and creative [00:05:58].
- Workflow Impact: Adobe believes most content will be “touched” by AI through editing or generation, with creative professionals remaining in control of the process and content curation [00:09:12] [00:09:21].
User Experience Design
Adobe emphasizes a user-friendly approach to AI integration:
- Prompt-based interfaces: While lowering the entry barrier, Adobe acknowledges the need for more control beyond simple prompts for their professional users [00:12:45] [00:13:09].
- Mixed interaction: The ideal interaction is a mix of language, pointing, and sliders [00:13:19].
- Generative Fill in Photoshop: Users select an area and can describe what they want to happen (remove, insert, retouch) [00:15:36]. A contextual bar appears near the selection, and if no prompt is entered, the rest of the image serves as an implied prompt [00:22:50].
- Addressing cold start:
- For consumers (firefly.adobe.com): An inspiration stream allows users to remix existing prompts and creations, providing a starting point [00:18:56].
- Adobe Style Engine: This feature allows users to select styles, light conditions, and composition rules via curated options, simplifying the creation of specific aesthetics without complex prompts [00:20:00]. This preserves the scene’s structure while changing its style [00:21:07].
- For communicators/marketers (Adobe Express): Generative capabilities are baked into templates, enabling quick social content creation [00:21:50].
- For enterprises: AI provides contextual recommendations for content generation (e.g., text for a blog post) while respecting brand voice and compliance [00:23:41] [00:24:52].
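The Adobe Style Engine described above can be thought of as curated prompt augmentation: preset style, lighting, and composition modifiers are combined with the user's intent so nobody has to hand-craft prompt keywords. The sketch below is a minimal illustration of that idea; all names, presets, and the `build_prompt` helper are assumptions for illustration, not Adobe's actual API.

```python
# Hypothetical sketch of a style-engine-like prompt builder: curated
# presets are appended to the user's prompt, so the scene description
# stays intact while the style changes. Names are illustrative only.

STYLE_PRESETS = {
    "steampunk": "steampunk, brass machinery, Victorian detail",
    "watercolor": "watercolor painting, soft washes, paper texture",
}
LIGHT_PRESETS = {
    "golden_hour": "golden hour lighting, warm low sun",
    "studio": "studio lighting, softbox, neutral background",
}

def build_prompt(user_prompt, style=None, light=None, composition=None):
    """Combine the user's intent with curated preset modifiers."""
    parts = [user_prompt.strip()]
    if style:
        parts.append(STYLE_PRESETS[style])
    if light:
        parts.append(LIGHT_PRESETS[light])
    if composition:
        parts.append(composition)
    return ", ".join(parts)

print(build_prompt("a lighthouse on a cliff",
                   style="watercolor", light="golden_hour"))
```

Because the scene text is kept verbatim and only modifiers are appended, the structure of the composition is preserved while its aesthetic shifts, matching the behavior described at [00:21:07].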
Trust, Safety, and Bias Reduction
Adobe prioritizes trust and safety, aiming to minimize bias and harmful content in its generated output [00:26:19].
- Training Data: Firefly models are trained exclusively on content from Adobe Stock, which undergoes a rigorous automated and manual curation process to remove harmful content [00:26:32] [00:26:41]. This approach reduces risks of IP infringement and harmful content compared to internet-scraped data [00:27:10].
- Bias Mitigation: While stock data inherently contains some bias, Adobe has invested significantly in debiasing techniques [00:28:02]. They developed a “person detector” model to identify job references in prompts and then introduce a fair distribution of skin tones, genders, and age groups based on the request’s country of origin [00:29:19] [00:30:02].
- Harmful Content Filters: Toxicity detector models, deny lists, block lists, and NSFW (Not Safe for Work) filters are implemented to prevent inappropriate content generation, especially concerning children [00:30:45] [00:31:31].
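The debiasing flow described above (detect a job reference, then inject a fair mix of person attributes) can be sketched roughly as follows. This is a simplified illustration under stated assumptions: the keyword check stands in for Adobe's learned person detector, and the attribute lists and prompt rewriting are invented for the example, not Adobe's implementation.

```python
import random

# Illustrative sketch of debiasing a person-generation request: when a
# prompt references a job title, sample presentation attributes uniformly
# instead of letting the model reproduce stock-data bias.

JOB_TITLES = {"doctor", "nurse", "engineer", "teacher", "ceo"}
SKIN_TONES = ["type I", "type II", "type III", "type IV", "type V", "type VI"]
AGE_GROUPS = ["young adult", "middle-aged", "older adult"]
GENDERS = ["woman", "man", "non-binary person"]

def mentions_person_job(prompt):
    # Stand-in for a learned "person detector"; a keyword check here.
    return any(w.strip(".,") in JOB_TITLES for w in prompt.lower().split())

def debias_prompt(prompt, rng=random):
    """Append uniformly sampled person attributes to job-related prompts."""
    if not mentions_person_job(prompt):
        return prompt
    age = rng.choice(AGE_GROUPS)
    gender = rng.choice(GENDERS)
    tone = rng.choice(SKIN_TONES)
    return f"{prompt}, depicted as a {age} {gender}, skin tone {tone}"

print(debias_prompt("a portrait of a doctor"))
print(debias_prompt("a mountain at dawn"))  # unchanged: no person/job reference
```

In production, the sampled distribution would additionally be conditioned on the request's country of origin, as Costin describes [00:30:02].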
Customer Feedback and Model Refinement
Adobe uses customer feedback to continuously refine its models:
- Firefly.adobe.com as RLHF System: The firefly.adobe.com community engagement website serves as a Reinforcement Learning from Human Feedback (RLHF) system [00:34:00] [00:35:02].
- Data Collection: Adobe’s terms of use for firefly.adobe.com allow prompts and generated images to be stored for training [00:34:01].
- Feedback Signals: Explicit signals (like/dislike, reports) and implicit signals (download, save, share) are collected and logged [00:34:18]. This data will be used to teach future Firefly models to generate content users prefer and avoid disliked content [00:34:40].
- Internal Testing: Prior to public launch, extensive internal testing with Adobe employees helped identify and address biases and issues [00:28:29].
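The signal-collection loop above can be sketched as a simple event log that aggregates explicit and implicit feedback into a scalar preference label for later training. The event names and weights below are assumptions chosen for illustration; Adobe's actual logging schema is not public.

```python
from dataclasses import dataclass, field

# Hedged sketch of logging feedback on generated images and folding
# explicit signals (like/dislike/report) and implicit signals
# (download/save/share) into one preference score. Weights are invented.

SIGNAL_WEIGHTS = {
    "like": 1.0, "download": 0.8, "share": 0.6, "save": 0.5,
    "dislike": -1.0, "report": -2.0,
}

@dataclass
class GenerationRecord:
    prompt: str
    image_id: str
    events: list = field(default_factory=list)

    def log(self, event):
        self.events.append(event)

    def preference_score(self):
        """Aggregate explicit and implicit signals into one scalar label."""
        return sum(SIGNAL_WEIGHTS.get(e, 0.0) for e in self.events)

rec = GenerationRecord("castle at sunset", "img_001")
rec.log("like")
rec.log("download")
print(round(rec.preference_score(), 2))  # 1.8
```

Records scored this way could serve as the preference data that teaches future Firefly models to favor liked outputs and avoid disliked ones, per [00:34:40].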
Scaling and Organizational Structure
Deploying AI at scale required significant organizational and technological effort:
- Long-term Vision: The concept of “Gen Shop” (Generative Photoshop) originated in Adobe Research as early as 2019, initially using GANs (Generative Adversarial Networks) before the advent of diffusion models [00:37:11].
- Neural Filters Team: This team was created to accelerate the productization and tech transfer of AI research into Photoshop, building expertise in distributed model training and optimized inference [00:37:35] [00:38:10].
- Horizontal and Vertical Structure:
- Horizontal: Adobe Research (developing models), the Sensei team (providing distributed training platform and curated data sets), Applied Science and Machine Learning Group (refining models), and Gen Services team (creating efficient APIs for models) [00:41:40] [00:42:19] [00:42:50] [00:43:09].
- Vertical: “Tech transfer subgroups” within each product line (Photoshop, Adobe Express, Illustrator) utilize these horizontal APIs to build compelling user experiences, understanding the core workflows that benefit most from AI [00:44:17] [00:44:59]. This “AI Super Highway” approach enables rapid innovation and deployment [00:45:03].
- Technological Stack: Adobe leverages open-source technologies like PyTorch 2 and various inference optimization techniques (distillation, pruning, quantization) to manage costs and achieve fast response times [00:43:25] [00:47:51].
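Of the optimization techniques listed, quantization is the easiest to show concretely: weights stored as 32-bit floats are mapped to 8-bit integers plus a scale and zero point, shrinking memory and speeding up inference at a small accuracy cost. The hand-rolled sketch below only illustrates the arithmetic; real deployments would use framework tooling (e.g., PyTorch's quantization APIs) rather than this.

```python
# Minimal sketch of post-training affine int8 quantization, one of the
# inference optimizations mentioned alongside distillation and pruning.

def quantize_int8(values):
    """Map floats to int8 with a per-tensor scale and zero point."""
    lo, hi = min(values), max(values)
    scale = (hi - lo) / 255 or 1.0          # avoid div-by-zero for constants
    zero_point = round(-128 - lo / scale)
    q = [max(-128, min(127, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    return [(qi - zero_point) * scale for qi in q]

weights = [-0.52, 0.13, 0.0, 0.91, -1.2]
q, s, zp = quantize_int8(weights)
approx = dequantize(q, s, zp)
# Reconstruction error is bounded by about half the scale per weight.
print(max(abs(a - b) for a, b in zip(weights, approx)))
```

Each stored value drops from 4 bytes to 1, which is the kind of cost and latency win that makes serving diffusion models at Firefly's scale tractable.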
Future Vision for Creative Cloud
Alexandru Costin envisions software companies evolving into AI companies, where AI becomes an integral part of the software stack [00:49:11]. For Creative Cloud, the future involves:
- Hyper-Personalization: AI will enable both large and small enterprises to generate infinite, personalized content for individual viewers or segments, addressing the “content velocity” or “content supply chain” problem [00:50:29] [00:50:53].
- Shift in Creative Roles: Customers will transition from visual designers to “creative directors” of meta-experiences that are hyper-personalizable [00:51:25].
- Fluid Human-Computer Interaction: The creative experience will become more intuitive, less convoluted, and accessible to broader audiences [00:51:51].
Ultimately, Adobe aims for a more accessible Creative Suite powered by AI, enabling content creation with significantly higher creativity and velocity [00:52:05] [00:52:14].