From: redpointai

Enterprises embarking on AI adoption face significant governance and data security considerations in order to deploy AI technologies safely and in a trustworthy way [00:13:00]. Baris Gencel, Head of AI at Snowflake, outlines the steps companies take to integrate AI into their operations effectively and securely [00:00:03].

Establishing Trust and Control

For many large companies, the first step is a review by their AI governance board to ensure they are comfortable with the chosen AI platform [00:13:10]. The primary focus is trust: confirming that data security and data governance principles are respected and that policies acceptable to the company are in place [00:13:18]. Snowflake addresses these concerns by running AI systems directly next to the data, inside its secure environment [00:13:35].

Snowflake’s Integrated Approach

Snowflake’s core AI offering, Cortex, is an inference engine that runs large language models, including Snowflake’s own Arctic model and models from providers such as Mistral and Meta [00:10:29]. A significant part of the value proposition is that all models and inference run directly inside Snowflake, right next to the customer’s data [00:16:37].
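As a concrete illustration of this setup, the sketch below calls a Cortex-hosted model through SQL from Python, so inference happens inside Snowflake next to the data. The connection parameters, the support_tickets table, and the model choice are hypothetical placeholders, not details from the episode.

```python
# Minimal sketch: calling a Cortex-hosted LLM via SQL so inference runs
# inside Snowflake, next to the data. Connection details, the table, and
# the model are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
    database="<database>",
    schema="<schema>",
)

# SNOWFLAKE.CORTEX.COMPLETE runs the model server-side; the ticket text
# never leaves the Snowflake environment.
query = """
SELECT
    ticket_id,
    SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large',
        CONCAT('Summarize this support ticket in one sentence: ', ticket_text)
    ) AS summary
FROM support_tickets
LIMIT 10
"""

with conn.cursor() as cur:
    cur.execute(query)
    for ticket_id, summary in cur.fetchall():
        print(ticket_id, summary)

conn.close()
```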

For customers who opt to use external models, Snowflake supports that path as well, though it is not the default configuration [00:16:44].

Granular Access Controls

Governance is built into Snowflake from the ground up, allowing customers to set very granular access controls [00:16:59]: access can be granted precisely at the level of individual database objects [00:18:07].
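The sketch below illustrates the kind of object-level grants and row-level policies this refers to, issued as standard Snowflake SQL from Python. The role, schema, table, and policy names are made up for the example.

```python
# Illustrative sketch of granular access controls: object-level grants plus a
# row access policy. All names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>"
)

statements = [
    # Object-level grants: the HR_ANALYST role can read one table and nothing else.
    "GRANT USAGE ON DATABASE hr TO ROLE hr_analyst",
    "GRANT USAGE ON SCHEMA hr.people TO ROLE hr_analyst",
    "GRANT SELECT ON TABLE hr.people.employee_records TO ROLE hr_analyst",

    # Row-level restriction: non-admin roles only see rows for their own department.
    """
    CREATE OR REPLACE ROW ACCESS POLICY hr.people.department_policy
      AS (department STRING) RETURNS BOOLEAN ->
        CURRENT_ROLE() = 'HR_ADMIN' OR department = CURRENT_ROLE()
    """,
    "ALTER TABLE hr.people.employee_records "
    "ADD ROW ACCESS POLICY hr.people.department_policy ON (department)",
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)

conn.close()
```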

One example is an HR chatbot whose responses must differ depending on who asks the question, leaving very little room for hallucination or data leakage [00:17:11]. For AI applications such as search, access controls are deeply integrated, so the search system only surfaces documents a user is authorized to view [00:18:21]. Snowflake reuses its existing data access controls: if a user cannot access certain data, they do not receive it through the AI system either [00:18:42]. Many customers have spent years setting up their data governance, and that investment becomes a major advantage when building AI on top of it [00:18:52].
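A minimal sketch of that pattern follows, assuming the AI layer queries Snowflake under the asking user's own role so that existing governance decides what the model can see. The tables, model choice, and keyword-based retrieval are illustrative simplifications, not the actual Cortex Search API.

```python
# Sketch of the pattern described above: retrieval runs under the end user's
# own Snowflake role, so existing access controls decide what reaches the
# model. Names and the naive keyword search are illustrative only.
import snowflake.connector


def answer_hr_question(user_role: str, question: str) -> str:
    conn = snowflake.connector.connect(
        account="<account>", user="<service_user>", password="<password>",
        role=user_role,  # session scoped to the asking user's role
    )
    try:
        with conn.cursor() as cur:
            # Rows or tables the role cannot access are never returned, so
            # they cannot leak into the prompt below.
            cur.execute(
                "SELECT doc_text FROM hr.people.policy_documents "
                "WHERE CONTAINS(doc_text, %s) LIMIT 5",
                (question,),
            )
            context = "\n\n".join(row[0] for row in cur.fetchall())

            cur.execute(
                "SELECT SNOWFLAKE.CORTEX.COMPLETE('llama3.1-8b', %s)",
                (f"Answer using only this context:\n{context}\n\nQuestion: {question}",),
            )
            return cur.fetchone()[0]
    finally:
        conn.close()
```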

Addressing Challenges in AI Deployment

Despite the progress, several challenges remain in widespread enterprise AI adoption:

  • Hallucinations: These remain a concern [00:28:05], and in regulated industries they can be catastrophic [00:16:10].
  • Measurement and Quality: Measuring AI system performance is not yet mature [00:28:11], and achieving high quality in complex scenarios such as text-to-SQL remains a concern [00:28:18].
  • User Comfort: Product innovation is needed to help end-users become comfortable with AI answers that may not always be right, and to provide mechanisms for checking correctness [00:15:49].
  • Uncharted Territory: Agentic systems are emerging, representing new and uncharted territory for businesses [00:28:26].

Evaluation and Observability

To build high-quality production systems, enterprises need robust evaluation platforms [00:13:53]. Through its acquisition of TruEra and TruEra’s open-source product TruLens, Snowflake offers an observability and LLM evaluation platform that helps customers assess system performance at scale, often using LLMs as judges [00:14:10]. This eases concerns about evaluating systems, which is crucial for moving from proofs of concept (POCs) to production deployments [00:14:56].
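The sketch below shows the LLM-as-judge idea in a library-agnostic way rather than through the TruLens API; the judge prompt, grading scale, and model choice are assumptions for illustration.

```python
# Library-agnostic sketch of LLM-as-judge evaluation. This is not the TruLens
# API; the prompt, 1-5 scale, and judge model are assumptions.
import snowflake.connector

JUDGE_PROMPT = """You are grading a question-answering system.
Question: {question}
Retrieved context: {context}
System answer: {answer}
On a scale of 1 to 5, how well is the answer supported by the context?
Reply with only the number."""


def judge_groundedness(cur, question: str, context: str, answer: str) -> int:
    prompt = JUDGE_PROMPT.format(question=question, context=context, answer=answer)
    cur.execute("SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', %s)", (prompt,))
    # Assumes the judge replies with a bare number, as instructed above.
    return int(cur.fetchone()[0].strip())


conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>"
)
with conn.cursor() as cur:
    score = judge_groundedness(
        cur,
        question="What is our parental leave policy?",
        context="Employees receive 16 weeks of paid parental leave.",
        answer="Parental leave is 16 weeks, fully paid.",
    )
    print("groundedness score:", score)
conn.close()
```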

Policy and Guardrails

Companies require reassurances and guardrails on what models can do, especially since responses are unscripted [00:27:29]. This includes ensuring alignment from both a policy perspective and a brand perspective [00:27:37]. Snowflake provides additional guardrails through products like Cortex Guard, which utilizes technologies such as Llama Guard [00:27:14].
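A rough sketch of switching such a guardrail on follows, assuming Cortex Guard is enabled through a guardrails option on the COMPLETE call; the option name and the shape of the response should be verified against current Snowflake documentation.

```python
# Sketch of applying a guardrail to an unscripted model response. Assumes
# Cortex Guard is enabled via a 'guardrails' option on COMPLETE; treat the
# option name and response format as assumptions to verify.
import json
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>"
)

query = """
SELECT SNOWFLAKE.CORTEX.COMPLETE(
    'mistral-large',
    [{'role': 'user', 'content': 'Draft a short, polite reply to a customer asking about a delayed order.'}],
    {'guardrails': TRUE}
)
"""

with conn.cursor() as cur:
    cur.execute(query)
    # With the options form, COMPLETE returns a JSON document rather than a
    # bare string; print it as-is instead of assuming a particular field layout.
    response = json.loads(cur.fetchone()[0])
    print(json.dumps(response, indent=2))

conn.close()
```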

While internal use cases for productivity and analysis are transitioning from POCs to production, external use cases still require more confidence before large-scale deployment [00:15:10]. The timeline for widespread internal deployments depends on increasing comfort with the technology and continued product innovation [00:15:40].