Snowflake Summit 2024 - Snowflake refines its platform to help customers exploit emerging AI use cases

Snowflake has announced a slew of new capabilities for its AI Data Cloud that it hopes digital and business leaders will use to make the most of their enterprise information.

Across two keynote sessions at Snowflake Summit 2024 in San Francisco, CEO Sridhar Ramaswamy and his executive colleagues outlined a range of new features for the company’s cloud-based platform. Ramaswamy, who was appointed CEO in February, explained to conference delegates how his company is developing its technology to help businesses exploit Artificial Intelligence (AI) and Machine Learning, saying:

The AI Data Cloud is the platform for enterprise AI. It makes AI easy, efficient and trusted. We’ve permanently accelerated our pace of delivery. Whether it's in generative AI or MLOps, you know that every one of our innovations is deeply integrated into our platform.

At a briefing before the start of the event, Ramaswamy explained how he hopes the features the company is releasing – from chat experiences to AI models – will help CIOs use Snowflake as an integrated platform for business-focused innovation:

We created Snowflake to be a single, unified, holistic product that is easy to use and gets the job done. We've evolved this into a data cloud, a cloud platform, but one that’s specifically designed for data. One way to place the announcements of this Summit is in terms of innovations in the core of the data, and there are also lots of innovations in the layer outside, which is about collaboration.

Embracing AI innovations

Ramaswamy emphasized how more enterprise customers are eager to exploit emerging technology. At the Summit, Snowflake announced enhancements to Cortex AI, the company’s Large Language Model (LLM) and vector search service, and Snowflake ML, an integrated set of capabilities for end-to-end machine learning.

The new features in Cortex AI include chat experiences that should help organisations develop chatbots quickly. Snowflake unveiled two new chat capabilities, Snowflake Cortex Analyst and Snowflake Cortex Search, which will allow users to build chatbots that work with structured and unstructured data, explained Ramaswamy:

For the past year, I've met a lot of our customers. And the number one topic that comes up in these meetings is how we can help them with AI. AI is opening up enormous possibilities because, for the first time, every person or organization can talk to their data using natural language. And in just a few years, the new normal is going to be that we're going to be able to tell every app what we want.

Cortex Analyst, built with Meta’s Llama 3 and Mistral Large models, allows businesses to build applications securely on top of their analytical data in Snowflake. Cortex Search harnesses retrieval and ranking technology from Neeva to help customers draw on text-based datasets. Ramaswamy said that Snowflake wants to help people build applications from a broad range of assets:

Our new products are about data, collaboration, AI, and applications. We're making it easy for partners and customers to develop applications on top of Snowflake and make us truly a platform. These are the things that we are incredibly excited about.
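To give a flavour of how Cortex AI surfaces LLMs as built-in functions rather than separate services, the sketch below calls the SNOWFLAKE.CORTEX.COMPLETE SQL function from Python via the Snowflake connector. It is a minimal illustration rather than code from the announcement: the connection details, the reviews table, and the choice of the mistral-large model are placeholder assumptions.

```python
# Minimal sketch (not from the keynote) of calling a Snowflake Cortex LLM
# function from Python. Assumes the snowflake-connector-python package and a
# Cortex-enabled account; connection details and the REVIEWS table are
# hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="...",            # placeholder
    warehouse="my_wh",
    database="my_db",
    schema="public",
)

try:
    cur = conn.cursor()
    # SNOWFLAKE.CORTEX.COMPLETE runs a hosted LLM (here mistral-large) as a
    # SQL function, so the prompt and the rows are processed inside the platform.
    cur.execute(
        """
        SELECT SNOWFLAKE.CORTEX.COMPLETE(
            'mistral-large',
            CONCAT('Summarize this customer review in one sentence: ', review_text)
        ) AS summary
        FROM reviews
        LIMIT 5
        """
    )
    for (summary,) in cur.fetchall():
        print(summary)
finally:
    conn.close()
```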

Enhanced data foundations

Snowflake also announced advancements to its AI Data Cloud platform. The company wants to make it easier for customers to work with data, models, and applications, including using Apache Iceberg, the open-source format for analytic tables, to implement open and flexible architectural patterns.

The move towards Apache Iceberg Tables comes alongside the introduction of Polaris Catalog, a vendor-neutral, open catalog for Apache Iceberg that enables cross-engine interoperability. Organisations can run Polaris Catalog in Snowflake’s AI Data Cloud or self-host it in their own infrastructure using containers, said Ramaswamy:

I'm delighted to make this commitment to open catalogs to deliver interoperability with AWS, Google Cloud, Azure, and many other industry-leading providers.
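As a rough sketch of what the Iceberg direction means in practice, the example below creates a Snowflake-managed Apache Iceberg table whose data and metadata are written in the open Iceberg format to an external volume in the customer’s own object storage, where other engines can reach the same table through a catalog such as Polaris Catalog. All identifiers are hypothetical, and the exact options available will depend on the account.

```python
# Rough sketch (assumptions labelled) of creating a Snowflake-managed Apache
# Iceberg table. The external volume MY_ICEBERG_VOL must already point at the
# customer's own cloud object storage; all identifiers here are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="...",            # placeholder
    warehouse="my_wh",
    database="my_db",
    schema="public",
)

# Data and metadata are written in the open Iceberg format to the external
# volume, so other query engines can read the table via an Iceberg catalog.
conn.cursor().execute(
    """
    CREATE ICEBERG TABLE sales_events (
        event_id NUMBER,
        event_ts TIMESTAMP_NTZ,
        amount   NUMBER(10, 2)
    )
    CATALOG = 'SNOWFLAKE'
    EXTERNAL_VOLUME = 'MY_ICEBERG_VOL'
    BASE_LOCATION = 'sales_events/'
    """
)
conn.close()
```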

As part of its support for AI workloads, Snowflake is also refining Snowflake Horizon, the company’s unified set of compliance, security and access capabilities. The enhancements should ensure enterprises can use internal and third-party sourced content safely. Snowflake co-founder Benoît Dageville said during the keynote sessions that security is key to making the most of emerging technology:

Unified security and governance at every layer is critical. We are talking about AI for your enterprise, not AI for planning your next vacation.

Supporting new use cases

For companies looking to build on the Snowflake data platform, the vendor announced a new collaboration with Nvidia that customers and partners can harness to build customized AI applications. Snowflake hopes the collaboration will help businesses support bespoke AI use cases, said Ramaswamy:

Together, Snowflake and Nvidia are helping enterprises turn generative AI from an aspiration into a reality. Nvidia is underpinning many of our partner solutions in Snowflake. We are collaborating to make data processing both cheaper and faster.

Ramaswamy introduced Jensen Huang, founder and CEO of Nvidia, who spoke to the conference via a video link. It was announced that Snowflake has adopted Nvidia AI Enterprise software to integrate NeMo Retriever micro-services into Snowflake Cortex AI. Huang said the integration will help organisations connect custom models to diverse business data and produce more accurate responses:

The most important asset of a company is its proprietary data, and this data sits on Snowflake. We can now connect extremely large and proprietary data directly to your ability to chat with it. And, of course, once you can chat with it, you can connect to a whole bunch of other mobile services and AIs.

It was also announced that the enterprise-grade LLM Snowflake Arctic is now fully supported with Nvidia TensorRT-LLM software. Arctic is also available as an Nvidia NIM inference micro-service, one of Nvidia’s pre-built AI containers, allowing developers to access the Arctic model in Snowflake. Huang said companies should use these innovations to embrace change:

AI is moving at a much faster rate than Moore’s Law. The pace isn’t increasing twice every two years, but twice every six months. It is clear now that we're likely to move even faster. You want to take your most important business processes, all of the most important things that you do, and connect that effort into a flywheel so that you will capture all that new data.
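For a sense of what shipping Arctic as a NIM looks like from a developer’s seat, the sketch below queries a NIM container assumed to be running locally and serving the model through its OpenAI-compatible REST endpoint. The host, port, and model identifier are illustrative assumptions, not values taken from the announcement.

```python
# Minimal sketch of calling an Nvidia NIM inference micro-service assumed to be
# running locally and serving the Arctic model. The URL and model identifier
# are hypothetical placeholders; NIM containers expose an OpenAI-compatible
# chat completions endpoint.
import requests

response = requests.post(
    "http://localhost:8000/v1/chat/completions",   # assumed local NIM endpoint
    json={
        "model": "snowflake/arctic",               # placeholder model id
        "messages": [
            {"role": "user", "content": "Summarise last quarter's sales trends."}
        ],
        "max_tokens": 256,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```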
