Snowflake adds AI & ML Studio, new chatbot features to Cortex

By Anirban Ghoshal

Cloud-based data warehouse company Snowflake is adding more large language model (LLM) capabilities and generative AI services to Cortex, its fully managed AI service.

Cortex, showcased in November last year as part of the company’s AI Data Cloud platform, provides enterprises with the building blocks to use LLMs and AI without requiring expertise in managing complex GPU-based infrastructure.

The updates, announced at the company’s annual Snowflake Summit, include a new AI and machine learning (ML) studio along with new chatbot-building capabilities such as Cortex Analyst and Cortex Search.

No-code Cortex Playground to bring enterprise data to LLMs

Cortex AI & ML Studio, also dubbed Cortex Playground, is a no-code interface within Cortex that lets enterprises bring their data to LLMs from providers such as Google, Meta, Mistral, and Reka, as well as Snowflake’s own Arctic model.

Cortex Playground, currently in private preview, is expected to help enterprises accelerate the development of AI applications and provide an interface for finding the most capable yet cost-effective LLM for a specific use case, the company said.

According to analysts, however, the launch of Cortex Playground is a response to several challenges, including the need to play catch-up with rival Databricks.

“Snowflake is introducing the no-code interface now likely due to the increasing pressure from the market and competitors who have already embraced such user-friendly development environments,” said Steven Dickens, vice president at research and advisory services firm The Futurum Group.

“This move is a reaction to the success of platforms like Databricks, which offer comprehensive AI and machine learning development tools, and to counteract the perception of Snowflake as an overly expensive and complex solution,” Dickens added.

Databricks provides a low-code interface through its collaborative notebooks and integrations with MLflow.

The data lakehouse platform provider forayed into the low-code space three years ago when it acquired German startup 8080 Labs to integrate the data science tool bamboolib into its lakehouse platform, said Hyoun Park, chief analyst at Amalgam Insights.

Another rival, Oracle, also has been expanding its low-code development capabilities with Oracle APEX, analysts said.

Even so, Dickens believes that Cortex Playground might help Snowflake retain some market share and attract users.

To further help enterprises enhance LLM performance and deliver personalized experiences, Snowflake has introduced Cortex Fine-Tuning, which is currently in public preview.

Cortex Fine-Tuning can be accessed through Playground or a simple SQL function, the company said, adding that the serverless customization is available for a subset of Meta and Mistral AI models.
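Snowflake did not detail the syntax in its announcement, but the sketch below illustrates how the SQL-function route to Cortex Fine-Tuning might be driven from Python. The connection parameters, table and model names, and the exact FINETUNE argument list shown here are assumptions for illustration and may differ from what ships.

```python
# Hypothetical sketch: starting a serverless Cortex fine-tuning job via the SQL
# function from Python. Connection details, object names, and the FINETUNE
# argument list are assumptions for illustration.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",    # placeholder account identifier
    user="my_user",          # placeholder credentials
    password="***",
    warehouse="my_wh",
    database="my_db",
    schema="public",
)
cur = conn.cursor()

# Kick off fine-tuning of a supported Mistral base model on prompt/completion
# pairs stored in an ordinary Snowflake table.
cur.execute("""
    SELECT SNOWFLAKE.CORTEX.FINETUNE(
        'CREATE',
        'my_db.public.support_assistant',                -- name for the tuned model
        'mistral-7b',                                     -- supported base model
        'SELECT prompt, completion FROM training_pairs'   -- training data query
    )
""")
print(cur.fetchone()[0])  # job identifier that can be polled for status

# Once training finishes, the tuned model is invoked like any other Cortex model.
cur.execute("""
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'my_db.public.support_assistant',
        'How do I reset my account password?'
    )
""")
print(cur.fetchone()[0])
```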

Analyst, Search to boost Cortex’s chatbot-building abilities

Snowflake has also added two capabilities, Cortex Analyst and Cortex Search, that boost Cortex’s chatbot development and will help enterprises build bots that can answer natural-language questions about an enterprise’s structured and unstructured data.

Cortex Analyst, built with Meta’s Llama 3 and Mistral Large models, allows enterprises to build applications on top of their analytical data in Snowflake, according to the company.

Introducing Cortex Analyst helps Snowflake maintain a more closed, Snowflake-focused ecosystem while providing a gateway to open source LLMs, Park said.

However, Dickens pointed out that rivals such as Databricks, Elastic, Oracle, and Teradata already offer similar capabilities, and that the move appears reactionary, driven by the need to counter criticism of Snowflake’s high costs and limited flexibility.

Cortex Analyst is expected to enter public preview soon and has been used by customers such as Zoom and Bayer.
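Snowflake positions Cortex Analyst as an API that applications call with natural-language questions about governed, structured data. The sketch below is a hypothetical illustration of such a call; the REST endpoint path, the payload shape, and the semantic model file referenced are assumptions, not confirmed details of the preview.

```python
# Hypothetical sketch: an application asking Cortex Analyst a natural-language
# question about structured data. The endpoint path, payload shape, and the
# semantic model file on a stage are assumptions for illustration.
import requests

ACCOUNT_URL = "https://my_account.snowflakecomputing.com"  # placeholder
AUTH_TOKEN = "***"  # placeholder OAuth or key-pair JWT token

payload = {
    # Conversation-style input: the user asks a question in plain English.
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What was revenue by region last quarter?"}
            ],
        }
    ],
    # A semantic model file describing the analytical tables the bot may query.
    "semantic_model_file": "@my_db.public.models/revenue_semantic_model.yaml",
}

resp = requests.post(
    f"{ACCOUNT_URL}/api/v2/cortex/analyst/message",
    json=payload,
    headers={"Authorization": f"Bearer {AUTH_TOKEN}"},
    timeout=60,
)
resp.raise_for_status()

# The response is expected to contain generated SQL plus a textual explanation.
print(resp.json())
```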

Cortex Search, meanwhile, combines Neeva’s retrieval and ranking technology with Snowflake’s Arctic embed model to deliver enterprise-grade hybrid search, a mix of vector and keyword search, as a service, helping enterprise users build applications against documents and other text-based datasets.

Snowflake acquired Neeva in May last year to add generative AI-based search to its AI Data Cloud.

Cortex Search, according to Park, will help enterprises bridge gaps between their initial understanding of documents as data and the ability to translate documents into AI capabilities.

“Although application development on top of traditional structured data is quite common, the ability to build apps on top of documents and unstructured data is still not common in the business world,” Park said.
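The sketch below illustrates, hypothetically, how a Cortex Search service over a table of documents might be created and queried from Python. The DDL options, the SEARCH_PREVIEW helper, and all object names are assumptions for illustration.

```python
# Hypothetical sketch: creating a Cortex Search service over a table of document
# text and running a hybrid (vector plus keyword) query against it. DDL options,
# the SEARCH_PREVIEW helper, and all object names are assumptions for illustration.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",  # placeholders
    warehouse="my_wh", database="my_db", schema="public",
)
cur = conn.cursor()

# Build a search service over the CONTENT column of a documents table.
cur.execute("""
    CREATE OR REPLACE CORTEX SEARCH SERVICE support_docs_search
        ON content
        ATTRIBUTES doc_title
        WAREHOUSE = my_wh
        TARGET_LAG = '1 hour'
        AS SELECT content, doc_title FROM support_documents
""")

# Ask the service for the passages most relevant to a natural-language query.
cur.execute("""
    SELECT PARSE_JSON(
        SNOWFLAKE.CORE.SEARCH_PREVIEW(
            'my_db.public.support_docs_search',
            '{"query": "how do I rotate my API keys?", "columns": ["content", "doc_title"], "limit": 5}'
        )
    )
""")
print(cur.fetchone()[0])
```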

Several Cortex features to become generally available soon

At this year’s Snowflake Summit, the company promised to move a slew of features and capabilities to general availability soon.

One such feature is Snowflake’s Cortex Guard, which leverages Meta’s Llama Guard, an LLM-based input-output safeguard, to filter and flag harmful content, such as violence and hate, self-harm, or criminal activity, across enterprise data and assets.
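A minimal sketch of what applying Cortex Guard to a model call could look like follows, assuming the safeguard is switched on through an option of the COMPLETE SQL function; that option name, the model, and the connection details are assumptions for illustration.

```python
# Hypothetical sketch: calling a Cortex model with Cortex Guard applied so that
# harmful prompts or responses are filtered. The 'guardrails' option name and
# the filtered-response behavior are assumptions for illustration.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",  # placeholders
    warehouse="my_wh",
)
cur = conn.cursor()

cur.execute("""
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large',
        [{'role': 'user', 'content': 'Summarize this support ticket: ...'}],
        {'guardrails': TRUE}   -- assumed switch that applies Llama Guard filtering
    )
""")

# If the safeguard flags the exchange, the answer is expected to be replaced
# with a filtered-response message rather than raw model output.
print(cur.fetchone()[0])
```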

Another feature moving to general availability soon is Document AI, showcased in June last year.

Document AI, built on technology from Snowflake’s acquisition of Applica last year, is targeted at helping enterprises make more use of unstructured data, the company said, adding that its underlying LLM can help enhance enterprise productivity by generating insights from documents.

Snowflake Copilot, a text-to-SQL assistant, will also hit general availability soon, the company said, adding that the Copilot combines Mistral Large with Snowflake’s proprietary SQL generation model to accelerate productivity for every SQL user.

Other updates include Snowflake moving the Model Registry feature, introduced in June last year, to general availability.

Model Registry, according to the company, is a unified repository for an enterprise’s machine learning models. It’s designed to let users centralize the publishing and discovery of models, thereby streamlining collaboration between data scientists and machine learning engineers.
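The sketch below shows what logging and retrieving a model through the Model Registry might look like, assuming the Registry API in the snowflake-ml-python package; the session parameters, toy model, and names used are placeholders.

```python
# Hypothetical sketch: logging a scikit-learn model to Snowflake's Model Registry
# and running it for inference. Connection parameters, names, and the toy training
# data are placeholders; the Registry API assumes the snowflake-ml-python package.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from snowflake.snowpark import Session
from snowflake.ml.registry import Registry

session = Session.builder.configs({
    "account": "my_account", "user": "my_user", "password": "***",  # placeholders
    "warehouse": "my_wh", "database": "my_db", "schema": "public",
}).create()

# Train a toy model locally on placeholder data.
X = pd.DataFrame({"USAGE_MINUTES": [10.0, 20.0, 300.0, 400.0]})
y = pd.Series([0, 0, 1, 1], name="CHURNED")
model = LogisticRegression().fit(X, y)

# Publish the model so other teams can discover and reuse it.
registry = Registry(session=session)
registry.log_model(
    model,
    model_name="churn_classifier",  # placeholder name
    version_name="v1",
    sample_input_data=X,            # used to infer the model's signature
)

# Later, retrieve the registered version and score new data with it.
mv = registry.get_model("churn_classifier").version("v1")
print(mv.run(X, function_name="predict"))
```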

In addition, Snowflake announced the Snowflake Feature Store, an integrated capability targeted at data scientists and ML engineers to help create, store, manage, and serve consistent ML features for model training and inference.
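Similarly, the sketch below assumes the FeatureStore, Entity, and FeatureView classes from snowflake-ml-python to illustrate registering a refreshable set of features; all names, connection parameters, and the feature query are placeholders.

```python
# Hypothetical sketch: registering an entity and a refreshable feature view in the
# Snowflake Feature Store so the same features can be served for training and
# inference. Names, connection parameters, and the feature query are placeholders;
# the API assumes the snowflake-ml-python package.
from snowflake.snowpark import Session
from snowflake.ml.feature_store import CreationMode, Entity, FeatureStore, FeatureView

session = Session.builder.configs({
    "account": "my_account", "user": "my_user", "password": "***",  # placeholders
    "warehouse": "my_wh", "database": "my_db", "schema": "public",
}).create()

fs = FeatureStore(
    session=session,
    database="my_db",
    name="customer_features",   # schema backing the feature store
    default_warehouse="my_wh",
    creation_mode=CreationMode.CREATE_IF_NOT_EXIST,
)

# An entity defines the join keys that features attach to.
customer = Entity(name="customer", join_keys=["CUSTOMER_ID"])
fs.register_entity(customer)

# A feature view wraps a query whose results are materialized, refreshed, and served.
features_df = session.sql(
    "SELECT customer_id, AVG(order_total) AS avg_order_total "
    "FROM orders GROUP BY customer_id"
)
fv = FeatureView(
    name="customer_order_features",
    entities=[customer],
    feature_df=features_df,
    refresh_freq="1 day",       # refresh the materialized features daily
)
fs.register_feature_view(feature_view=fv, version="v1")
```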

The company said Snowflake Feature Store is currently in public preview, adding that it was releasing another feature dubbed ML Lineage in private preview.

According to Snowflake, ML Lineage will allow enterprise teams to trace the usage of features, datasets, and models across their complete lifecycles.
