
May 8, 2024

Raqib Rasheed
Software Engineer

AI for Business Gets a Focused Champion: Snowflake Launches Arctic LLM

AI buzz has been swirling for more than two years now – ever since the first ChatGPT release. But admit it, the real question is this: after all the hype and celebration, will it actually help businesses?

Large Language Model APIs have already changed the way business is done. Even without dedicated enterprise-tailored LLMs, the results and progress have been remarkable. But what if an LLM were designed exclusively for business insights and decision-making? And customized for every enterprise, at that?

Meet Arctic LLM by Snowflake – claiming to deliver top-tier enterprise intelligence at an incredibly low training cost! Everything else you are about to read about Arctic is just as striking, because:

  1. It is one of the first LLMs designed exclusively for enterprise-grade applications.
  2. The model is completely open source and released under an Apache 2.0 license. Not only is the model open, but Snowflake's AI researchers have also shared an entire write-up on Arctic's development process with the public – taking openness and transparency to 'AI' levels!
  3. AI development is driving a large surge in spending for companies, as massive amounts of data are required to train these AI models on business specifics. Arctic is built on a Mixture-of-Experts (MoE) architecture that brings training costs down significantly – and it was created in just three months!
  4. Snowflake is a data cloud vendor, so using its own LLM raises the security bar considerably, since customer data never has to leave the platform.

1. What Enterprise-Grade Services?

"LLMs in the market do well with world knowledge, but our customers want LLMs to do well with enterprise knowledge," notes Baris Gultekin, Snowflake's head of product for AI.

Arctic was designed to be particularly good at enterprise applications such as SQL code generation and instruction following, and its creators stress that it was built to serve the needs of businesses rather than the general public.

Proprietary models such as ChatGPT and Google Gemini, like most open-source models, are trained on public data. Complex questions about historical events? These models can generate an entire thesis. But ask whether an individual business's sales are trending up or down – they may have no idea!

To assimilate specific details about a business and produce accurate, well-informed decisions and responses, a model must be trained on that particular business's proprietary data. This is typically done through fine-tuning. With domain-specific input, generative AI models can support better decision-making and improved efficiency – and Arctic is now an option built to do exactly this.
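
To make that concrete, here is a minimal, illustrative Python sketch of parameter-efficient fine-tuning with LoRA adapters, using the Hugging Face transformers, peft, and datasets libraries. This is not Snowflake's documented procedure: the model ID, the data file, and the hyperparameters are placeholders, and a model of Arctic's size would in practice be tuned on dedicated infrastructure rather than a single machine.

```python
# Illustrative LoRA fine-tuning sketch on proprietary business text.
# Placeholders: base model ID, data path, and hyperparameters.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)

base_model = "Snowflake/snowflake-arctic-instruct"  # placeholder; any causal LM works
tokenizer = AutoTokenizer.from_pretrained(base_model, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(base_model, trust_remote_code=True)

# Train small LoRA adapters instead of updating all of the model's weights.
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32,
                                         lora_dropout=0.05, task_type="CAUSAL_LM"))

# Proprietary corpus, e.g. support tickets or sales notes, one {"text": ...} per line.
data = load_dataset("json", data_files="company_corpus.jsonl")["train"]
data = data.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
                remove_columns=data.column_names)

Trainer(
    model=model,
    args=TrainingArguments(output_dir="arctic-finetune", num_train_epochs=1,
                           per_device_train_batch_size=1),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()
```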

2. A Maverick's Level of Openness and Transparency

Along with the enterprise-customizability of Arctic, something else caught the industry’s imagination with the launch – Snowflake’s commitment to openness!

Along with the Apache 2.0 license, the team also shared details of the three-month-long research effort behind Arctic's development, breaking down the walls set up against transparency and collaboration in enterprise-grade AI technology.

Many AI decision-makers and global enterprises today lean on open-source LLMs for their organization's AI strategy. But Snowflake Arctic is not only about accessibility; it is also about collaboration and compatibility. (Read more on this in the last section.)

3. Cost of Training AI Models Slashed!

Even before the onset of AI and LLMs, cloud computing itself had led to a surge in computing costs in recent years. Now AI development is adding to that spend at a comparable scale.

Massive amounts of data are usually needed to train AI models; without enough of it, these models can produce undesired results. The stakes are even higher with generative AI models, where bad outputs can cause significant harm to enterprises and their reputations.

Snowflake aims to reduce this training cost with Arctic. Built in less than three months, the model incurred significantly lower training costs – almost one-eighth of those of comparable contemporary models.

The model was built on a Mixture-of-Experts (MoE) architecture, which improves not only performance efficiency but also cost effectiveness. Unlike dense models that use every parameter for every token, Arctic activates only a fraction of its parameters during training.
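
To see why that saves compute, here is a toy PyTorch sketch of a top-k gated Mixture-of-Experts layer. It is a generic illustration of the technique, not Arctic's actual implementation: each token is routed to only a couple of experts, so most of the layer's parameters sit idle on any given forward pass.

```python
# Toy Mixture-of-Experts layer with top-2 gating (illustrative, not Arctic's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoE(nn.Module):
    def __init__(self, d_model=64, num_experts=8, top_k=2):
        super().__init__()
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(num_experts))
        self.gate = nn.Linear(d_model, num_experts)  # router
        self.top_k = top_k

    def forward(self, x):                       # x: (tokens, d_model)
        weights, idx = self.gate(x).topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)    # mixing weights per token
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e           # tokens routed to expert e
                if mask.any():                  # only chosen experts do any work
                    out[mask] += weights[mask, k:k + 1] * expert(x[mask])
        return out

print(ToyMoE()(torch.randn(16, 64)).shape)      # torch.Size([16, 64])
```

Because only the selected experts run, total parameter count can grow (more experts) without a matching growth in per-token compute – which is exactly the property that keeps MoE training bills down.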

This means that custom models tuned to individual business specifics can now be trained in a far more affordable way.

As for performance, according to benchmark testing done by Snowflake itself, Arctic already surpasses several industry benchmarks in SQL code generation, instruction following, math, and applying common sense and knowledge.


4. LLM, Data Cloud, Collaboration – All in One Environment

This release of a dedicated LLM by the data cloud vendor is part of a global trend of platform vendors building out AI capabilities of their own. Arctic launched just under a month after Databricks, Snowflake's competitor, released DBRX, an open-source large language model (LLM) also aimed at helping businesses make decisions.

Though most vendors initially partnered with external LLM providers, many are now building their own models. Data cloud vendors strive to give customers a full assortment of tools for managing and analyzing data, including environments for building AI models and applications.

Beyond that, by using an LLM that lives in the same environment as their data, companies no longer have to move data in and out of the Snowflake environment – movement that always carries the risk of a data breach.

Thus, Arctic also provides a significant security advantage for Snowflake customers.

Before Arctic, Snowflake had already partnered with Mistral AI and offered other models such as Llama 2, along with Document AI, through Snowflake Cortex. These services will continue to be provided to customers and won't be discontinued with the launch of Arctic, since some LLMs are better at certain tasks than others.


Snowflake Cortex – Collaboration and Compatibility!!

Developers can quickly deploy and customize Arctic using their preferred frameworks, as Snowflake offers code templates along with flexible inference and training options.

Snowflake Cortex, the company's platform for building and deploying ML and AI apps and solutions, provides Arctic for serverless inference in the Data Cloud. It will also be available on Amazon Web Services (AWS) soon.
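
For Snowflake customers, the quickest way to try this is through Cortex's SQL interface. The sketch below assumes the SNOWFLAKE.CORTEX.COMPLETE function accepts 'snowflake-arctic' as a model name in your account and region, and uses standard snowflake-connector-python calls; the connection parameters are placeholders.

```python
# Hedged sketch: asking Arctic to generate SQL via Snowflake Cortex.
# Assumes SNOWFLAKE.CORTEX.COMPLETE and the 'snowflake-arctic' model are
# available in your account/region; credentials below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="your_warehouse",
)

prompt = ("Write a SQL query that returns monthly revenue per region "
          "from a table ORDERS(order_date, region, amount).")

cur = conn.cursor()
cur.execute("SELECT SNOWFLAKE.CORTEX.COMPLETE('snowflake-arctic', %s)", (prompt,))
print(cur.fetchone()[0])   # the model's generated SQL, returned as text
cur.close()
conn.close()
```

Because the call runs inside Snowflake, the prompt and any data it references never leave the platform – the security point made above.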

The Snowflake Arctic family also includes Arctic embed, a set of state-of-the-art, open-source text embedding models. They are available on Hugging Face for immediate use and will soon be offered through the Snowflake Cortex embed function. These models are said to deliver leading retrieval performance at roughly a third of the size of comparable models, making retrieval-augmented generation (RAG) setups cost-effective for organizations.
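
As a quick illustration of how such an embedding model slots into a retrieval step, here is a small Python sketch using the sentence-transformers library. The model ID 'Snowflake/snowflake-arctic-embed-m' is taken from the launch materials and should be treated as an assumption – check the model card for the current name and any recommended query prefix.

```python
# Hedged sketch: semantic search with an Arctic embed model via sentence-transformers.
# The model ID is an assumption; consult the Hugging Face model card for usage notes.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Snowflake/snowflake-arctic-embed-m")

docs = [
    "Q1 sales in the EMEA region grew 12% quarter over quarter.",
    "The new onboarding flow reduced support tickets by a third.",
]
query = "How did EMEA sales perform last quarter?"

doc_emb = model.encode(docs, normalize_embeddings=True)
query_emb = model.encode(query, normalize_embeddings=True)

scores = util.cos_sim(query_emb, doc_emb)[0]   # cosine similarity to each document
best = int(scores.argmax())
print(docs[best], float(scores[best]))         # the passage a RAG pipeline would retrieve
```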
