Snowflake has released Arctic, an open-source large language model (LLM). The data cloud company added that the LLM is available for immediate use through its Snowflake Cortex platform, in addition to model gardens like Hugging Face, Microsoft Azure and Nvidia’s API catalogue. Arctic supports serverless inference, said Snowflake, and gives developers the ability to build production-grade AI apps at scale on its data cloud. 

“This is a watershed moment for Snowflake, with our AI research team innovating at the forefront of AI,” said Snowflake’s chief executive, Sridhar Ramaswamy. “By delivering industry-leading intelligence and efficiency in a truly open way to the AI community, we are furthering the frontiers of what open-source AI can do. Our research with Arctic will significantly enhance our capability to deliver reliable, efficient AI to our customers.” 

Snowflake has launched a new open-sourced LLM for enterprises named ‘Arctic.’ (Photo by Shutterstock)

Snowflake Arctic model utilises several hundred billion parameters

Trained in less than three months on Amazon’s Elastic Compute Cloud P5 instances, Arctic utilises some 480bn parameters, though Snowflake claims that only 17bn of these are activated at a time during inference. This compares favourably with equivalent models in the market, the firm added, with Arctic activating 50% fewer parameters than DBRX and 75% fewer than Meta’s Llama 3 70B model during training or inference. Arctic also outpaces models such as Mixtral 8x7B in coding, SQL generation and general language understanding, claimed Snowflake. 
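The efficiency claim rests on a Mixture-of-Experts-style design, in which only a subset of parameters runs per token. A minimal sketch of the arithmetic behind the reported figures (the function below is illustrative, not Snowflake’s actual architecture):

```python
# Illustrative only: in a Mixture-of-Experts LLM, a router selects a few
# experts per token, so the "active" parameter count per forward pass is
# far smaller than the model's total parameter count.
def active_fraction(total_params_bn: float, active_params_bn: float) -> float:
    """Share of parameters used in a single forward pass."""
    return active_params_bn / total_params_bn

# Figures reported for Arctic: ~480bn total, ~17bn active at a time.
print(f"{active_fraction(480, 17):.1%}")  # roughly 3.5% of parameters per pass
```

This is why a 480bn-parameter model can be cheaper to run than smaller dense models: compute cost scales with active parameters, not total parameters.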

Arctic’s release follows Snowflake’s debut of its similarly named Arctic embed models, available to developers under an Apache 2.0 license as of last week. “These embedding models are optimised to deliver leading retrieval performance at roughly a third of the size of comparable models,” said the firm, “giving organisations a powerful and cost-effective solution when combining proprietary datasets with LLMs as part of a Retrieval Augmented Generation or semantic search service.”
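In a Retrieval Augmented Generation pipeline of the kind the firm describes, an embedding model maps documents and queries into vectors, and the nearest stored documents are fed to the LLM as context. A minimal sketch of that retrieval step (the vectors and document texts here are hypothetical placeholders, not output of the Arctic embed models):

```python
# Minimal sketch of the retrieval step in Retrieval Augmented Generation.
# In practice the vectors would come from an embedding model; here they
# are hypothetical toy values to show the nearest-neighbour lookup.
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec: list[float], corpus: list[tuple[str, list[float]]], k: int = 1) -> list[str]:
    """Return the k documents whose embeddings are closest to the query."""
    ranked = sorted(corpus, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

corpus = [
    ("Quarterly revenue rose 12%", [0.9, 0.1, 0.0]),
    ("New office opened in Berlin", [0.1, 0.8, 0.2]),
]
# The best-matching document would then be prepended to the LLM prompt.
print(retrieve([0.8, 0.2, 0.1], corpus))
```

The cost argument follows directly: smaller embedding models mean cheaper vector generation and storage for the proprietary datasets being searched.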

New open-source LLM for the market

Snowflake’s strategic partners welcomed the announcement of Arctic. “We are excited to see Snowflake help enterprises harness the power of open source models, as we did with our recent release of Jamba,” said AI21 Labs’ chief executive Yoav Shoham, referring to his own firm’s latest transformer-SSM model. Hugging Face’s CEO, Clement Delangue, was similarly effusive.

“There has been a massive wave of open-source AI in the past few months,” said Delangue. “We’re excited to see Snowflake contributing significantly with this release not only of the model with an Apache 2.0 license but also with details on how it was trained. It gives the necessary transparency and control for enterprises to build AI and for the field as a whole to break new ground.”

Read more: Microsoft announces phi-3-mini compact LLM