
FILE PHOTO: Cloud computing company Snowflake has announced its own large language model, Arctic. | Photo Credit: Reuters
Cloud computing company Snowflake has announced its own large language model, Arctic, today. Touted as an enterprise-grade, open LLM, the model has been trained to take on complex enterprise workloads such as SQL generation, code generation and instruction following.
In a press release announcing the news, the company shared that it used the “mixture-of-experts” architecture to lead at enterprise benchmark tasks. The company claims that Arctic LLM’s performance is close to that of other open models from companies like Meta, Mistral and Databricks at tasks based on common sense, reasoning, general knowledge and maths.
Snowflake will also be releasing Arctic’s weights under an Apache 2.0 license and details of its training process to live up to the open source tag.
The flagship model is part of a family of generative AI models called Arctic, and reportedly took around three months, 1,000 GPUs and $2 million to train. It is also a sign that Snowflake is pushing ahead to compete with rival Databricks’ DBRX, another generative AI model that was released recently.
“This is a watershed moment for Snowflake, with our AI research team innovating at the forefront of AI,” said Sridhar Ramaswamy, CEO, Snowflake. “By delivering industry-leading intelligence and efficiency in a truly open way to the AI community, we are furthering the frontiers of what open source AI can do. Our research with Arctic will significantly enhance our capability to deliver reliable, efficient AI to our customers.”
Snowflake has also pledged to make Arctic LLM available across a range of platforms like Hugging Face, Microsoft Azure, Together AI’s model hosting service and the enterprise generative AI platform Lamini. It will first be available on Cortex, Snowflake’s platform for building and deploying AI apps and services.