This is an AI translated post.
Snowflake Launches 'Arctic,' an Enterprise-Grade LLM with Industry-Leading Openness
- Writing language: Korean
- Base country: All countries
- Category: Information Technology
Summarized by durumis AI
- Snowflake has released 'Arctic,' an open-source large language model (LLM) with best-in-class performance and efficiency.
- Arctic is available for free commercial use under the Apache 2.0 license and supports customization through various framework integrations.
- Through Arctic, Snowflake empowers businesses to leverage their data and create practical AI/machine learning applications, providing the data foundation and cutting-edge AI building blocks needed.
Snowflake adds a state-of-the-art, open-source large language model (LLM) to its Arctic model family, delivering top-tier performance and efficiency in its class.
Based on the Apache 2.0 license, the open-source model offers flexibility and customization with support for various frameworks.
Snowflake, a leading global data cloud company, has launched Snowflake Arctic, an enterprise-grade large language model (LLM) with industry-leading openness and performance.
Designed with Snowflake's unique Mixture-of-Experts (MoE) architecture, Arctic delivers top-tier performance and efficiency in its class. It is optimized for complex enterprise workloads, meeting high standards in areas such as SQL generation, code generation, and instruction following.
Arctic is particularly notable for its Apache 2.0 license, which permits free commercial use. Snowflake has also publicly released details of how the model was trained, setting a new openness standard for enterprise-grade AI. The Arctic LLM is part of the Snowflake Arctic model family, which also includes text embedding models for retrieval use cases.
"Snowflake's AI research team is at the forefront of innovation in AI, marking a pivotal moment for our company," said Sridhar Ramaswamy, CEO of Snowflake. "By releasing open-source AI with industry-leading performance and efficiency to the AI community, Snowflake is expanding the potential of open-source AI. This also elevates Snowflake's AI capabilities, allowing us to provide customers with capable and trustworthy AI models."
◇ Arctic: An open-source LLM that supports broad collaboration
According to a recent report by market research firm Forrester, approximately 46% of AI decision-makers at global companies say they are implementing generative AI by building on existing open-source LLMs as part of their AI strategy. Snowflake's data cloud platform currently serves as the data foundation for more than 9,400 companies and organizations worldwide. These organizations can now put their data to work with an LLM that offers industry-leading openness.
Arctic, an open-source model under the Apache 2.0 license, provides flexibility with code templates and options for both inference and training. Users can deploy and customize Arctic with popular enterprise frameworks, including NVIDIA NIM, NVIDIA TensorRT-LLM, vLLM, and Hugging Face. Arctic is available for immediate use through serverless inference in Snowflake Cortex, Snowflake's fully managed service that provides machine learning and AI solutions on the data cloud. Arctic will also be available through a diverse range of model listings, including Hugging Face, Lamini, Microsoft Azure, the NVIDIA API Catalog, Perplexity, and Together AI, as well as on Amazon Web Services (AWS).
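As a rough illustration of the serverless path above, the snippet below builds the SQL statement that invokes a model through Snowflake Cortex. The `SNOWFLAKE.CORTEX.COMPLETE` function name follows Cortex's documented interface; the model identifier and prompt are illustrative, and actually running the statement would require a live Snowflake session.

```python
# Sketch: constructing a SQL call to Snowflake Cortex's serverless
# COMPLETE function. This only builds the statement string; executing
# it requires a Snowflake connection (not shown).
def cortex_complete_sql(model: str, prompt: str) -> str:
    escaped = prompt.replace("'", "''")  # basic SQL string escaping
    return f"SELECT SNOWFLAKE.CORTEX.COMPLETE('{model}', '{escaped}');"

sql = cortex_complete_sql("snowflake-arctic", "Summarize last quarter's sales")
print(sql)
```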
◇ Excellent resource efficiency and top-tier performance
Snowflake's AI research team, made up of leading researchers and systems engineers, built Arctic in less than three months, training the model on Amazon Elastic Compute Cloud (Amazon EC2) P5 instances at roughly one-eighth the training cost of comparable models. In doing so, Snowflake is setting a new benchmark for how quickly state-of-the-art open enterprise models can be trained, ultimately enabling users to create cost-efficient, custom models at scale.
Arctic's differentiated MoE design improves both the training system and model performance through a meticulously designed data composition tailored to enterprise requirements. Of its 480 billion parameters, Arctic activates only about 17 billion at a time, delivering exceptional token efficiency and industry-leading quality. During inference or training, Arctic activates roughly 50% fewer parameters than DBRX and roughly 75% fewer than Llama 3 70B, yielding significant efficiency gains. It surpasses prominent existing open-source models such as DBRX and Mixtral-8x7B in coding (HumanEval+, MBPP+) and SQL generation (Spider), while exhibiting top-tier performance in general language understanding (MMLU, Massive Multitask Language Understanding).
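The active-parameter comparisons above can be checked with simple arithmetic. The DBRX figure (~36B active of 132B total) and the fact that Llama 3 70B is a dense model (all 70B parameters active) are public figures, not stated in this article:

```python
# Active parameters per token, in billions.
arctic_active = 17.0       # Arctic: ~17B of 480B total (MoE)
dbrx_active = 36.0         # DBRX: ~36B of 132B total (MoE, public figure)
llama3_70b_active = 70.0   # Llama 3 70B: dense, so all parameters active

reduction_vs_dbrx = 1 - arctic_active / dbrx_active
reduction_vs_llama = 1 - arctic_active / llama3_70b_active
print(f"vs DBRX: {reduction_vs_dbrx:.0%} fewer active parameters")      # ~53%
print(f"vs Llama 3 70B: {reduction_vs_llama:.0%} fewer active parameters")  # ~76%
```

The results line up with the article's "about 50% fewer than DBRX" and "about 75% fewer than Llama 3 70B."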
◇ AI innovation for everyone, led by Snowflake
Snowflake provides the data foundation and cutting-edge AI building blocks necessary for companies to create practical AI/machine learning applications leveraging their own data. When customers use Arctic through Snowflake Cortex, it becomes easier to build production-ready AI apps at scale within the security and governance scope of the data cloud.
The Snowflake Arctic model family, which includes the Arctic LLM, also encompasses Arctic embed, a family of state-of-the-art text embedding models Snowflake recently released as open source under the Apache 2.0 license. The family consists of five models that are available directly on Hugging Face and will be offered in private preview within Snowflake Cortex. At roughly one-third the size of comparable models, these embedding models are optimized for top-tier retrieval performance, giving companies an efficient and cost-effective way to combine their proprietary datasets with LLMs as part of a Retrieval-Augmented Generation (RAG) or semantic search service.
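To make the RAG connection concrete, the retrieval step such an embedding model feeds into can be sketched as ranking documents by cosine similarity to the query. The vectors below are made up for illustration; in practice they would be produced by an embedding model such as Arctic embed.

```python
import numpy as np

# Toy retrieval step of a RAG pipeline: rank documents by cosine
# similarity of their embeddings to a query embedding.
def top_k(query: np.ndarray, docs: np.ndarray, k: int = 2) -> list[int]:
    q = query / np.linalg.norm(query)                       # normalize query
    d = docs / np.linalg.norm(docs, axis=1, keepdims=True)  # normalize docs
    scores = d @ q                                          # cosine similarities
    return np.argsort(scores)[::-1][:k].tolist()            # best-first indices

docs = np.array([[1.0, 0.0], [0.7, 0.7], [0.0, 1.0]])
print(top_k(np.array([1.0, 0.1]), docs))  # nearest documents first
```

The top-ranked documents would then be passed to the LLM as context alongside the user's question.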
Snowflake recently added models from Reka and Mistral AI, giving customers access to the latest high-performing LLMs within the data cloud, and announced an expanded partnership with NVIDIA to drive continued AI innovation. Snowflake's data cloud, combined with NVIDIA's full-stack accelerated computing platform, provides secure, powerful infrastructure and compute for harnessing AI productivity across industries. Snowflake Ventures has also recently invested in Landing AI, Mistral AI, and Reka to help customers realize value from LLMs and AI on their own business data.