
This is an AI translated post.

Startup Community SeenThis (SeenThis.kr)

Snowflake Launches 'Arctic,' an Enterprise-Grade LLM with Industry-Leading Openness

Summarized by durumis AI

  • Snowflake has released 'Arctic,' an open-source large language model (LLM) with best-in-class performance and efficiency.
  • Arctic is available for free commercial use under the Apache 2.0 license and supports customization through various framework integrations.
  • Through Arctic, Snowflake empowers businesses to leverage their data and create practical AI/machine learning applications, providing the data foundation and cutting-edge AI building blocks needed.

Snowflake adds a state-of-the-art, open-source large language model (LLM) to its Arctic model family, delivering top-tier performance and efficiency in its class.

Based on the Apache 2.0 license, the open-source model offers flexibility and customization with support for various frameworks.

Snowflake, a leading global data cloud company, has launched Snowflake Arctic, an enterprise-grade large language model (LLM) with industry-leading openness and performance.

Designed with Snowflake's unique Mixture-of-Experts (MoE) approach, Arctic delivers top-tier performance and efficiency in its class. It is optimized for complex enterprise workloads, meeting high standards in areas such as SQL code generation and instruction following.

Arctic is particularly notable for its Apache 2.0 license, which allows free commercial use. Snowflake has also published detailed information about how the model was trained, setting a new openness standard for enterprise-grade AI technology. The Arctic LLM is part of the Snowflake Arctic model family, which also includes text embedding models for retrieval use cases.

"Snowflake's AI research team is at the forefront of innovation in AI, marking a pivotal moment for our company," said Sridhar Ramaswamy, CEO of Snowflake. "By releasing open-source AI with industry-leading performance and efficiency to the AI community, Snowflake is expanding the potential of open-source AI. This also elevates Snowflake's AI capabilities, allowing us to provide customers with capable and trustworthy AI models."

◇ Arctic: An open-source LLM that supports broad collaboration

According to a recent report by Forrester, a market research firm, approximately 46% of AI decision-makers at global companies said they are using existing open-source LLMs to implement generative AI within their organizations as part of their AI strategy. Snowflake's data cloud platform is currently used as a foundation for data by over 9,400 companies and organizations worldwide. These organizations now have access to data through LLMs that offer the highest levels of openness in the industry.

Arctic, an open-source model under the Apache 2.0 license, provides flexible code templates and options for both inference and training. Users can deploy and customize Arctic with popular enterprise frameworks, including NVIDIA NIM, NVIDIA TensorRT-LLM, vLLM, and Hugging Face. Arctic is available for immediate use through serverless inference in Snowflake Cortex, Snowflake's fully managed service for machine learning and AI on the data cloud. Arctic will also be accessible through a range of model catalogs, including Hugging Face, Lamini, Microsoft Azure, the NVIDIA API Catalog, Perplexity, and Together AI, as well as Amazon Web Services (AWS).
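As an illustration of the Hugging Face route mentioned above, the sketch below loads an Arctic instruct checkpoint with the transformers library. The repository ID Snowflake/snowflake-arctic-instruct and the generation settings are assumptions rather than details from this article, and running the full 480B-parameter model in practice calls for a multi-GPU node and an optimized runtime such as vLLM or TensorRT-LLM rather than a single device.

```python
# Minimal sketch: loading an Arctic instruct checkpoint from Hugging Face with
# transformers. Model ID and settings are assumptions; the full model is far too
# large for a single GPU and is normally served through vLLM, TensorRT-LLM, or NIM.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Snowflake/snowflake-arctic-instruct"  # assumed Hugging Face repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",       # shard layers across all available GPUs
    trust_remote_code=True,  # Arctic ships custom MoE modeling code
)

prompt = "Write a SQL query that returns the top 5 customers by total revenue."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```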

◇ Excellent resource efficiency and top-tier performance

Snowflake's AI research team, made up of leading researchers and systems engineers, built Arctic in less than three months. The model was trained on Amazon Elastic Compute Cloud (Amazon EC2) P5 instances at roughly one-eighth the training cost of comparable models. Snowflake is setting a new benchmark for how quickly state-of-the-art open enterprise models can be trained, ultimately enabling users to create cost-effective, customized models at optimal scales.

Arctic's differentiated MoE design improves both the training system and model performance through a meticulously designed data composition tailored to enterprise requirements. Of its 480 billion parameters, Arctic activates only about 17 billion at a time, delivering exceptional token efficiency and industry-leading quality. It activates roughly 50% fewer parameters than DBRX and roughly 75% fewer than Llama 3 70B, yielding significant efficiency gains during both inference and training. It surpasses prominent open-source models such as DBRX and Mixtral-8x7B in coding (HumanEval+, MBPP+) and SQL generation (Spider), while also exhibiting top-tier performance in general language understanding (MMLU, Massive Multitask Language Understanding).
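To make the MoE idea concrete, the toy layer below (an illustrative sketch, not Arctic's actual implementation) shows how a learned router selects only the top-k experts per token, which is how a model can hold hundreds of billions of parameters while activating only a small fraction of them for any given input.

```python
# Toy Mixture-of-Experts layer: a router scores experts per token and only the
# top-k experts run, so active parameters are a small slice of total parameters.
import torch
import torch.nn as nn

class ToyMoELayer(nn.Module):
    def __init__(self, d_model=64, n_experts=8, top_k=2):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        ])
        self.top_k = top_k

    def forward(self, x):                        # x: (tokens, d_model)
        scores = self.router(x)                  # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = weights.softmax(dim=-1)        # mixing weights for chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):           # only selected experts execute
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

layer = ToyMoELayer()
tokens = torch.randn(10, 64)
print(layer(tokens).shape)  # torch.Size([10, 64])
```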

◇ AI innovation for everyone, led by Snowflake

Snowflake provides the data foundation and cutting-edge AI building blocks necessary for companies to create practical AI/machine learning applications leveraging their own data. When customers use Arctic through Snowflake Cortex, it becomes easier to build production-ready AI apps at scale within the security and governance scope of the data cloud.
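As a sketch of what that can look like in practice, the snippet below calls Arctic through Cortex's serverless COMPLETE function from Python. The connection parameters are placeholders, and the model identifier 'snowflake-arctic' and regional availability of the function should be verified against your own Snowflake account.

```python
# Minimal sketch: calling Arctic via Snowflake Cortex serverless inference with
# the snowflake-connector-python package. Connection details are placeholders and
# the model name passed to SNOWFLAKE.CORTEX.COMPLETE is an assumption.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder account locator
    user="my_user",
    password="my_password",
    warehouse="my_wh",
)

prompt = "Summarize last quarter's sales trends in two sentences."
cur = conn.cursor()
cur.execute(
    "SELECT SNOWFLAKE.CORTEX.COMPLETE('snowflake-arctic', %s)",
    (prompt,),
)
print(cur.fetchone()[0])
cur.close()
conn.close()
```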

The Snowflake Arctic model family, which includes the Arctic LLM, also encompasses Arctic embed, a set of state-of-the-art text embedding models recently released by Snowflake under the Apache 2.0 license and free to use within the open-source community. The five embedding models are available directly on Hugging Face and will be offered in private preview as part of Snowflake Cortex. At roughly one-third the size of comparable models, they are optimized for top-tier retrieval performance, providing an efficient and cost-effective option for companies that want to combine their own datasets with LLMs in Retrieval-Augmented Generation (RAG) or semantic search services.
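For illustration, here is a minimal semantic-search sketch with an Arctic embed model. The sentence-transformers package and the Hugging Face model ID Snowflake/snowflake-arctic-embed-m are assumptions not stated in this article, and a production RAG pipeline would add document chunking, a vector index, and an LLM generation step on top of this retrieval loop.

```python
# Minimal sketch: ranking documents for retrieval with an Arctic embed model.
# Model ID and library choice are assumptions; embeddings are L2-normalized so
# cosine similarity can be used directly for ranking.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Snowflake/snowflake-arctic-embed-m")  # assumed repo ID

documents = [
    "Snowflake Arctic is an open-source enterprise LLM under Apache 2.0.",
    "Retrieval-Augmented Generation pairs a retriever with a generator model.",
    "The data cloud lets teams share governed datasets securely.",
]
doc_embeddings = model.encode(documents, normalize_embeddings=True)

query = "Which model is licensed under Apache 2.0?"
query_embedding = model.encode(query, normalize_embeddings=True)

scores = util.cos_sim(query_embedding, doc_embeddings)[0]  # similarity per document
best = int(scores.argmax())
print(documents[best], float(scores[best]))
```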

Snowflake has also recently added models from Reka and Mistral AI, ensuring that customers have access to the latest, highest-performing LLMs within the data cloud, and announced an expanded partnership with NVIDIA to drive continued AI innovation. Snowflake's data cloud, combined with NVIDIA's full-stack accelerated computing platform, provides the secure, powerful infrastructure and compute needed to put AI productivity to work across industries. Snowflake Ventures has also recently invested in Landing AI, Mistral AI, and Reka to help customers realize value from LLMs and AI on their own business data.


