
Photo/Zheng Yuhang (NBD)

NO.1 Meta to Launch Llama 3 Model Next Month

Meta announced that it plans to release its next-generation large model, Llama 3, next month. Versions with different performance levels will be rolled out gradually over the course of the year to power various Meta products.

Commentary: The release of the new model will further strengthen Meta’s AI capabilities as it competes with tech giants such as Microsoft and Google.

NO.2 Intel Releases New AI Chip

Intel is set to launch a new version of its AI chip, Gaudi 3, which will be fully available in the third quarter. The chip is designed both to train AI systems and to run the finished software. According to Intel’s own assessment, Gaudi 3 will be faster and more energy-efficient than Nvidia’s H100 chip.

Commentary: Intel hopes to challenge Nvidia’s dominance in AI chips, which should stimulate competition in the market.

NO.3 Microsoft Strengthens AI Business in Japan

Microsoft will invest approximately $2.9 billion in Japan over the next two years. This is the company’s largest investment in Japan to date, aimed at strengthening infrastructure for generative AI, such as data centers.

Commentary: Microsoft’s increased investment in Japan’s AI business will accelerate the development of AI technology in Japan and help the company capture a share of the Japanese AI market.

NO.4 Google Launches New Products at Annual Cloud Computing Conference

Google launched a series of AI products at its annual cloud computing conference, including Google Vids, an AI-driven video creation tool; Gemini in Databases, which simplifies database operations; Gemini Code Assist, an enterprise-oriented AI code completion and assistance tool; and Imagen 2, an enhanced image generation tool.

Commentary: Google’s intensive product releases signal all-out competition with Microsoft in the AI field.

NO.5 Musk: Training Grok 3 Requires 100,000 Chips

Elon Musk recently stated that training the next-generation AI chatbot, Grok 3, will require 100,000 Nvidia H100 GPUs, while the current Grok 2 uses about 20,000 H100 chips. Musk revealed earlier this year that Tesla would spend more than $500 million on Nvidia AI chips this year alone.

Commentary: Musk’s statement shows that demand for computing power in AI model training remains enormous.

Disclaimer: The content and data in this article are for reference only and do not constitute investment advice. 


Editor: Alexander