Meta accelerates innovation strategy through the release of its own AI chips
Meta recently unveiled its self-developed artificial intelligence (AI) chips, dispelling rumors of development setbacks and signaling a new strategy. The tech industry had previously reported that Meta ran into difficulties developing its planned AI training chips, leading to the cancellation of some projects. Meta now appears to be securing external capacity through large-scale AI chip supply agreements with NVIDIA, AMD, and Google while accelerating its in-house chip development.
On the same day, Meta announced four models in the “Meta Training and Inference Accelerator” (MTIA) chip series: MTIA 300, 400, 450, and 500. The MTIA 300 has already entered production, and the remaining models are expected to be phased into Meta’s data centers before next year. These chips are primarily designed to support content recommendation and generative AI models for Meta’s core platforms like Facebook and Instagram.
Meta is simultaneously adopting a strategy of using external high-performance AI chips alongside internally developed ones, aiming to optimize performance for different functions. Meta stated that, unlike external chips suited for AI training, its self-developed chips mainly focus on AI inference tasks to improve cost efficiency. Notably, the chips unveiled feature high-bandwidth memory (HBM) to enhance data processing speed.
Meta also recognizes that the rapid pace of AI development makes traditional chip development cycles hard to keep up with, so it has opted for shorter development cycles and continuous iteration. However, the global shortage of memory chips could pose a challenge: while Meta says it has secured the memory needed for upcoming production, how long the supply constraints will persist remains uncertain.
This trend may further complicate the competitive and cooperative landscape of the AI industry, and it will be worth watching what strategies Meta adopts to advance its AI technology. Amid intense near-term competition and rapid technological change, Meta's choices will also have a significant impact on the global AI market.