Microsoft and OpenAI are reportedly moving forward with plans for a groundbreaking data center project that will include an AI supercomputer named Stargate.
According to a report by Anissa Gardizy and Amir Efrati at The Information, Microsoft would reportedly provide more than $100 billion in funding for the project, which has a launch date set for 2028. Its goal is said to be reducing the companies' dependence on Nvidia, a challenge that major AI companies are increasingly taking on.
Microsoft and OpenAI's plan includes five phases, with Stargate reportedly being the fifth and most ambitious phase.
Data centers become supercomputers
The project's cost is attributed to the usual “sources familiar with the plans” (according to The Information, these include someone who has spoken with OpenAI CEO Sam Altman about the project, as well as sources at Microsoft). However, neither Microsoft nor OpenAI has yet commented on the details of the project.
The new data center project is expected to push the limits of AI capabilities and could cost more than $115 billion. That is more than three times the amount Microsoft spent on capital expenditures for servers and equipment last year. Microsoft is currently working on a smaller, phase-four supercomputer for OpenAI that is expected to launch around 2026, The Information claims.
Shedding more light on the report, The Next Platform writes: “The first thing to note about the rumored ‘Stargate’ system that Microsoft plans to build to support the computational needs of its large language model partner OpenAI is that the people doing the talking, presumably OpenAI CEO Sam Altman, are talking about data centers, not supercomputers. That is because the data center, which could have as many as a million XPU compute devices and may span multiple data centers in a region, will be the supercomputer.”
The Next Platform also says that if Stargate ever comes to fruition, it would be “based on future-generation Cobalt Arm server processors and Maia XPUs, with Ethernet scaling from hundreds of thousands to 1 million XPUs on a single machine,” and not on Nvidia GPUs and interconnects, which, if this rumor is to be believed, seems like a safe bet.