Mexico City, 07:11 (UTC-6), January 31, 2026
Nvidia’s plan to pour up to $100 billion into OpenAI has hit a snag, according to The Wall Street Journal on Friday. Some people inside the chip giant have raised doubts about the deal. “We have been OpenAI’s preferred partner for the last 10 years. We look forward to continuing to work together,” an Nvidia spokesperson said. OpenAI did not immediately respond to requests for comment. (Reuters)
With the Nvidia deal on pause, OpenAI is seeking new funding to cover the cost of data centers, the massive server facilities that power AI training and operations. At the same time, big technology companies are jockeying to secure control over its AI workloads. Reuters reported Thursday that Amazon is in early talks to invest up to $50 billion, with Amazon CEO Andy Jassy in discussions with OpenAI CEO Sam Altman. (Reuters)
Reuters, citing The Information, reported that Nvidia, Amazon, and Microsoft are in talks over investments that could total $60 billion, with Nvidia possibly contributing up to $30 billion and Microsoft less than $10 billion. Amazon’s involvement might be linked to separate discussions on cloud server rentals and on offering OpenAI products such as enterprise ChatGPT subscriptions. (Reuters)
The Journal reported that Nvidia and OpenAI are reconsidering their partnership and now eyeing a smaller equity investment in the tens of billions of dollars rather than the full $100 billion. Nvidia CEO Jensen Huang has privately described the initial deal as non-binding and flagged concerns about competition from Google, Anthropic, and others. (The Wall Street Journal)
Bloomberg News, citing the Journal, reported that the talks have collapsed, underscoring how quickly sentiment can shift in AI megadeals. OpenAI is the company behind ChatGPT, and Nvidia leads the market for the processors that power such AI systems. (Bloomberg)
Amazon is reportedly weighing a broader partnership that would integrate OpenAI’s models into its products and platforms and give its employees access to them for their work, Bloomberg reported, citing a person familiar with the matter. Such a deal could build on the existing arrangement under which Amazon supplies computing power to OpenAI, the report added. (Bloomberg)
In September, Nvidia and OpenAI announced they had signed a letter of intent, a non-binding agreement, to deploy at least 10 gigawatts of compute, the chips and servers that run AI systems. Gil Luria of D.A. Davidson flagged the risk that Nvidia could end up as an “investor of last resort,” while Kim Forrest of Bokeh Capital said, “This sounds like Nvidia is investing in its largest customer.” (Reuters)
OpenAI is also hunting for more computing power: it has reportedly struck a deal worth more than $10 billion over three years with Nvidia rival Cerebras for up to 750 megawatts of capacity, Reuters reported in mid-January. OpenAI said adding Cerebras to its compute lineup is aimed at speeding up AI responses, specifically for inference, the process of running trained models to answer queries. (Reuters)
Key terms remain in flux, and Reuters couldn’t immediately confirm the Journal’s story. Should Nvidia’s involvement decrease, OpenAI will probably rely more heavily on a mix of chip vendors and cloud providers to meet the capacity demands of its upcoming models. (Investing)