The past 24 hours have made it clear that the AI landscape is shifting. The competition is no longer simply about building the best models; it now revolves around controlling gateways, allocating power budgets, and mastering the full stack. Platforms are drawing new boundaries, compute is planned in gigawatts, and big-tech org charts are converging around model + silicon + systems.

Meta's WhatsApp Conundrum

In a move that has raised eyebrows, Meta plans to restrict third-party general-purpose AI assistants (think ChatGPT and Copilot) from delivering core AI functions via the WhatsApp Business API starting January 15, 2026. The policy will allow only "assistive" use cases such as order lookups or flight alerts, while Meta aggressively promotes its own Meta AI. The stated reasons are "stability" and "API intent," but strategically it looks like a deliberate land-grab for the highest-value conversational surface. The EU is already examining whether Meta is using its control over WhatsApp chatbots and the Business API to shut out third-party AI providers.
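At its core, the "assistive" vs. general-purpose distinction is a policy gate on message intent. A minimal sketch of how a gateway might enforce such a rule — every name here (`ALLOWED_INTENTS`, `classify_intent`, `gate`) and the keyword heuristic are hypothetical illustrations, not part of any real WhatsApp API:

```python
# Hypothetical sketch of an "assistive-only" policy gate: transactional
# intents pass through to third-party bots, open-ended AI chat is rejected.
# All names and the keyword heuristic below are illustrative assumptions.

ALLOWED_INTENTS = {"order_lookup", "flight_alert", "delivery_status"}

def classify_intent(message: str) -> str:
    """Toy intent classifier; a real gateway would use a trained model."""
    text = message.lower()
    if "order" in text:
        return "order_lookup"
    if "flight" in text:
        return "flight_alert"
    if "delivery" in text or "package" in text:
        return "delivery_status"
    return "general_ai_chat"  # anything open-ended falls through here

def gate(message: str) -> bool:
    """Return True if the message may be handled by a third-party bot."""
    return classify_intent(message) in ALLOWED_INTENTS

print(gate("Where is my order #1234?"))   # True: assistive use case
print(gate("Write me a poem about AI"))   # False: general-purpose AI
```

The strategic point is that whoever controls this gate decides which conversations third parties are allowed to have at all.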

OpenAI's Compute Conquest

In another significant development, OpenAI has expanded its partnership with AMD in a deal that could deliver up to 6 gigawatts of AMD compute capacity. The signal: frontier AI planning is now constrained by power, not just GPU supply. For OpenAI, diversifying away from a single NVIDIA path is about lead times, allocation risk, and price. If AMD can ship reliably and offer a usable software stack with real engineering support, OpenAI gains negotiating leverage on cost, schedule, and supply certainty.

NVIDIA's AI Infrastructure

Meanwhile, NVIDIA has partnered with the U.S. DOE on "Genesis Mission AI," aiming to strengthen U.S. leadership through AI infrastructure and R&D across nuclear, quantum, biology, and materials science. The five-year plan exceeds $120B, with roughly 60% allocated to compute buildout and model R&D. NVIDIA is set to power at least seven next-generation AI supercomputers, cementing its central role in the buildout.

Amazon's AGI Reorg

Amazon's AGI VP Rohit Prasad is leaving at year-end, with reinforcement learning expert Pieter Abbeel taking over as Amazon reorganizes its AGI group and merges it with chip R&D and quantum teams. The reorg suggests a shift from a "model lab" posture toward systems and agents: more emphasis on infrastructure economics, integration, and agentic workflows than on raw model capability alone.

The Future of AI

As the landscape continues to evolve, one thing is clear: the winners will be those who can iterate chips, racks, networking, and scheduling alongside model strategy. With the focus shifting from models to gateways, compute, and systems, the stakes have never been higher. Who will come out on top in this new era of AI competition?