Weekly Intel - 2026-05-03

The theme this week is alignment, or rather, the speed at which old allegiances are being abandoned for new ones: OPEC membership, corporate AI budgets, platform exclusivity, Pentagon ethics guardrails, and foreign ownership limits all shifted at once.

Energy & Transportation

UAE to leave OPEC The article content is paywalled, but the headline alone signals a structural fracture in the oil cartel that has coordinated global supply for decades. The UAE has long chafed at production quotas that constrain its capacity, and a formal exit would accelerate the unraveling of OPEC’s pricing power, potentially pushing oil prices lower and making supply forecasts far less predictable.

Belgium stops decommissioning nuclear power plants Belgium is nationalizing its entire nuclear fleet (all seven reactors, personnel, subsidiaries, and liabilities), reversing a two-decade phase-out policy. Prime Minister De Wever framed it as energy sovereignty; ENGIE signed a letter of intent for exclusive negotiations with a basic agreement expected by October.

AI Industry Moves

Uber torches 2026 AI budget on Claude Code in four months Uber burned through its entire 2026 AI budget by April, not because the tools failed, but because 95% of its engineers adopted them so aggressively that per-person API costs hit $500-$2,000 per month. This is the new shape of the AI cost problem: the ROI is obvious enough that usage explodes past any budget ceiling you set. Every CTO now faces the same question: how do you govern consumption when the tool is genuinely making people faster and they won’t stop using it?
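
The budget math is worth making explicit. A minimal sketch using the figures cited above (95% adoption, $500-$2,000 per engineer per month); the headcount is an illustrative placeholder, not Uber's actual number:

```python
# Rough monthly-spend projection for per-seat AI tooling.
# Adoption rate and per-person cost band come from the article above;
# the engineer headcount is a made-up example.

def projected_monthly_spend(engineers: int, adoption: float,
                            low: float, high: float) -> tuple[float, float]:
    """Return a (best-case, worst-case) monthly API spend in dollars."""
    active_users = engineers * adoption
    return active_users * low, active_users * high

lo, hi = projected_monthly_spend(engineers=5000, adoption=0.95,
                                 low=500.0, high=2000.0)
print(f"${lo:,.0f} - ${hi:,.0f} per month")
# → $2,375,000 - $9,500,000 per month
```

At a hypothetical 5,000 engineers, that band is $2.4M-$9.5M per month, which is how an annual budget sized for pilots disappears in a single quarter.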

OpenAI models coming to Amazon Bedrock: Interview with OpenAI and AWS CEOs OpenAI is now free to distribute its models beyond Azure, and AWS Bedrock is the first major beneficiary. The restructured Microsoft-OpenAI deal (Microsoft keeps a non-exclusive license through 2032, stops paying revenue share, and retains first-ship rights only where it can actually deliver the infrastructure) is less a breakup than a mutual acknowledgment that OpenAI’s ambitions outgrew a single cloud. The question for enterprise buyers: does model access becoming commoditized across clouds shift leverage permanently toward the infrastructure providers, or does OpenAI’s direct presence on Bedrock make it harder for AWS to differentiate on anything but price and tooling?

Google and Pentagon reportedly agree on deal for ‘any lawful’ use of AI Google has reportedly signed a classified agreement letting the Department of Defense use its AI models for “any lawful government purpose,” one day after employees publicly demanded Sundar Pichai block exactly that. The deal puts Google alongside OpenAI and xAI in the defense AI lane, while Anthropic got blacklisted for refusing to remove weapon and surveillance guardrails.

Mistral Medium 3.5 Mistral just shipped a 128B dense model under open weights alongside cloud-based coding agents that run asynchronously and in parallel, no laptop required. The model is designed for sustained coding and productivity work, runs self-hosted on as few as four GPUs, and powers a new “Work mode” agent capable of multi-step research and cross-tool actions. What’s actually changing is the interaction model: from AI as a tool you actively operate to AI as a worker you dispatch and check back on.

FCC Funding Application Notes Paramount Will Be 49.5% Foreign-Owned Post-Merger The Paramount-WBD merger will leave the combined entity nearly half foreign-owned, with $24 billion from Saudi, Qatari, and Abu Dhabi sovereign funds making up the bulk of that stake. Paramount is framing these as passive, non-voting investors, a legal distinction the FCC will now have to weigh, but the practical reality is that a major U.S. broadcast and streaming company will depend on Gulf state capital for its competitive footing.

Who owns the code Claude Code wrote? The ownership question around AI-generated code hits three overlapping risks at once: the output may not be copyrightable at all, your employment agreement likely claims it regardless, and the training data may have introduced open source license obligations you never agreed to. Most companies shipping AI-assisted code haven’t updated their IP policies, contributor agreements, or documentation practices to reflect any of this. The practical question is whether the code in your product right now would survive a licensing dispute or an IP due diligence process during acquisition.

AI & Software

GitHub Copilot is moving to usage-based billing GitHub is shifting all Copilot plans from flat-rate subscriptions to token-based consumption pricing starting June 1, 2026, using “GitHub AI Credits” tied to actual model usage including input, output, and cached tokens. This is the infrastructure pricing model catching up to the product reality: Copilot has become an agentic platform consuming wildly variable compute per user, and flat per-seat pricing was never going to survive that shift. The question for any org with hundreds or thousands of Copilot seats: do you have any visibility into how much AI your developers actually consume today, and are you ready for a world where your dev tooling bill becomes as unpredictable as your cloud compute bill was circa 2015?
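
To see why per-seat bills become unpredictable, here is a minimal cost model under token-based pricing. The per-category rates are invented for illustration (GitHub has not published the credit conversion here); what matters is the spread between usage profiles:

```python
# Hypothetical token-based billing model: dollars per 1M tokens,
# with separate rates for input, output, and cached tokens.
# All rates are made-up placeholders, not GitHub's actual pricing.
RATES = {
    "input": 3.00,
    "output": 15.00,
    "cached": 0.30,
}

def monthly_cost(tokens: dict[str, int]) -> float:
    """Estimate a month's dollar cost from per-category token counts."""
    return sum(RATES[kind] * count / 1_000_000
               for kind, count in tokens.items())

# One heavy agentic user vs. one autocomplete-only user (illustrative):
agentic = {"input": 40_000_000, "output": 8_000_000, "cached": 120_000_000}
light   = {"input": 2_000_000,  "output": 400_000,   "cached": 1_000_000}

print(f"agentic: ${monthly_cost(agentic):.2f}")  # → agentic: $276.00
print(f"light:   ${monthly_cost(light):.2f}")    # → light:   $12.30
```

A 20x-plus spread between users on identical seats is exactly the variance that flat per-seat pricing was hiding, and exactly what finance teams will now see line by line.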

How ChatGPT serves ads OpenAI’s ad infrastructure is now visible in the wild, and it’s more sophisticated than most expected. Structured ad objects get injected directly into ChatGPT’s server-sent event stream mid-conversation, while a merchant-side tracking SDK called OAIQ closes the attribution loop by reporting product views back to OpenAI, the two halves connected by encrypted click tokens. They’ve built the plumbing for a business model that competes directly with Google’s intent-based advertising, except the “search results page” is now a dialogue rather than a search.
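
The mechanism itself is ordinary server-sent events (SSE) with an extra event type. A sketch of how a structured ad object rides the same stream as chat tokens; the event names and JSON shapes below are hypothetical, not OpenAI's actual wire format:

```python
# Parse a raw SSE stream and separate chat deltas from injected ad
# objects. The payload schemas are invented for illustration; the
# event/data framing is standard SSE.
import json

raw_stream = (
    'event: message\ndata: {"delta": "Here are some running shoes"}\n\n'
    'event: ad\ndata: {"sku": "shoe-123", "click_token": "enc:abc"}\n\n'
    'event: message\ndata: {"delta": " that fit your budget."}\n\n'
)

def parse_sse(stream: str):
    """Yield (event_name, payload) pairs from a raw SSE string."""
    for block in stream.strip().split("\n\n"):
        event, data = "message", None
        for line in block.splitlines():
            if line.startswith("event: "):
                event = line[len("event: "):]
            elif line.startswith("data: "):
                data = json.loads(line[len("data: "):])
        yield event, data

text, ads = [], []
for event, payload in parse_sse(raw_stream):
    (ads if event == "ad" else text).append(payload)

print("".join(p["delta"] for p in text))
# → Here are some running shoes that fit your budget.
print(ads)
# → [{'sku': 'shoe-123', 'click_token': 'enc:abc'}]
```

Because the ad arrives as its own typed event rather than inline text, the client can render it as a distinct unit and attach the click token for attribution, which is what makes the merchant-side tracking loop possible.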


That’s what I’m watching. What caught your attention this week?

-Eric

