A few dents aside, AI is still surging


Welcome to Runtime! Today: another batch of earnings reports shows that enterprises are still investing heavily in the raw materials for generative AI, Microsoft takes a careful step away from its dependence on OpenAI's models, and the latest enterprise moves.

(Was this email forwarded to you? Sign up here to get Runtime each week.)


Number (still) go up

The first half of 2025 contained plenty of signs that the pace of AI innovation is slowing down, from OpenAI's late and underwhelming GPT-5 launch to the fact that most companies are still trying to figure out how to make generative AI work as part of their tech strategy. It's clear, however, that we're also still in the middle of a once-in-a-generation infrastructure overhaul that those companies hope will be the foundation for realizing the promise of large language models.

Several picks-and-shovels AI suppliers reported earnings this week that show there's still a lot of demand for the tools that businesses need to build AI apps. After MongoDB raised its revenue guidance for the year on Tuesday, Snowflake and Nvidia released earnings results on Wednesday that exceeded expectations.

Nvidia, of course, will go down in history as the big winner of the generative AI boom, going from second-quarter revenue of $6.7 billion in 2022, months before the launch of ChatGPT, to an astonishing $46.7 billion in its most recent quarter. After that run the GPU giant is no longer posting triple-digit percentage increases in revenue, but customers are clearly snapping up its latest Blackwell chips.

  • The overwhelming majority of Nvidia's revenue now comes from data-center customers, who spent $41.1 billion on AI and networking chips during the quarter.
  • That's up 56% compared to a year ago, although just shy of what analysts were predicting, according to CNBC.
  • Around half of Nvidia's data-center revenue comes from cloud providers, and Nvidia CFO Colette Kress told analysts that she expects "$3 trillion to $4 trillion in AI infrastructure spending" by 2030, according to Reuters.

At some point, however, businesses will need to see returns from their investments in data tools and computing infrastructure to make that prediction come true. MIT's recent report on the failure of most enterprise generative AI apps got a ton of attention, and next week we might learn how much longer it will take Salesforce to generate meaningful revenue from its agentic AI push after it tempered expectations earlier this year.

  • Snowflake CEO Sridhar Ramaswamy acknowledged that while the coding benefits of LLMs are now well understood, most other parts of a modern business, such as processing claims or producing regulatory reports, are still struggling to see the benefits of generative AI.
  • "All of those are areas where the application of data and AI is very much in its infancy. So I think there is honestly years of work ahead in terms of the value that we can get from AI," he said.
  • Right now when it comes to AI, most businesses are acting like novice golfers who just shelled out thousands of dollars on equipment without knowing how to swing a club.
  • They'll probably get better with practice, but they're not going to be spending like that every year.

A quick programming note: Runtime will be out of the country for the next two weeks and will return on Tuesday, September 16th. Thanks for your support!


MAI way

As Microsoft continues to consciously uncouple from OpenAI, it has put more effort into developing its own set of large language models. On Thursday it introduced a new model that could underpin its future AI products and services should its exclusive deal with OpenAI come to an end.

MAI-1-preview (short for Microsoft AI) is the company's "first foundation model trained end-to-end and offers a glimpse of future offerings inside Copilot," it said in a blog post. Alongside the new foundation model, it also released MAI-Voice-1, a speech-generation model that will allow developers to generate audio snippets based on text prompts.

Microsoft said it had begun testing MAI-1-preview on LMArena, where developers and enthusiasts compare the performance of various models, and that it would release more details about it soon. "This model is designed to provide powerful capabilities to consumers seeking to benefit from models that specialize in following instructions and providing helpful responses to everyday queries," it said, which means Microsoft and OpenAI are now officially frenemies.


Enterprise moves

Mike Price is the new chief revenue officer at DTEX, joining the insider-risk management company after sales leadership roles at Ushur, ForgeRock, and Oracle.

Reed Birnbaum is the new chief financial officer at DataHub, joining the data catalog company after financial leadership roles at Sirona Medical and RMS.


The Runtime roundup

Dell beat Wall Street's estimates for revenue and profit, citing increased sales of AI servers, but the after-hours crowd didn't care for its third-quarter earnings projection.

TransUnion acknowledged a data breach involving the personal information of 4.4 million people; at this rate, folks are really starting to stack up those free months of credit-monitoring services.


Thanks for reading — see you in September!
