Welcome to Runtime! Today: How AWS is trying to take advantage of a generational lead in cloud computing in the agentic AI era, Google Cloud snaps up a piece of business AWS would have very much liked for itself, and the latest enterprise moves.
(Was this email forwarded to you? Sign up here to get Runtime each week.)
Disruptors get disrupted
AWS is finally ready to acknowledge that the launch of ChatGPT caught it flat-footed in late 2022 and 2023: "We didn’t have some of the whiz-bangy things that you could get out there quickly," CEO Matt Garman told Fortune this week during the company's New York Summit. While trying to buy time, his predecessor argued over and over again that the generative AI era was just getting started, and that any first-mover advantage would prove moot in, as AWS executives like to say, "the fullness of time."
AgentCore, introduced at the event, consists of several tools designed to make deploying agents easier: a serverless runtime to execute the agent, memory to help agents learn from past events, and observability to make sure everything is running smoothly.
"With agents comes a shift to service as a software. This is a tectonic change in how software is built, deployed and operated,” Sivasubramanian said during the event, according to VentureBeat.
But agents will still have to work with everything that came before them, and AgentCore also includes identity-management technology for making sure agents can securely access corporate assets and play nicely with existing APIs.
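To make that bundle a little more concrete, here is a minimal, hypothetical sketch of the pattern AgentCore is packaging up: a serverless-style entrypoint that handles one task per invocation, a memory store the agent can learn from, logging as a stand-in for observability, and an identity check before the agent touches anything corporate. None of the names below come from AWS's actual SDK; they are placeholders meant only to illustrate the shape of the workflow.

```python
# Hypothetical illustration of the agent-deployment pattern described above.
# These classes and functions are placeholders, not AWS's AgentCore API.

from dataclasses import dataclass, field
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("agent")  # stands in for the observability piece


@dataclass
class MemoryStore:
    """Keeps past interactions so the agent can reuse them later."""
    events: list = field(default_factory=list)

    def remember(self, event: dict) -> None:
        self.events.append(event)

    def recall(self, limit: int = 5) -> list:
        return self.events[-limit:]


def check_identity(token: str) -> bool:
    """Placeholder identity-management step: confirm the agent may call a
    given corporate API before it does so."""
    return token == "valid-token"  # a real system would verify a signed credential


def handle_request(task: str, token: str, memory: MemoryStore) -> str:
    """The 'serverless runtime' entrypoint: one invocation per task."""
    start = time.time()
    if not check_identity(token):
        log.warning("identity check failed for task %r", task)
        return "access denied"

    context = memory.recall()          # learn from past events
    result = f"handled {task!r} with {len(context)} prior events as context"
    memory.remember({"task": task, "result": result})

    log.info("task %r finished in %.3fs", task, time.time() - start)
    return result


if __name__ == "__main__":
    memory = MemoryStore()
    print(handle_request("summarize Q2 tickets", "valid-token", memory))
    print(handle_request("draft follow-up email", "valid-token", memory))
```

The point of the sketch is just the division of labor; in AgentCore, each of those pieces is pitched as a managed service rather than something teams wire up themselves.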
AWS has one key advantage over the dozens of enterprise software companies trying to own the agentic era: a huge number of businesses already use its cloud platform in one fashion or another. But agentic AI is a new challenge.
A decade ago AWS was the place where startups and new business ventures were built; today it is trying very hard to move its existing customer base into a new era, the same challenge older enterprise tech companies like Oracle, IBM, and HPE have been wrestling with for a long time.
While it's never been easier to mix and match services from different cloud providers in an application, it's not exactly easy, and there are a lot of companies that would be content to build their agentic AI apps in the same place they developed and currently run their traditional apps.
And now that there are several different options when it comes to competitive foundation models (unlike the scene in late 2022) and the OpenAI-Microsoft relationship has never been weaker, there's a chance AWS could strike a deal to put future versions of OpenAI's models in Bedrock.
Platform shifts are tricky, and they tend to produce winners born of the new era, free from the baggage of the previous one. One day after its New York event, AWS laid off "hundreds" of people Thursday, even as it tries to fashion a new set of building blocks for enterprise software.
This shift represents perhaps the greatest threat to AWS's perch atop the enterprise world since it grabbed that spot around a decade ago.
Outside looking in
While it chases AI business among existing customers, AWS remains shut out of one of the biggest money-makers of the generative AI era: OpenAI's compute budget. After Microsoft agreed last year to let OpenAI find computing capacity outside Azure's data centers, the GPT maker has since signed deals with CoreWeave, Oracle, and now Google Cloud.
On Wednesday OpenAI confirmed reports from earlier this year that it will tap Google Cloud for excess computing capacity as it trains future versions of its GPT models. The company will run ChatGPT Enterprise on Google Cloud's infrastructure in several regions around the world, which has got to be a little weird for the folks at Google given that it would love to call those enterprise customers its own.
But with billions in the bank, and quite a bit more in store should OpenAI convince Microsoft to modify their agreement and allow it to become a for-profit corporation, that kind of business is hard to turn down. And if Project Stargate falls short of the lofty expectations laid out in January, at some point OpenAI may have no choice but to turn to the cloud infrastructure leader.
Enterprise moves
Paul Smith is the new chief commercial officer at Anthropic; he is the first person to hold that role at the foundation model company, and joins after sales leadership roles at Microsoft, Salesforce, and ServiceNow.
ServiceNow's bid to acquire Moveworks will face additional regulatory scrutiny as it tries to move into the HR software market, according to Bloomberg.
Tom Krazit has covered the technology industry for over 20 years, focused on enterprise technology during the rise of cloud computing over the last ten years at Gigaom, Structure and Protocol.