OpenAI raises the AI coding stakes; MongoDB goes beyond the DB


Welcome to Runtime! Today on Product Saturday: OpenAI's new coding-specific version of GPT-5 hits the terminals, MongoDB gets into the application modernization game, and the quote of the week.

(Was this email forwarded to you? Sign up here to get Runtime each week.)


Ship it

A model for Codex: The launch of OpenAI's GPT-5 large-language model last month was a bit underwhelming for those who were expecting a bigger leap from the new generation, but developers in the market for AI coding assistants got a little something extra this week. OpenAI released a custom version of GPT-5 designed specifically to work with its Codex coding assistant, promising in a blog post that "it’s more steerable, adheres better to AGENTS.md instructions, and produces higher-quality code — just tell it what you need without writing long instructions on style or code cleanliness."

OpenAI also said the new model "has been trained specifically for conducting code reviews and finding critical flaws," a welcome move beyond straight code-generation tools to emphasize features that help developers ship working software. YouTuber Theo Browne thought the new model was an excellent step forward for OpenAI but urged the company to fix its web interface and extensions.
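For anyone who hasn't run into it, AGENTS.md is a plain Markdown file kept in a repository that tells coding agents how the project wants to be built, tested, and styled. The sketch below is purely illustrative; the section names and commands are assumptions, not anything published by OpenAI.

```markdown
# AGENTS.md (illustrative sketch; sections and commands are assumptions)

## Setup
- Install dependencies with `npm install`

## Code style
- TypeScript strict mode; prefer small, pure functions

## Testing
- Run `npm test` and make sure it passes before proposing a change

## Pull requests
- Keep diffs focused and explain any schema or API changes in the description
```

There is no required schema; the point is simply to give an agent the same onboarding notes a human contributor would get.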

Java's silver release: Developers cranking away on AI coding assistants are probably using Python, TypeScript, or Rust to write their code, but even after all these years Java remains an indispensable component of modern software. This week Oracle released Java 25, a milestone version that the company said it would support for eight years.

The new features were designed to make Java more flexible and accessible to new developers, and to improve the performance of AI applications by reducing the memory footprint needed to run those apps, according to a press release. "With AI, people work with very large amounts of data, so this allows them to handle more AI workloads at lower cost thanks to improved density," Oracle's Bernard Traversat told Developer Tech News.
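One concrete example of the "accessible to new developers" angle is JDK 25 finalizing compact source files and instance main methods (JEP 512), which drop the class-declaration and String[] args boilerplate from a first program. A minimal sketch, assuming a single-file program run straight from source:

```java
// HelloJava25.java: a JDK 25 "compact source file" (JEP 512, final in Java 25).
// No class declaration and no String[] args; run it directly with:
//   java HelloJava25.java
void main() {
    System.out.println("Hello, Java 25");
}
```

The memory-footprint gains Traversat points to are a separate thread; they are most commonly associated with compact object headers (JEP 519, also in Java 25), an opt-in feature enabled with -XX:+UseCompactObjectHeaders that shrinks per-object overhead on the heap.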

AMPed up: Deferred maintenance has a way of catching up with everybody, and that's especially true for companies that rely heavily on applications built back when Donald Trump was a TV star. This week MongoDB moved beyond its database roots to launch a new service that promises to help customers modernize older applications before they get even harder and more expensive to update.

MongoDB AMP consists of "an AI-powered software platform, a demonstrated delivery framework, and experienced AMP delivery engineers who oversee and guide the implementation process," the company said in a press release. "While AMP doesn’t represent vertical integration between the database and the app platform in the direct sense, it’s absolutely a move towards database players taking more ownership of the application stack," according to Rachel Stephens of RedMonk.

Your call is important to it: If AI agents are going to gain a foothold inside the enterprise, the customer-service department is easily the most promising candidate. Contact-center company ASAPP updated its flagship GenerativeAgent platform this week with new features that it said will make its customer-service agents more reliable and easier to build.

Putting your brand's reputation in the hands of an AI agent is still a bit of a leap of faith, and the new features in the platform focus on giving contact-center managers new ways to monitor conversations, understand what happened when something goes wrong, and fix it by updating the content or workflow that the agent is drawing from. Developers can also tap into a new API hub that ASAPP said in a press release would "speed and simplify how to connect workflows and orchestrate across systems."

Graphing calculator: Earlier this year Microsoft CEO Satya Nadella called Microsoft Fabric "the fastest-growing data analytics product in our history," just two years after it made its debut at Build 2023. This week Microsoft added two new features to Fabric that it believes will help enterprise customers get over the persistent last-mile problem of deploying generative AI apps.

Graph in Fabric is a new "low/no-code platform for modeling and analyzing relationships across enterprise data," and Maps in Fabric "brings geospatial analytics into Fabric, enabling users to visualize and enrich location-based data at scale," the company said in a blog post. The graphing capabilities were "perhaps the last big missing piece" in Fabric, Microsoft's Arun Ulagaratchagan told Silicon Angle, and they're based on technology developed at LinkedIn.


Stat of the week

Generative AI is clearly having its biggest impact on the process of software development, but operations teams have also been looking for ways to take advantage of its capabilities. According to new research from DuploCloud, "67% of teams increased AI investment in the past year, yet adoption remains focused on observability, analysis, and copilots. Only a small minority trust AI to take action in their infrastructure."


Quote of the week

"With AI infrastructure investments continuing to grow with the company expecting between $3 [trillion] to $4 [trillion] in total AI infrastructure spend by the end of the decade, the chip landscape remains [Nvidia’s] world, with everybody else paying rent, as more sovereigns and enterprises wait in line for the most advanced chips in the world.” — Dan Ives, tech analyst at Wedbush, on Nvidia's historic deal with Intel as told to The Guardian.


The Runtime roundup

Oracle and Meta are in discussions about a cloud-infrastructure services deal that could be worth up to $20 billion, according to Bloomberg; that would be a far more serious undertaking than Oracle's fingers-crossed $300 billion deal with OpenAI.

Meanwhile, Meta filed an application to join the Big Three cloud providers as a wholesale power trader, which would allow it to resell electricity on the open market should it find itself with excess capacity.


Thanks for reading — see you Tuesday!
