Google connects the clouds; DeepSeek re-enters the chat


Welcome to Runtime! Today on Product Saturday: Google's new cross-cloud lakehouse service lets BigQuery talk to S3, DeepSeek releases its most powerful open-source model to date, and the quote of the week.

Please forward this email to a friend or colleague! If it was forwarded to you, sign up here to get Runtime each week, and if you value independent enterprise tech journalism, click the button below and become a Runtime supporter today.


Ship it

Bridge over cloudy waters: Businesses tend to accumulate data in several different places over time due to acquisitions, new projects, or partnerships, and AI agents will need easy access to those disparate data sources to operate correctly. Google has been preaching the gospel of multicloud computing for a very long time, and this week at Cloud Next '26 it introduced new capabilities in its Lakehouse service that can tap into AWS and Azure data.

The new cross-cloud interconnect allows popular Google tools like BigQuery to query data stored in widely used storage services like AWS's S3 through the open-source Iceberg table format, the company said in a blog post. "This zero-copy data integration directly challenges the walled-garden approach of some competitors and signals a move toward a more fluid query fabric," Futurum Group's Brad Shimmin said in a blog post of his own.

Defense wins championships: The last few weeks have been a very interesting time for AI-powered security, following the announcement of Anthropic's Project Glasswing to study the capabilities of its Mythos Preview model. Google joined in this week at Cloud Next, introducing three new security agents designed to help defenders keep up with what many experts believe will be a surge in AI-powered attacks over the next few years.

The Threat Hunting agent "can help teams proactively hunt for novel attack patterns and stealthy adversary behaviors that bypass traditional defenses," and the Detection Engineering agent can help companies find gaps in their defenses and automate responses, the company said in a blog post. Another agent called Third-Party Context does pretty much what that name would suggest; it allows customers to pull in data from third-party sources to inform their security posture.

Four on the floor: DeepSeek caused quite a stir in AI circles more than a year ago with the initial release of its open-source model, which was more or less as good as anything the frontier labs had produced at that point yet was developed through a much cheaper process. This week it introduced two new models, V4-Pro and V4-Flash, which promise strong performance at a fraction of the cost of the top closed-source models.

"Welcome to the era of cost-effective 1M context length," DeepSeek said in a post on X introducing the new models, referring to their ability to process 1 million tokens from a single prompt. "DeepSeek’s advanced new V4 model reinforces China’s reputation for cost-efficient AI, though it is unlikely to trigger another 'DeepSeek Moment' that disrupts technology markets," according to Bloomberg Intelligence's Robert Lea.

Eyes wide open: Observability tools have been moving to incorporate the open-source OpenTelemetry standard for several years now, and Grafana Labs has been at the forefront of that push. But open-source tools can be more difficult to manage than a lot of end users would like, and this week at its GrafanaCON event the company unveiled new services that were designed to make it easier to use OpenTelemetry with Linux and Kubernetes.

New packages make it possible to install OpenTelemetry with a single command, and a new version of Grafana Alloy, the company's distribution of the OpenTelemetry Collector, can now be configured with YAML. "According to Grafana Labs’ 2026 Observability Survey, a majority of organizations are using OpenTelemetry or are actively migrating toward it, signaling a clear industry shift toward vendor-neutral instrumentation," the company said in a press release.
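For readers curious what YAML-based Alloy configuration might look like, here is a minimal sketch of a telemetry pipeline (receive OTLP data, batch it, forward it to a backend). The key names and structure below are illustrative assumptions, not Grafana's published schema; consult the Alloy documentation for the real syntax.

```yaml
# Hypothetical sketch only: component and attribute names are assumptions,
# modeled loosely on OpenTelemetry Collector-style pipelines.
receivers:
  otlp:
    grpc:
      endpoint: "0.0.0.0:4317"   # accept OTLP traces/metrics over gRPC

processors:
  batch:
    timeout: 5s                  # flush batches at least every 5 seconds

exporters:
  otlphttp:
    endpoint: "https://example-otlp-backend.internal/v1/traces"

pipelines:
  traces:
    receivers: [otlp]
    processors: [batch]
    exporters: [otlphttp]
```

The appeal of a YAML surface is familiarity: teams already managing Kubernetes manifests and Collector configs can reuse the same tooling and review habits rather than learning a bespoke configuration language.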

Skills issue: While AI-powered coding agents continue to transform software development, they're less adept at writing infrastructure-as-code for newer, more complicated AI infrastructure designs. Anyscale's Ray project is at the forefront of many of those AI infrastructure buildouts, and this week the company introduced new "skills" designed to work with Claude Code and Cursor to help developers provision infrastructure.

The three skills — Workload Skills, Platform Skills, and Infra Skills — allow developers to generate code for AI inference, debug workloads using observability tools, and deploy Ray right from those coding tools, the company said in a blog post. "While general-purpose agents can write Python, complex domains like distributed computing require specialized knowledge that AI coding agents simply do not have: how to plan for GPU memory, navigate the latest Ray APIs, or debug cluster failures," it said in the post.


Stat of the week

Tech leaders have had very little success generating a return on their AI investments over the past couple of years, but the capacity issues plaguing most major AI services show that demand for those services is definitely on the rise. However, new research from TE Connectivity shows that defining ROI is still tricky: 89% of respondents think they roughly understand how those investments will pay off, but "only 19% of executives admit to having 'full clarity' on how that ROI is defined and measured."


Quote of the week

"The experimental phase is behind us. And now the real challenge begins: How do you move AI into production across your entire enterprise?" — Google Cloud CEO Thomas Kurian, kicking off his keynote address Wednesday at Google Cloud Next with the put-up-or-shut-up question facing IT departments this year as agents gain momentum.


The Runtime roundup

Google announced that it will invest $10 billion in Anthropic this year and up to $40 billion over the next several years in exchange for spending commitments on its cloud platform, days after AWS announced similar plans.

Maine Governor Janet Mills vetoed a bill that proposed a moratorium on data-center construction until 2027, agreeing with its proponents that a pause is probably a good idea but killing the bill because it didn't exempt a project planned long before the current wave of local opposition started to grow.


Thanks for reading — see you Tuesday!
