Anthropic creates a Skills issue; Oracle jumps in the lake
Today on Product Saturday: Anthropic may have found a way to deliver on the promise of agents, Oracle puts up a lakehouse, and the quote of the week.
Welcome to Runtime! Today on Product Saturday: Anthropic may have found a way to deliver on the promise of agents, Oracle puts up a lakehouse, and the quote of the week.
(Please forward this email to a friend or colleague! And if it was forwarded to you, sign up here to get Runtime each week.)
Skills to pay the bills: After more than a year of promises that AI agents are going to change the world, Anthropic might have done more actual work in service of that goal than any other company in enterprise tech. Last year it introduced the Model Context Protocol (MCP), and this week it launched Agent Skills, a new feature for its Claude chatbot that provides "custom onboarding materials that let you package expertise, making Claude a specialist on what matters most to you," it said in a blog post.
LLMs tend to work much better when the user writes a comprehensive prompt with plenty of context about the task at hand, but writing that prompt over and over again is a real pain. Skills offer an interesting shortcut that could also save money, according to Simon Willison: "This is very token efficient: each skill only takes up a few dozen extra tokens, with the full details only loaded in should the user request a task that the skill can help solve."
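Willison's point about token efficiency can be illustrated with a small sketch. Everything below — the skill names, descriptions, and the name-matching trigger — is hypothetical; it only models the general idea of keeping short skill descriptions in context and pulling in a skill's full instructions on demand.

```python
# Hypothetical sketch of "progressive disclosure": short skill descriptions
# always sit in the prompt, while each skill's full instructions are loaded
# only when a request actually calls for them.

SKILLS = {
    "pdf-report": {
        "description": "Fill out and format PDF reports",              # always in context
        "body": "Step-by-step instructions, templates, scripts ...",   # loaded on demand
    },
    "brand-voice": {
        "description": "Write copy in the company brand voice",
        "body": "Tone guide, banned phrases, example paragraphs ...",
    },
}

def build_context(user_request: str) -> str:
    """Assemble prompt context: cheap one-line metadata for every skill,
    full skill bodies only for skills the request mentions."""
    lines = [f"- {name}: {meta['description']}" for name, meta in SKILLS.items()]
    context = "Available skills:\n" + "\n".join(lines)
    for name, meta in SKILLS.items():
        if name in user_request:  # toy trigger; the real product judges relevance itself
            context += f"\n\n[skill:{name}]\n{meta['body']}"
    return context
```

A request that never invokes a skill pays only for the one-line descriptions; a request that mentions `pdf-report` pulls in that skill's full body.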
Chainguard for MCP: Speaking of MCP, almost every company working on agentic AI tools has added MCP support to its products, but security remains a major concern. Software developers worried about supply-chain security can use verified packages from companies like Chainguard rather than something they found on the internet, and a new company called MCPTotal just launched a similar platform for MCP servers.
"MCPTotal delivers the first end-to-end platform for organizations to safely adopt MCPs while also improving their usability," the company said in a press release. Early adopters of AI agents have had the most success when they feed their agents with as much data as possible, and MCP makes it much easier to connect the dots.
Big enough for a yacht: The data-management industry started putting up lakehouses years ago, and in keeping with tradition, Oracle is just getting around to it now. This week at Oracle AI World it announced the Oracle Autonomous AI Lakehouse, the next generation of its Autonomous Data Warehouse that now comes with support for Apache Iceberg.
Iceberg allows companies to store data in a format that enterprise tech has increasingly embraced as an open standard, which in theory makes it much easier to move that data wherever it is needed. Oracle also said the new lakehouse will support the catalog formats from both Databricks and Snowflake, and that customers can run it on the Big Three cloud platforms in addition to Oracle's own infrastructure service.
Storage unit: If CoreWeave has a long-term future as a cloud provider, it will need to offer customers more than just easy access to Nvidia GPUs. The company has been beefing up its software strategy over the last year, and this week it introduced a new storage system it said was designed specifically for AI workloads.
CoreWeave AI Object Storage "was engineered for the unique demands of AI, using a distributed architecture that separates compute from storage while preserving ultra-low-latency data access," the company said in a blog post. The idea was to make it easier and faster to move data between cloud regions, and there are no egress fees, which will attract some attention.
Once more, with feeling: Intel found itself in its current predicament thanks to a series of product debacles, and the only way it will get out of that mess is to develop new products that will force enterprise tech to pay attention. It has tried and failed several times to develop a proper competitor to Nvidia's GPUs, but got back on that horse this week with the introduction of Crescent Island, a new GPU design that is going after efficiency-minded buyers.
The new chip "is being designed to be power and cost-optimized for air-cooled enterprise servers and to incorporate large amounts of memory capacity and bandwidth, optimized for inference workflows," Intel said in some sort of Axios-like blog post. That could be an interesting strategy given that future top-of-the-line GPUs are expected to require liquid cooling, which has forced cloud providers to redesign their data centers.
Given the zeal with which technology vendors have thrown themselves into generative AI, you'd think they smell money. As it turns out, making that money is proving more difficult than expected: "Among those developing AI products or features, 70 percent say delivery costs are eroding margins, with 52 percent planning new pricing models to mitigate cloud spend," according to a new report from Revenera.
"Having listened closely to my fellow San Franciscans and our local officials, and after the largest and safest Dreamforce in our history, I do not believe the National Guard is needed to address safety in San Francisco." — Salesforce CEO Marc Benioff, a resident of Hawaii, finally coming to his senses Friday and bringing a fitting conclusion to a Dreamforce week during which nobody talked about Salesforce and everybody talked about him.
Oracle's stock fell sharply Friday as investors started to realize that its future revenue projections and cloud deals with companies like OpenAI are, shall we say, aspirational.
Thanks for reading — see you Tuesday!