Intel's new Xeons; Cloudflare eyes AI crawlers
Today on Product Saturday: Intel's badly needed new Xeon server processors, the most interesting launch from Cloudflare's birthday week, and the quote of the week.
Welcome to Runtime! The Saturday edition of this newsletter has been on hiatus for retooling, and we're ready to unveil the new format! Going forward, each Saturday we'll summarize some of the most notable product launches of the past week in enterprise tech, and tie up any other loose ends.
Today: Intel's badly needed new Xeon server processors, the most interesting launch from Cloudflare's birthday week, and the quote of the week.
(Was this email forwarded to you? Sign up here to get Runtime each week.)
Intel unveils new Xeons: Intel's road to recovery will depend on its ability to once again ship competitive server products at a regular cadence, and this week's launch of the Granite Rapids chips suggests it can still do exactly that. Intel also unveiled new Gaudi AI processors, which have the much harder job of going up against Nvidia's AI juggernaut.
But with Granite Rapids "Intel is once again trading blows with AMD, something that makes comparing socket-to-socket performance between the two a far less lopsided affair than it's been in years," according to a deep-dive analysis from The Register. The launch should give the beleaguered company some badly needed momentum going into the new year as cloud providers upgrade their fleets of servers in the fourth quarter.
Cloudflare's AI security camera: The large-language models powering the generative AI revolution were largely trained by scraping public web content, and for the most part site operators had no way of knowing whether or not they were participating in their own demise. As part of its annual birthday week of product launches, Cloudflare unveiled an AI-scraping detection tool that its customers can use to understand how their sites are being crawled by AI model-training companies.
As Cloudflare put it, "until recently, bots and scrapers on the Internet mostly fell into two clean categories: good and bad," but "the rise of AI Large Language Models (LLMs) and other generative tools created a murkier third category." Customers using the new tool will be able to block AI bots entirely, select which ones they will permit to crawl their sites, and even monetize that activity by setting a price for crawling.
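Cloudflare hasn't published the new tool's internals, but the signal behind that "murkier third category" is usually the crawler's self-declared user agent. As a rough sketch only — a hand-rolled Cloudflare Worker, not the new product, which runs inside Cloudflare's own network — blocking a chosen list of AI crawlers by user agent might look like this; the bot names are real published user agents, everything else is illustrative:

```typescript
// Illustrative sketch: block a chosen set of AI crawlers by user agent.
// GPTBot, ClaudeBot, CCBot, Bytespider, and PerplexityBot are real,
// self-declared crawler user agents; the Worker itself is hypothetical.
const BLOCKED_AI_CRAWLERS = ["GPTBot", "ClaudeBot", "CCBot", "Bytespider", "PerplexityBot"];

export default {
  async fetch(request: Request): Promise<Response> {
    const userAgent = request.headers.get("user-agent") ?? "";
    const matched = BLOCKED_AI_CRAWLERS.find((bot) => userAgent.includes(bot));

    if (matched) {
      // Deny the crawl; a paid-crawling scheme would negotiate terms here instead.
      return new Response(`${matched} is not permitted to crawl this site`, { status: 403 });
    }

    // Every other request passes through to the origin untouched.
    return fetch(request);
  },
};
```

The obvious limitation is that this only catches crawlers that identify themselves honestly, which is the gap a network-level detection tool is meant to close.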
System Initiative goes live: Allowing infrastructure teams to organize and manage their hardware and networks using code led to enormous breakthroughs in productivity and resilience that paved the way for modern cloud computing, and former Chef co-founder Adam Jacob was right in the middle of that revolution. Now with System Initiative, Jacob and team are trying to simplify managing the even more complex environments of today through a new type of user interface that creates a literal picture of infrastructure resources across clouds and on-premises environments.
Runtime profiled the new company and product last year, and the 1.0 version of the product, which for now only supports cloud instances running on AWS, launched on Wednesday. “It’s a revolutionary technology that we think is the future of DevOps automation,” Jacob told The New Stack.
Observability from front to back: Honeycomb has been one of the leading startups dragging the old-school infrastructure monitoring world into the observability era, and this week it announced a new tool for tracking the performance of front-end web applications. Honeycomb for Frontend Observability is based around the OpenTelemetry project and promises to allow customers to get a full picture of how their users are interacting with their applications in browsers as well as how those applications are interacting with the server.
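Because the product is built on OpenTelemetry, the browser side looks like standard OTel instrumentation. Below is a minimal sketch using the stock OpenTelemetry JavaScript web SDK; the endpoint URL and auth header are placeholders, not Honeycomb-specific configuration:

```typescript
// Minimal browser-side tracing sketch with the standard OpenTelemetry web SDK.
// Endpoint and header values are placeholders for whatever OTLP backend you use.
import { WebTracerProvider } from "@opentelemetry/sdk-trace-web";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { DocumentLoadInstrumentation } from "@opentelemetry/instrumentation-document-load";

// Ship spans over OTLP/HTTP to an observability backend of your choice.
const exporter = new OTLPTraceExporter({
  url: "https://collector.example.com/v1/traces", // placeholder endpoint
  headers: { "x-api-key": "YOUR_KEY" },           // placeholder auth header
});

const provider = new WebTracerProvider();
provider.addSpanProcessor(new BatchSpanProcessor(exporter));
provider.register();

// Automatically create spans for page loads in the browser.
registerInstrumentations({
  instrumentations: [new DocumentLoadInstrumentation()],
});
```

Tying those browser spans to backend spans via context propagation is what produces the "full picture" of a user interaction from click to server and back.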
Also this week, Grafana made its Cloud Provider Observability service generally available, allowing customers to get a better picture of how their applications are running across multiple cloud environments. It also introduced a product in the public preview stage that allows customers to analyze application performance without having to know their way around specific query languages.
Allen Institute goes multimodal: As Allen Institute for Artificial Intelligence COO Sophie Lebrecht put it earlier this year in an interview with Runtime, if we have any hope of understanding how large-language models will evolve and make an impact on humanity it's important that the research community has access to truly open models that can be studied at length. This week Ai2 introduced Molmo, a multimodal version of its flagship Olmo model that was released with fully open weights, code, and training data.
"The best in class model within the Molmo family not only outperforms others in the class of open weight and data models, but also compares favorably against proprietary systems like GPT-4o, Claude 3.5 and Gemini 1.5," Ai2 said in a press release. As we await a decision from the Open Source Initiative on how the term "open source" will be applied to AI models, this is probably as good as it's going to get.
A conscious effort to encourage programming in memory-safe languages like Go and Rust reduced the percentage of Android vulnerabilities caused by memory safety bugs from 76% to 24% over a six-year period, according to Google.
"Their departures made me think about the hardships parents faced in the Middle Ages when 6 out of 8 children would die prematurely. Despite the profound loss, the parents had to accept it and find deep joy and satisfaction in the 2 who survived." — OpenAI co-founder Wojciech Zaremba, commenting on the resignation of three OpenAI executives this week and taking that discussion to a place few expected.
The fallout from OpenAI's Wacky Wednesday continued to ripple through Silicon Valley: Sam Altman denied he is in line to receive a "giant equity stake" even as board members conceded that some sort of equity stake is being discussed, reports suggested that tension between OpenAI's original goals and its emerging for-profit business plans led to several of the departures, and the company is said to be burning about $5 billion a year.
Thanks for reading — see you Tuesday!