Welcome to Runtime! Today: Redpoint's Scott Raney on enterprise spending, AI startups, and open-source software, GitHub launches an enterprise-friendly Copilot, and the latest enterprise moves.
(Was this email forwarded to you? Sign up here to get Runtime each week.)
It became clear in 2023 that the incredible growth in enterprise tech spending during the early years of the pandemic was a mirage, rather than a new baseline. Layoffs, spending cuts, and the quiet desperation of countless startups trying to justify their early 2020s valuations were everywhere this year, and Redpoint's Scott Raney isn't sure how quickly the good times will return.
"I expect things to get better from here, but I don't expect things to snap back," Raney said in a recent interview. "I feel like I'm the old man when I say this, but there are a lot of people who just haven't been around for that long who are just waiting for it to get back to where it was, and it's probably not going to get there anytime soon."
Some excerpts from that interview follow.
On enterprise tech spending:
Enterprise buyers have dramatically cut back on their spend. At the peak of the bubble, spend on IT was pretty profligate. There were lots of things going on, enterprises throwing around a lot of money at a lot of different projects. And they've tightened their belts.
Maybe there used to be 10 big priorities, and there used to be money that flowed down to a bunch of different functional groups. Now, I think a lot of that spigot's been cut off, and functional groups and IT groups have probably trimmed back their priorities meaningfully. And you're either one of those priorities or you're not.
It's been, by and large, a pretty challenging time for a lot of these companies. And I don't even think we've felt the real pain yet, because most of these companies are sitting on top of so much money that (they're) not faced with the existential threat of going out of business or facing really, really difficult financings. That reckoning is likely to come at some point next year. And I think that'll be a difficult time.
On AI startups versus incumbents:
It's about the blocking and tackling and the execution of building a world-class product. And I just happen to believe that time and time again, it's been shown that a small group of really talented people focused on doing one thing, and aligning an entire company around doing that, will ultimately build a better product.
One of the areas in which we perceive there to be a lot of opportunity is the picks and shovels that are going to be required to allow developers to build world-class AI applications. Going after enterprise developers and saying, "What can we do to help you take advantage of this wave of AI to build applications that will allow you to extract the value that is going to be created to benefit your business?"
On open-source software:
What's happened is there's gradually been a move away from foundation-driven open source, like the Linux Foundation model, where software is donated to a foundation and then businesses are built around it, toward company-driven open-source projects. If you look at recent examples like the Confluents of the world, the Elastics, the HashiCorps of the world and others, where companies are the driving force behind the open-source movement in conjunction with a very vibrant and important community, it has just gotten to the point where there are different ways of thinking about the creation of these businesses and capitalizing on these opportunities.
I think the important thing to keep in mind here is that it's open source, and developers still have the ability to use the software as open-source software. We're incredibly bullish about the future of open source and what it means, and I think the most important pieces of software that are going to be created for the enterprise infrastructure stack are going to be open source in nature.
Our code, our copilot
Almost every CIO and engineering manager I've talked to this year has either signed up their team for GitHub Copilot or at least kicked the tires on the AI coding assistant. This week GitHub announced a new service that could tempt those still standing on the sidelines.
GitHub Copilot Enterprise will allow companies to roll out a coding assistant that's trained on their own internal repositories of code. That could allow new developers to get up to speed much faster, as most companies follow slightly different rules for writing code, and could also avoid any legal issues that might surface down the road from using code suggested by an LLM trained on public code.
At $39 a month per developer, Copilot Enterprise isn't cheap, but many of the companies already using Copilot are paying $19 a month per developer for Copilot for Business. Given how much money companies are already spending on developer time, Copilot Enterprise might be a much better use of that budget than the similarly priced Microsoft 365 Copilot for the rest of us office drones.
Enterprise moves
Brandon Sweeney is the new president and chief operating officer at dbt Labs, joining the company after serving as the chief revenue officer at HashiCorp for the last several years.
Motti Finkelstein is the new CIO at Intel, after joining the company last year.
Margaret Arakawa is the new chief marketing officer at IonQ, landing at the quantum computing company after serving in a similar role at Fastly.
Alexis Black Bjorlin has joined Nvidia to lead its DGX Cloud business, according to The Information, after two years leading Meta's infrastructure team.
The Runtime roundup
Microsoft briefly restricted the internal use of OpenAI's ChatGPT on company devices on Thursday, according to CNBC, despite having bet pretty much the entire future of the company on that technology.
Sumo Logic advised customers to reset their API keys after disclosing that it was hit with a security breach last week.
Twilio beat Wall Street expectations for profit and revenue and raised its guidance for the full year, which made the day traders happy.
Anthropic plans to use Google Cloud's newest TPU chips to train its Claude LLM, while also somehow running "the majority of its workloads" on AWS, according to Bloomberg.
Thanks for reading — see you Saturday!