Release the llamas

Welcome to Runtime! Today: why Meta wants (almost) everyone to use its large-language model, Microsoft tries to inspire its partner community, and the latest funding rounds for enterprise tech startups.

(Was this email forwarded to you? Sign up here to get Runtime each week.)


Move deliberately and "open" things

While the history books written about the Facebook/Meta conglomerate will likely focus on different things, the company has been a consistent supporter of open-source technology in hardware and software over the last 15 years. It has a twist on that strategy for the generative AI era, which could give companies and governments around the world a foundation on which to build the AI services of your dreams and nightmares.

Meta announced the release of its Llama 2 foundation model Tuesday, allowing the technology to be used for commercial purposes under certain conditions. Earlier this year it opened up the model to academic research, and in May, despite pushback from critics who felt it was moving too fast, it signaled plans to expand the model's distribution.

  • "Giving businesses, startups, entrepreneurs, and researchers access to tools developed at a scale that would be challenging to build themselves, backed by computing power they might not otherwise access, will open up a world of opportunities for them to experiment, innovate in exciting ways, and ultimately benefit from economically and socially," the company said in a blog post.
  • The release was announced in partnership with Microsoft at its Inspire conference (more on that below), and Llama 2 will be available on both Azure and AWS.
  • That Meta is now in direct competition (of sorts) with Microsoft's close partner OpenAI doesn't seem to have fazed anyone.
  • The new version was trained on 40% more data than its predecessor, and Meta will also release the model weights along with the code; a quick sketch of what working with those weights looks like follows this list.
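
To give a feel for what releasing the weights means in practice, here is a minimal, illustrative sketch of loading and prompting the model with the Hugging Face transformers library. The checkpoint name and generation settings below are assumptions for illustration, not Meta's official instructions, and access to the weights requires accepting Meta's license terms first.

    # Minimal sketch: load a Llama 2 checkpoint and generate a short completion.
    # The model ID below is an assumed Hugging Face checkpoint name; access to it
    # is gated behind Meta's license.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    MODEL_ID = "meta-llama/Llama-2-7b-chat-hf"  # smallest chat-tuned variant

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    prompt = "Explain what a foundation model is in one sentence."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=60)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Even the smallest variant needs a GPU with a healthy amount of memory to run comfortably, which is part of why the managed hosting options on Azure and AWS matter for most companies.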

Meta has shared the fruits of its internal research and development with the public several times in its history.

  • The most significant of those efforts was probably the Open Compute Project, which provided a blueprint for building more powerful and more efficient data centers at a time when that information was considered a core secret by the hyperscalers.
  • More recently, the company released React, a widely used library for building user interfaces, and PyTorch, a framework for machine-learning development.

However, it's not clear exactly how "open" Llama 2 will be. The license comes with restrictions, including a requirement that the very largest companies strike a separate agreement with Meta, which puts it outside most standard definitions of open source.

Still, OpenAI's GPT models deserve competition, and while it might not be pure open source, Llama 2 will make it cheaper for companies to jump on the generative AI train.

  • There's also a good startup opportunity here to provide support and services around Llama 2, as tends to happen around any popular but complicated open-source project.
  • Critics of open-sourcing these models worry that, with few restrictions in place, they could be used by any number of bad actors for applications that would be prohibited under a regular commercial arrangement.
  • Others believe having the code behind these projects out in the open will actually make it easier to understand what they're capable of doing and react accordingly.

It's becoming clear that there's still a long way to go before enterprise tech figures out how best to utilize these technologies.

  • With both commercial and genuinely open-source models arriving almost weekly, companies will have a lot of options to sort through.
  • And integrating those models with corporate data will be harder than it sounds, as companies start to realize their approaches to data management are outdated (see the sketch after this list for the basic pattern).
  • But the building blocks are coming together that will allow businesses of all sizes to chart their own AI path.
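
To make the data-integration point concrete, here is an illustrative sketch, not any particular vendor's product, of the common pattern: retrieve the internal documents relevant to a question and fold them into the prompt before it reaches a model. Real systems use embeddings and a vector store rather than the naive keyword matching shown here, and the documents and names are made up.

    # Illustrative retrieval-then-prompt pattern for grounding a model in
    # corporate data. Documents and scoring are deliberately toy-sized.
    internal_docs = {
        "vacation-policy": "Employees accrue 1.5 vacation days per month of service.",
        "expense-policy": "Expenses over $500 require written manager approval.",
    }

    def retrieve(question: str, k: int = 1) -> list[str]:
        """Rank documents by crude keyword overlap with the question."""
        words = set(question.lower().split())
        ranked = sorted(
            internal_docs.values(),
            key=lambda doc: len(words & set(doc.lower().split())),
            reverse=True,
        )
        return ranked[:k]

    def build_prompt(question: str) -> str:
        context = "\n".join(retrieve(question))
        return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

    # The assembled prompt would then be sent to whichever model a company
    # picks, hosted or self-managed.
    print(build_prompt("How many vacation days do I accrue per month?"))

The hard part in practice isn't assembling the prompt; it's keeping the underlying documents clean, current, and access-controlled, which is where outdated data-management practices start to bite.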

Howdy partner

The Meta partnership was certainly the headline news at Microsoft's Inspire conference Tuesday, but the company also revealed a little bit more about how it plans to pay for its massive investments in OpenAI.

Partners are an enormous part of every enterprise tech company's distribution strategy, and that's especially true for Microsoft given its longevity in this market. A year after rebranding its partner program to focus on Azure, Microsoft is now calling it the "AI Cloud Partner Program," and it will roll out incentive packages to encourage partners to sell the fancy new AI stuff.

It also set the price for Microsoft 365 Copilot, which will probably be one of the first entry points to generative AI for the average office worker, at $30 per user per month. That's a lot, considering the entire Office suite costs about that much or less per user per month depending on which plan your company is on; for a 1,000-seat company, Copilot alone would add $360,000 a year on top of existing licensing.


Enterprise funding

Netcraft landed $100 million in private-equity funding, the first funding round raised by the cybersecurity company since it was founded in 1995.

Wing Cloud emerged from stealth mode with $20 million in seed funding to support the development of its new programming language for managing cloud infrastructure.


The Runtime roundup

Adobe and Citrix customers kicked off the week rushing to patch separate zero-day flaws disclosed by Rapid7, according to Ars Technica.

Google plugged a security hole in Cloud Build that could have led to the next software supply-chain security attack.

A Microsoft Azure Container Apps update went haywire, preventing some users of the tool from seeing log data.


Thanks for reading — see you Thursday!

Correction: This newsletter was updated to reflect that Llama 2 is available on AWS today.
