How Figma added scale and AI to its "multiplayer" design tool

Figma's collaboration tools are a hit with designers thanks to its decision to take a page from the gaming software playbook and rebuild its databases for "infinite scale."

Figma vice president of software engineering Abhi Mathur. (Credit: Figma)

A decade ago at Facebook, when the metaverse was still a literary device, Abhi Mathur's engineering teams built everything from databases to CDNs rather than rely on third-party vendors. Things work a little differently at Figma, which, like most startups of its age, left the undifferentiated parts of its tech stack to the experts and focused its attention on building and maintaining the technology that made it special.

That approach has served Figma and Mathur, the company's vice president of software engineering, quite well, even if its proposed $20 billion merger with Adobe fell apart last year. Interest in Figma's team-oriented design tools has grown steadily since the product launched in 2016, thanks in large part to a core platform that Mathur compared to a "gaming engine," much as Unreal Engine's technology is what makes Fortnite so compelling.

It's not really a gaming engine; "multiplayer editor" is how Figma describes the technology that allows designers to work together on a new website or marketing campaign in the main Figma product, or to collaborate on product vision and strategy using FigJam's digital whiteboard. Originally built in TypeScript and later rewritten in Rust, the platform is designed to let Figma users simultaneously edit design documents without compromising performance, Mathur said.
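To make the idea concrete, here is a minimal sketch of one common way a server-authoritative multiplayer editor can reconcile simultaneous edits: each object's properties are merged independently, and the most recent write to each property wins. The types and names below are invented for illustration and are not Figma's actual implementation.

```typescript
// Hypothetical sketch: a server-side document where concurrent edits to the
// same design object are merged property-by-property, last write wins.
// All names and shapes here are illustrative, not Figma's code.

type PropertyValue = string | number | boolean;

interface Edit {
  objectId: string;   // which design object the client changed
  property: string;   // e.g. "x", "fillColor", "text"
  value: PropertyValue;
  timestamp: number;  // server-assigned ordering
}

class MultiplayerDocument {
  // objectId -> property -> { value, timestamp }
  private state = new Map<string, Map<string, { value: PropertyValue; timestamp: number }>>();

  // Accept an edit only if it is newer than what is already stored for that
  // property; stale edits from slower clients are simply dropped.
  applyEdit(edit: Edit): boolean {
    const props = this.state.get(edit.objectId) ?? new Map();
    const current = props.get(edit.property);
    if (current && current.timestamp >= edit.timestamp) {
      return false; // a newer write already landed
    }
    props.set(edit.property, { value: edit.value, timestamp: edit.timestamp });
    this.state.set(edit.objectId, props);
    return true; // accepted; a real server would broadcast this to other clients
  }

  get(objectId: string, property: string): PropertyValue | undefined {
    return this.state.get(objectId)?.get(property)?.value;
  }
}

// Two designers touch the same rectangle at nearly the same time.
const doc = new MultiplayerDocument();
doc.applyEdit({ objectId: "rect-1", property: "x", value: 100, timestamp: 2 });
doc.applyEdit({ objectId: "rect-1", property: "fillColor", value: "#ff0000", timestamp: 3 });
doc.applyEdit({ objectId: "rect-1", property: "x", value: 50, timestamp: 1 }); // stale, ignored
console.log(doc.get("rect-1", "x"));         // 100
console.log(doc.get("rect-1", "fillColor")); // "#ff0000"
```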

"We want our experiences to be really snappy. Even on a client like a browser, you want it to feel like a very interactive, thin-client desktop-like experience," he said in a recent interview.

The core platform was also designed to make it easy for Figma to build new products on top of its existing investments, but Figma is very deliberate about how it adds those new features to the platform.

"Some companies are like, 'quickly launch it and test it with the customers.' We spend an enormous amount of time on craft and quality," Mathur said. "One of the things that we have is if there are new bugs on existing features, we prioritize [fixing] them over building new functionality."

Over the horizon

Figma's collaboration platform runs primarily on AWS, although the company has used some of OpenAI's models for its early generative AI experiments. Figma is one of AWS's largest database customers, Mathur said, and for years relied on a single Amazon RDS Postgres database to manage all the data associated with design files.

In 2020, Figma realized it needed something that could scale more efficiently and spread that workload across dozens of different databases. But by 2022, even that setup was pushing the limits of what RDS supported, and the company decided to horizontally shard those databases to improve performance.

Horizontal sharding splits a table's rows across multiple databases, grouping data around a specific key such as an individual Figma user. Until 2022, Figma's data had been partitioned vertically by data type, which meant all the images lived in one database while user authentication data lived in another.
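As a rough illustration of the difference, the hypothetical sketch below routes rows to one of several physical databases by hashing a shard key, while vertical partitioning routes by data type instead. The connection strings, hash choice, and function names are assumptions for the example, not details of Figma's setup.

```typescript
// Hypothetical sketch of shard routing: rows of one logical table are spread
// across several physical databases based on a shard key (here, a user ID).

const SHARD_URLS = [
  "postgres://shard-0.internal/figma",
  "postgres://shard-1.internal/figma",
  "postgres://shard-2.internal/figma",
  "postgres://shard-3.internal/figma",
];

// A simple, stable hash (FNV-1a) so the same key always maps to the same shard.
function fnv1a(key: string): number {
  let hash = 0x811c9dc5;
  for (let i = 0; i < key.length; i++) {
    hash ^= key.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return hash;
}

// Horizontal sharding: every row keyed by a user lands on the shard that
// user's ID hashes to, so each database holds a slice of the rows.
function shardFor(userId: string): string {
  return SHARD_URLS[fnv1a(userId) % SHARD_URLS.length];
}

// Contrast with vertical partitioning, where routing is by data type instead:
// e.g. all image metadata in one database, all auth data in another.
const VERTICAL_PARTITIONS: Record<string, string> = {
  images: "postgres://images-db.internal/figma",
  auth: "postgres://auth-db.internal/figma",
};

console.log(shardFor("user-42")); // the same user always routes to the same shard
```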

The database upgrade took 18 months but has left Figma in a place where it is "theoretically infinitely scalable," Mathur said. "Having pragmatic approaches to scale is something that we have thought about, and all of this has to happen while maintaining the foundation of reliability and performance, and, of course, expenses and costs."

Managing expectations

Figma worked closely with AWS on the database project, an experience that led Mathur to appreciate the benefits of letting partners do some of the heavy lifting rather than running every aspect of a project in-house, which is largely how Facebook operated during his tenure there.

"For example, we don't want any managed services for the multiplayer because it is so unique and core to [the experience], that we want to have tighter control," he said. "But there are services that are a commodity and can be replaced there by lots of managed services, where I don't have to spend time thinking about those, instead we can think about more creative solutions that are unique to us."

When it comes to generative AI, most enterprise tech companies are still trying to figure out which AI experiences are commodities and which ones will deliver actual differentiated value. Mathur is a big believer that the technology will help improve the Figma experience over time so long as it is built in a way that works across all its products and services, not just bolted on in hasty fashion.

"Our company's mission is to make design accessible to everyone, and I strongly believe AI can support that [mission], it can make designers much more productive and non-designers less scared of design," he said. 

Right now, Figma is looking at generative AI technology at the intersection of design and development, helping its customers turn designs built on the service into working code for their own products more quickly and reliably.

"There is still a lot of friction from imagination to design, designs often don't look like what is actually delivered," Mathur said. "How can we make some of these handoffs much easier using Dev Mode? How can we convert designs into more reusable code, which is sort of the direction in which we are moving."
