Metal: Helping Developers Build Production-Grade LLM Apps Intended For Real-World Use

By Amit Chowdhry ● Aug 18, 2023

Metal is a fully managed service that enables developers to build LLM applications. Metal removes the complexity of building with the LLM stack, providing data transformation, infrastructure, querying, memory, APIs, etc. Pulse 2.0 interviewed Metal co-founder Taylor Lowe to learn more.


Pictured: Metal founders Sergio Prada, James O’Dwyer, and Taylor Lowe

Taylor Lowe’s Background

Lowe and his co-founders have worked in enterprise SaaS for much of their careers and have built products together for the past five years. Lowe said:

“Much of that time has been focused on developer tools and infrastructure. My co-founder James O’Dwyer previously worked on machine learning infrastructure at Spotify, and our other co-founder Sergio Prada was on the serverless observability team at Datadog. The three of us met at a startup that was later acquired by Meta, where we worked on machine learning products and the developer apps platform.”

Formation Of Metal

How did the idea for Metal come together? Lowe shared:

“We first started working in ML at our previous startup. For context, BERT had just come out, and we were tasked with building some of the first ML products at the company. The tech was so raw – so cumbersome. It was a far cry from what we were used to working with in a ‘mature’ stack. Timely releases and iteration were very difficult.”

“For example, it took us around six months to get a simple classification feature out the door. We built a data ingestion pipeline from scratch, had to maintain live datasets for each customer, and built real-time observability features…today you can build this feature in Metal in a few hours. But it was a great learning experience working with the raw metal of ML (excuse the pun). The developer experience was awful, and we really had to grind to get these features in front of customers. That stuck with us.”

“Fast forward to winter of 2022 and we were getting ready to join the YC W23 batch. We thought a lot about our painful experiences working with ML. We realized that with the explosion in AI and large language models (LLMs), there were going to be many more development teams about to experience what we went through. The majority of software engineers today have little hands-on machine learning experience, yet their organizations are asking them to implement AI and LLM features. We knew we wanted to build a tool to help developers avoid the pain we experienced — and even better, to make this technology a joy to work with for software engineers.”

Favorite Memory

What has been Lowe’s favorite memory working for Metal so far? Lowe reflected:

“There are a few! Recently, we put out our first case study. Proof points like this are so important in the early life of a company. This one was a triple whammy: 1.) it shows how you can use our product; 2.) it validates that people actually use it to solve problems; and 3.) it helps promote a customer (Lastro) who has been a great partner and team to work with.”

Challenges Faced

What challenges have Metal’s founders faced in building the company, and has the current macroeconomic climate affected the company? Lowe acknowledged:

“Something we learned early on is that since AI and LLMs are still so new to so many people, the average customer needs more support and expertise when getting started. We initially offered Metal as a pure SaaS offering, but now provide super hands-on support to help enterprises get their applications out the door. While this may require more resources from our team upfront, it means more customers will be successful as the market learns to work with these new tools. Over time we expect this to change as best practices and patterns emerge in the stack, but right now we are happily getting our hands dirty for customers who need it.”

Core Products


What are Metal’s core products and features? Lowe concluded:

“Metal helps developers build production-grade LLM applications that are intended for real-world use. We are a fully managed service that handles data transformation, storage, indexing, and retrieval. This removes complexity for developers who just want to build and ship applications for their users. For example, semantic search and the data retrieval that powers LLM chatbots work out of the box. And to help developers learn and iterate quickly, we provide observability into your application’s usage and performance.”
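To make the "embed, index, retrieve" loop Lowe describes concrete, here is a minimal, self-contained sketch of semantic search over a handful of documents. It is purely illustrative and does not use Metal's actual SDK or API; the toy bag-of-words `embed` function stands in for a real embedding model, and the in-memory `SemanticIndex` class stands in for the managed storage, indexing, and retrieval that a service like Metal would handle for you.

```python
# Illustrative sketch of the embed -> index -> retrieve pattern behind semantic
# search. NOT Metal's API: the embedding is a toy bag-of-words vector; a real
# application would call an embedding model and a managed vector index.
from collections import Counter
import math


def embed(text: str) -> Counter:
    # Hypothetical stand-in for a real embedding model.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


class SemanticIndex:
    def __init__(self) -> None:
        self.docs: list[tuple[str, Counter]] = []

    def index(self, text: str) -> None:
        # Store the document alongside its embedding.
        self.docs.append((text, embed(text)))

    def search(self, query: str, k: int = 3) -> list[str]:
        # Rank stored documents by similarity to the query embedding.
        q = embed(query)
        ranked = sorted(self.docs, key=lambda d: cosine(q, d[1]), reverse=True)
        return [text for text, _ in ranked[:k]]


if __name__ == "__main__":
    idx = SemanticIndex()
    idx.index("Metal is a managed service for building LLM applications.")
    idx.index("Semantic search retrieves documents by meaning, not keywords.")
    print(idx.search("how does retrieval for a chatbot work?", k=1))
```

In a production setting, the retrieved passages would be appended to the LLM prompt to ground the chatbot's answers; the pipeline above is the part Metal positions itself as managing end to end.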
