Monster API is a company that builds super-simple and ultra-slick APIs that make it easier to launch your applications in a scalable and affordable way. Pulse 2.0 interviewed Monster API CEO Saurabh Vij to learn more.
Saurabh Vij’s Background
Vij is a former particle physicist with extensive experience in building and running large computing systems to crunch numbers for physics. Vij said:
“Also, I ran two startups before starting Monster API with my brother Gaurav.”
“One of the startups I founded invented a wearable safety device for women in India; we collaborated with the Government of India and IIT Delhi to build it.”
“Gaurav has been a machine learning researcher and led his own computer vision startup, where he personally faced the challenge of training AI models, which cost him thousands of dollars every month, along with the challenges of setting up complex and expensive GPU systems.”
Formation Of MonsterAPI
How did the idea for MonsterAPI come together? Vij shared:
“Since Gaurav was bootstrapping his previous startup and spending exorbitant amounts on AWS every month to train these models, he discussed this issue with me, and I shared the idea of distributed computing that I had witnessed in the physics world.”
“Projects like LHC@home, SETI@home, and Folding@home have been leveraging distributed computing to power these mega compute-hungry projects for over two decades now.”
“It’s all about getting idle compute power from the millions of under-utilized laptops and gaming systems all over the world.”
“The core idea came from a few exciting questions my brother, and I asked:
— Why can’t we use this concept for machine learning?
— What technical change in the last five years can enable this?
— Can this help millions of other developers like us?
We had some experience in building crypto mining rigs at home, so that was our first attempt/experiment.”
“We tried to train an ML model on the GPUs inside a crypto rig. This didn’t work the first time, as it requires a lot more optimization to handle ML workloads because, by default, these rigs are designed for gaming, not AI or ML.”
“So, we built our own ML package that adds all the required drivers, libraries, AI frameworks, and containers, and after a few iterations, we managed to make it work.”
“Gaurav started using this approach, and his monthly bills dropped from thousands of dollars to $200/month.”
“That was the moment we realized that this could help millions of developers like us, and we started building the Monster API to solve all the different problems required to make this work at scale.”
Favorite Memory
What has been your favorite memory working for MonsterAPI so far? Vij reflected:
“The moment described in the previous answer is one of our favorite memories.”
“Another is when we launched a hardware version prototype of this at CES 2019.”
Challenges Faced
What challenges did Vij face in building the company, and has the current macroeconomic climate affected the company? Vij acknowledged:
“There have been many challenges:
1.) Attracting great talent when you are an unfunded startup with a contrarian idea is very challenging
2.) Convincing businesses to shift their workloads from AWS to a new method and an unknown startup was a big challenge. We countered this with a lot of case studies and benchmarks to validate the thesis that this decentralized approach to compute can deliver ten times better price/performance.
3.) Keeping up the speed of innovation: Bringing more solutions to different parts of the value chain.
4.) Security, fault tolerance, reliability: Each of these required a lot of technical effort.”
“The current macroeconomic climate is forcing a lot of businesses and startups to reconsider their current cloud choices, and thus they have started looking for better, more affordable alternatives.”
“Taking this perspective into account, we are going vertically deep in ML and reducing cost at each part of the ML pipeline to offer a 10 times more affordable solution.”
Core Products
What are MonsterAPI’s core products and features? Vij explained:
MonsterAPI core products include:
1.) Generative AI APIs: Access the latest AI models like Llama 2, Stable Diffusion XL, and Whisper via our APIs
2.) No-code LLM finetuner: Fine-tune the latest LLMs with ease without burning a hole in your pocket
3.) GPU infrastructure access: Access the latest and most powerful GPUs at a fraction of the cost.
Features:
1.) Access to AI models via affordable APIs that scale on-demand
2.) Optimised models running on a decentralized GPU network to reduce cost
3.) UI-driven LLM fine-tuning pipeline to reduce developer burnout with no-code implementation
4.) LLM fine-tuning to reduce the time and cost of processing
5.) Scalable, on-demand access to affordable GPUs from small data centers and crypto mining farms – Leading to more GPU optionality (from 8GB to 80GB GPUs available).
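As a rough illustration of how a developer might consume APIs like these, the sketch below assembles a request payload for a hosted text-generation model. The field names and the commented-out endpoint are illustrative assumptions for this article, not MonsterAPI’s documented schema.

```python
# Hypothetical sketch of calling a hosted generative-AI API like those
# described above. Payload fields and the endpoint path are illustrative
# assumptions, NOT MonsterAPI's documented schema.
import json


def build_generation_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Assemble a JSON payload for a text-generation request."""
    return {
        "model": model,            # e.g. an open model such as "llama2-7b"
        "prompt": prompt,
        "max_tokens": max_tokens,  # cap on generated length
    }


payload = build_generation_request("llama2-7b", "Write a haiku about GPUs.")
print(json.dumps(payload, indent=2))

# A real call would POST this payload with an API key, e.g. (hypothetical URL):
# requests.post("https://api.example.test/v1/generate", json=payload,
#               headers={"Authorization": "Bearer <API_KEY>"})
```

The point of such an interface is that scaling, queuing, and GPU selection happen behind the endpoint, so the developer only ever handles a small JSON payload.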
Evolution Of MonsterAPI’s Technology
How has MonsterAPI’s technology evolved since launching? Vij noted:
“1.) Initially, the APIs were not scalable. Now we have implemented a scalable orchestrator that handles the load on demand.
2.) The models were not optimized before deployment initially. Now they are, resulting in a very low cost per inference request.
3.) There was no LLM fine-tuning solution initially. We built one from the ground up on our decentralized infrastructure platform to handle developers’ LLM fine-tuning jobs on demand and at scale in a highly optimized way. Three core differentiators:
a.) Running this on decentralized infrastructure
b.) No-code UI: A Lego-set-style solution where developers can plug and play models and datasets to get started with fine-tuning
c.) Completely optimized pipeline: Reduced cost + Abstraction of complex LLM fine-tuning setup.”
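To make the “plug and play models and data sets” idea concrete, the sketch below pairs a base model with a dataset into a single job specification, the way a no-code UI might behind the scenes. Every field name here (base_model, lora_rank, etc.) is a hypothetical illustration, not MonsterAPI’s actual job schema.

```python
# Hypothetical sketch of a plug-and-play fine-tuning job specification.
# All field names are illustrative assumptions, NOT MonsterAPI's schema.


def build_finetune_job(base_model: str, dataset: str,
                       epochs: int = 3, lora_rank: int = 8) -> dict:
    """Combine a base model and a dataset into one job spec, mirroring
    the model + dataset pairing a no-code fine-tuning UI exposes."""
    return {
        "base_model": base_model,    # e.g. an open LLM such as "llama2-7b"
        "dataset": dataset,          # e.g. a hosted instruction dataset
        "hyperparameters": {
            "epochs": epochs,
            "lora_rank": lora_rank,  # parameter-efficient tuning keeps GPU cost low
        },
    }


job = build_finetune_job("llama2-7b", "alpaca-cleaned")
print(job["base_model"], job["hyperparameters"]["epochs"])
```

Abstracting fine-tuning into a declarative spec like this is what lets the platform route each job to whatever decentralized GPU is cheapest, without the developer touching drivers or frameworks.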
Significant Milestones
What have been some of MonsterAPI’s most significant milestones? Vij cited the following:
“Over $2 million in savings delivered to our users while serving them 700,000 compute hours.”
“We also recently closed our $1.1 million pre-seed round led by Carya Ventures from the Bay Area.”
Customer Success Stories
Can you share any specific customer success stories? Vij highlighted this example:
“One of our customers was spending over $300K annually on AWS for these GPUs. With Monster API, today they spend just $60K annually while getting the same number of GPUs and compute FLOPS, plus the benefit of an easy and intuitive UI compared to the confusing and clunky UI experience on traditional cloud platforms.”
“This simplicity saves them another cost: fewer MLOps engineers.”
Funding
How much funding has the company raised? Vij revealed that the company had raised $1.1 million.
Total Addressable Market
What total addressable market (TAM) size is MonsterAPI pursuing? Vij assessed:
“More than 30 million developers building AI-powered applications and fine-tuning AI models.”
“This number will reach 100 million by 2030 as AI becomes easier and more affordable to integrate.”
Differentiation From The Competition
What differentiates MonsterAPI from its competition? Vij affirmed the following:
1.) Low-cost GPU infrastructure
2.) Full ML stack: A developer can launch an instance within a minute, with no need to set up GPU systems with containers, libraries, AI frameworks, etc., as we provide pre-configured instances
3.) Optimised ML models: We don’t offer plain open-source models; we optimize them, which enables them to deliver more throughput and run on lower-cost consumer GPUs, leading to a 90% cost reduction in some cases
4.) No-code fine-tuner: Fine-tuning is becoming increasingly popular, but a developer still faces many challenges. With our no-code fine-tuner, developers can get started today, and since it runs on our unique infrastructure, it leads to much better pricing for them: an 80-90% cost reduction.
Future Company Goals
What are some of MonsterAPI’s future company goals? Vij concluded:
“Our vision is to bring AI to 8 billion people by 2030. It would require a monstrous effort and help from brilliant innovators and developers.”
“We want to serve 100 million Monsters/developers who will build AI-powered applications that touch the lives of 8 billion people on the planet.”