Lemurian Labs: How This Company Is Solving AI’s Compute Capacity Limitations

By Amit Chowdhry • Jan 19, 2024

Lemurian Labs is a company focused on unleashing AI’s capabilities for humanity’s benefit. To fulfill this purpose, Lemurian Labs is developing a general-purpose AI accelerator designed for scalability and capable of orders-of-magnitude better performance and efficiency than legacy processors. Pulse 2.0 interviewed Lemurian Labs CEO and co-founder Jay Dawani to learn more about the company.

Jay Dawani’s Background

Dawani majored in applied mathematics but has predominantly worked in artificial intelligence and robotics, trying to push the boundaries of both fields. Dawani said:

“Before co-founding Lemurian Labs, I served in a number of leadership roles at companies, including BlocPlay and Geometric Energy Corporation, where I led projects that covered a broad spectrum of technologies, including quantum computing, metaverse, web3, gaming, AI, autonomous systems, algorithmic trading, and space robotics.”

“This experience has given me a unique perspective on the intersections of these fields and their potential for transformative change. Additionally, I’ve had the privilege of serving as an advisor to organizations like NASA Frontier Development Lab and SiaClassic, as well as collaborating with leading AI firms. Beyond my professional endeavors, I’m also the author of ‘Mathematics for Deep Learning,’ which I wrote in an effort to lower the barrier to entry into the field by making the underlying math less intimidating.”

Formation Of Lemurian Labs

How did the idea for Lemurian Labs come together? Dawani shared:

“When my co-founder and I originally founded Lemurian Labs in 2018, we were working on a RoboOps platform and a foundation model for general-purpose autonomy to make it easier for the field to adopt AI. The problem, we soon realized, was that the amount of compute we would need to bring it to life was completely unattainable, and that even if we somehow managed to get that immense amount of compute, there was no existing edge processor on which we could deploy it. This really got us thinking about the hardware challenges and why we don’t have better solutions.”

“We did a super deep dive into compilers and computer architectures and came to the realization that if we co-designed a general-purpose architecture with software, there was close to 1,000X to be gained. With this insight, we started with the software stack, working our way down from PyTorch and the workloads through the various lowerings in the compiler, and then started thinking about the architecture for the accelerator based on how the software wanted to run. Along the way, we created a new way of representing numbers that carries more information per bit than floating point, which also opens up a lot of opportunities in the design space for the computer architecture, and that is very fascinating.”
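Lemurian has not published the internals of its number format, but the general idea behind a logarithmic number system (the family a fully logarithmic format like PAL belongs to) is easy to sketch: store each value as a sign plus the base-2 logarithm of its magnitude, so that multiplication becomes a simple addition of exponents. The snippet below is a minimal, hypothetical Python illustration of that general principle only; it is not PAL.

```python
import math

# Minimal sketch of a logarithmic number system (LNS) value:
# a sign plus the base-2 logarithm of the magnitude.
# This illustrates the general idea only, not Lemurian's PAL format.

def to_lns(x: float) -> tuple[int, float]:
    """Encode a nonzero float as (sign, log2|x|)."""
    return (1 if x >= 0 else -1, math.log2(abs(x)))

def from_lns(v: tuple[int, float]) -> float:
    """Decode (sign, log2|x|) back to a float."""
    sign, e = v
    return sign * (2.0 ** e)

def lns_mul(a: tuple[int, float], b: tuple[int, float]) -> tuple[int, float]:
    """Multiplication in the log domain is just an add of exponents,
    which is far cheaper in hardware than a floating-point multiply."""
    return (a[0] * b[0], a[1] + b[1])

x, y = to_lns(3.0), to_lns(0.25)
print(from_lns(lns_mul(x, y)))  # 0.75
```

The hard part of any logarithmic format is addition, which Dawani touches on below.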

Favorite Memory

What has been your favorite memory working for the company so far? Dawani reflected:

“I don’t know if there’s a particular favorite memory that stands out. I have a love for problems and learning, and every day at Lemurian, I get to work with an incredible group of people who teach me new things. We’ve worked together on solving entirely new problems, which tend to create new problems, which lead to new and often surprising learnings.”

“When you’re helming a startup, you’re in essence trying to find the quickest path from chaos to clarity, and more often than not that means moving quickly from problem to problem and thinking less about what went right. But it’s always exciting and memorable when you come out the other end of a dark tunnel after spending a lot of time on a hard problem, like solving logarithmic addition or running workloads on our architecture simulation and getting results for throughput and efficiency. Basically, any time we manage to demonstrate real and meaningful results.”
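For context on why “solving logarithmic addition” is considered a hard problem: if two positive values are stored as x = log2(a) and y = log2(b), then log2(a + b) = x + log2(1 + 2^(y - x)), so an add in the log domain requires evaluating a nonlinear function rather than a single adder. The sketch below shows the standard textbook identity in Python; how Lemurian actually handles this in hardware has not been disclosed.

```python
import math

def lns_add(x: float, y: float) -> float:
    """Add two positive values stored as base-2 logarithms.

    With a = 2**x and b = 2**y, log2(a + b) = x + log2(1 + 2**(y - x)).
    The log2(1 + 2**d) term (a 'Gaussian logarithm') is the expensive part:
    hardware LNS designs typically approximate it with lookup tables or
    piecewise functions. This is the generic textbook formulation,
    not Lemurian's approach.
    """
    if y > x:
        x, y = y, x          # keep the exponent difference non-positive
    return x + math.log2(1.0 + 2.0 ** (y - x))

# 3 + 5 = 8, computed entirely in the log domain:
a, b = math.log2(3.0), math.log2(5.0)
print(2.0 ** lns_add(a, b))  # 8.0 (up to floating-point rounding)
```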

Core Products


What are the company’s core products and features? Dawani explained:

“The way I see it, we have two products. One is our software stack, which ingests PyTorch models and can target heterogeneous compute, meaning it can run on CPUs, GPUs, and our accelerator. Then we have our accelerated computing platform, the Spatial Processing Unit (SPU), which is still under development. The accelerator is designed around what our software wants. A big focus for us in designing the SPU has been greater throughput from higher utilization and a massive reduction in power consumption.”
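To make “ingests PyTorch models and can target heterogeneous compute” concrete, the sketch below shows the general pattern using only standard PyTorch APIs: take an unmodified model, move it to whichever device is available, and hand it to a pluggable compiler backend via torch.compile. The backend shown is PyTorch’s built-in ‘inductor’; a vendor stack such as Lemurian’s would register its own backend, whose name is not public, so treat this purely as an illustration of the pattern rather than Lemurian’s actual stack.

```python
# A minimal sketch of compiling a PyTorch model for whichever device is
# available and routing it through a pluggable compiler backend.
# This uses only stock PyTorch; it is not Lemurian's software stack.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10))

# Target an accelerator if one is present, otherwise fall back to the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

# "inductor" is PyTorch's default compiler backend; a vendor toolchain
# would register and name its own backend here (hypothetical for an SPU).
compiled = torch.compile(model, backend="inductor")

x = torch.randn(32, 512, device=device)
print(compiled(x).shape)  # torch.Size([32, 10])
```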

“A few of our key features and differentiators include:

1.) AI compiler: We’ve designed a full software stack, from PyTorch down, that can run on heterogeneous clusters while giving developers insane speedups and productivity gains.

2.) PAL: PAL is the first fully logarithmic number format, and as a result of how it is designed, it can represent numbers and do arithmetic much more accurately and efficiently than floating point. This enables us to pack a lot more math into the same area as others while consuming less energy, resulting in massive performance gains.

3.) SPU: In simulations, our architecture has demonstrated that it can achieve up to 20X the throughput of legacy GPUs in the same power envelope on benchmark workloads, without sacrificing programmability.

4.) Cost-Effectiveness: We’ve managed to achieve remarkable performance gains while reducing cost to just one-tenth that of traditional hardware. This cost-effectiveness is a game-changer for the economics of AI data centers, and it will make AI development accessible to a much broader range of companies and individuals.

5.) Environmental Sustainability: Our approach significantly reduces power consumption, contributing to a more eco-friendly approach to AI development. This achievement is particularly noteworthy in a world increasingly concerned about the environmental impact of technology.”

Challenges Faced

What challenges has Dawani faced in building the company? Dawani acknowledged:

“You mean other than trying to accomplish a moonshot on a shoestring budget? There are a lot of bottlenecks that the industry in general is facing, from availability of compute to semiconductor supply chain challenges, but none of that is affecting us right now. If anything, I think the challenge is a general shortage of the kind of engineers we’re looking to hire.”

Evolution Of Lemurian Labs’ Technology

How has the company’s technology evolved since launching? Dawani noted:

“Things are always evolving; we’re always experimenting, learning, and using what we’ve learned to improve our core technology. Our proprietary number format, PAL, has gone through several phases of improvement, and we’re still discovering ways of making it better every day. Then there is our Spatial Processing Unit, for which we had to address some key challenges that have held back dataflow processors in the past, and just recently we found some new opportunities to improve it quite a bit more. A big focus for us right now is software, which is something we are insistent on getting right from the get-go, so most of our effort goes into it. Over the past year, we’ve had some exciting developments that we’re hoping to share next year.”

“We’re on a journey, and it takes time, effort, and disciplined innovation to bring deeply technical products like this to market. These advancements underscore our commitment to democratizing AI and redefining the economic landscape of AI development, ensuring broader accessibility for all. And I don’t believe we’ll stop here; we’re only just beginning to scratch the surface of what we think is possible.”

Significant Milestones

What have been some of the company’s most significant milestones? Dawani cited:

“1.) Seed Funding: We recently secured $9 million in seed funding. This investment demonstrates strong investor support and underscores the significant interest and confidence in our vision for democratizing AI and redefining the economics of AI development.

2.) Performance Gain: One of our most notable achievements is the substantial performance gain delivered by our accelerated computing platform: up to 20 times greater throughput for AI workloads than legacy GPUs.

3.) Cost Reduction: We’ve achieved these remarkable performance gains while reducing cost to just one-tenth that of traditional hardware. This cost-effectiveness is a game-changer, making AI development accessible to a broader range of companies and individuals.

4.) Environmental Sustainability: Our approach significantly reduces power consumption, contributing to a more eco-friendly approach to AI development. In a world increasingly concerned about environmental impact, this achievement is particularly noteworthy.

5.) Team Expertise: We’re proud of our team’s expertise, which includes professionals from industry giants like Google, Microsoft, NVIDIA, AMD, and Intel. Their experience and commitment are integral to the success and innovation at Lemurian Labs.

6.) Mission Impact: Our mission is to bring down the cost and energy requirements for AI through better software and hardware so that every person and company can fully leverage and benefit from the extraordinary potential of AI.”

Funding

When asked about the company’s funding, Dawani revealed:

“As mentioned, Lemurian Labs recently secured $9 million in seed funding, led by Oval Park Capital, with significant participation from investors Good Growth Capital, Raptor Group, and Alumni Ventures, among others. This funding underlines the potential of our technology and its transformative impact on the AI industry. It plays a crucial role in advancing our mission to democratize AI. We’re grateful for our investors’ confidence in our vision and our team’s expertise, which includes professionals from industry giants. This funding will be instrumental in driving our mission forward and expanding AI’s benefits to a broader audience.”

“Our focus is on developing our core technology right now, and this round of funding will be pivotal in helping us achieve our goals. We’re excited about the journey ahead and the impact we can make on the AI landscape. While we can’t provide specific revenue metrics at this time, our progress in securing significant seed funding underscores our potential for growth and impact in the industry.”

Total Addressable Market

What total addressable market (TAM) size is the company pursuing? Dawani assessed:

“The TAM for AI is vast and continues to expand rapidly, driven by the increasing adoption of AI across various industries. In 2022, the global generative AI market was valued at $40 billion, and it is projected to grow to $1.3 trillion by 2032.”

“We aim to democratize AI and make it accessible to a wider and more diverse audience. By providing a cost-effective, energy-efficient, and high-performance accelerated computing platform, we aim to capture a meaningful share of this expanding market.”

“While I can’t cite a specific figure we’re pursuing, our goal is to address the current limitations and bottlenecks in AI technology, making it more accessible and affordable to a broader range of companies and individuals. This encompasses a substantial portion of the TAM as we work toward redefining the economics of AI development and reshaping the industry.”

Differentiation From The Competition

What differentiates the company from its competition? Dawani affirmed:

“What truly differentiates Lemurian Labs from our competitors is our commitment to versatility and accessibility. We’re on a mission to break away from the exclusivity that has long characterized AI technology. Our platform is designed to reach a broader, more diverse user base. This commitment to inclusivity is a key aspect that sets us apart from others in the field.”

“Through the combination of our work on compilers, number formats, and computer architecture, we can deliver exascale machines at a price point and power envelope that is nothing short of a step change. You can think of it as being able to shrink down and fit eight data centers into the one you already have, so you save on building seven new ones. And on top of that, we’re making it easier to program these massive machines, because we want to improve the lives of developers.”

Future Company Goals

What are some of the company’s future goals? Dawani concluded:

“Our goal is to create an open-source ecosystem that makes our solution easily accessible and streamlines integration for developers and engineers. This will begin with the release of our software stack, set to debut next year, making our platform more developer-friendly.”

“Simultaneously, we remain committed to refining and evolving our technology in sync with the dynamic AI industry. Our ongoing research and development efforts are focused on optimizing our platform, enhancing performance, and broadening its application to diverse use cases. We are determined to fortify our presence in the AI market, expanding our reach to a wider spectrum of businesses and organizations through strategic collaborations, partnerships, and outreach initiatives.”

“We’re actively exploring eco-friendly methods to reduce power consumption further and minimize our environmental impact, aligning with sustainability objectives. In line with our core mission of democratizing AI, we will persist in advocating for greater inclusivity and affordability in AI development, working towards a more diverse and accessible AI landscape where advanced technology is within reach for a broader audience.”