Lemurian Labs has secured $28 million in an oversubscribed Series A round to accelerate the rollout of its software-first platform designed to make artificial intelligence run faster, more efficiently, and on any hardware. The funding includes capital previously raised through convertible securities and positions the Santa Clara-based company to expand engineering efforts, advance product development, and deepen collaboration with ecosystem partners focused on sustainable compute and open AI innovation.
The company is building a universal, hardware-agnostic compute fabric that allows developers to write AI code once and deploy it across cloud, edge, and on-premises environments without rewriting it or relying on proprietary stacks. Lemurian’s approach aims to eliminate vendor lock-in, increase enterprise flexibility, and reduce the cost and complexity of scaling AI workloads. By rebuilding the software layer to optimize performance across heterogeneous hardware, the company also seeks to address rising energy demands and deployment constraints in the global AI ecosystem.
The round was co-led by Pebblebed Ventures and Hexagon, with participation from Oval Park Capital, Origin Ventures, Blackhorn Ventures, Uncorrelated Ventures, Untapped Ventures, Planetary Ventures, 1Flourish Ventures, Animal Capital, Stepchange VC, and Silicon Catalyst Ventures. The founders and leadership team bring experience from organizations such as NVIDIA, Qualcomm, Sun Microsystems, IBM, and Intel.
Investors point to the rising energy footprint of AI as a driving force behind Lemurian’s approach. Some projections suggest AI workloads could consume 20% of global electricity by the early 2030s, a trend accelerated by closed, vertically integrated software stacks that limit flexibility and efficiency. Lemurian’s model introduces an open and software-centric alternative intended to deliver faster deployment and more responsible compute at scale.
The new funding will support Lemurian Labs in advancing its technology platform and broadening market reach as organizations seek more efficient ways to scale AI infrastructure.
KEY QUOTES:
“Scaling AI is the next frontier, but that’s not possible on platforms designed for yesterday’s workloads. For decades, faster chips delivered ‘free gains,’ but now the real bottleneck is software. Lemurian is rebuilding the software stack from the ground up to eliminate vendor lock-in, control costs, and give developers the flexibility to run AI anywhere, on their terms.”
Jay Dawani, Co-Founder and CEO, Lemurian Labs
“Lemurian is reframing the grim choice that AI’s hardware-software interface has forced on users: choosing between vendor-locked vertical stacks and brittle, rewrite-prone portability. Jay and his team bring technical virtuosity that lets you run your AI code as written, on whatever hardware makes sense. Full stop, no compromises.”
Keith Adams, Founding Partner, Pebblebed Ventures
“Everyone in AI wants to see healthy competition in the GPU market to accelerate innovation. But in order for that to happen, someone has to develop CUDA-like software for a wide range of GPUs and other processors, which is difficult; that’s why I was excited to invest in Lemurian Labs.”
Salil Deshpande, General Partner, Uncorrelated Ventures