Subquadratic: $29 Million Seed Raised For Long-Context AI Architecture

By Amit Chowdhry

Subquadratic, an AI research company developing long-context large language model architectures, announced the launch of SubQ 1M-Preview, which it describes as the first large language model built on a fully subquadratic architecture. The company also disclosed $29 million in seed funding from investors including Javier Villamizar, Justin Mateen, Grant Gittlin, and Jaclyn Rice Nelson, as well as individuals who were early investors in Anthropic, OpenAI, Stripe, and Brex.

According to the company, traditional transformer-based AI architectures require compute that scales quadratically as context windows grow, creating cost and performance limitations for long-context AI applications. Subquadratic said its architecture instead scales linearly with context length, enabling larger context windows, lower inference costs, and improved efficiency.
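To make that scaling contrast concrete, here is a minimal sketch in Python. The cost functions are idealized stand-ins for illustration, not Subquadratic's published kernels; real attention compute also depends on model width, head count, and hardware.

```python
# Idealized cost models for illustration only (not Subquadratic's figures).

def quadratic_cost(n_tokens: int) -> float:
    """Standard self-attention: every token interacts with every other,
    so compute grows with the square of the context length."""
    return float(n_tokens) ** 2

def linear_cost(n_tokens: int) -> float:
    """A linearly scaling mechanism: compute grows in direct proportion
    to context length (unit per-token constant assumed here)."""
    return float(n_tokens)

# Doubling the context quadruples quadratic compute but only doubles linear.
for n in (100_000, 200_000, 400_000):
    print(f"{n:>9,} tokens -> quadratic: {quadratic_cost(n):.1e}, "
          f"linear: {linear_cost(n):.1e}")
```

At each doubling of context, the quadratic column grows four times while the linear column only doubles; that widening gap is the limitation the company says its architecture removes.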

The company said its research model supports up to 12 million tokens of context while reducing attention compute requirements by nearly 1,000 times compared to other frontier AI models.
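As rough arithmetic on that figure (our illustration with assumed constants, not the company's disclosed methodology): if all-pairs attention over n tokens costs about n² operations while a linear mechanism costs about n × k for some fixed per-token state size k, the savings ratio is n / k. The k below is a hypothetical value chosen to show how a roughly 1,000× ratio could arise.

```python
# Back-of-envelope arithmetic on the "nearly 1,000 times" claim.
# n is the context length from the company's statement; k is a hypothetical
# fixed per-token state size (Subquadratic has not published its constants).

n = 12_000_000   # tokens of context, per the company's claim
k = 12_000       # assumed per-token state size for the linear mechanism

quadratic_ops = n * n    # all-pairs attention
linear_ops = n * k       # linear-scaling alternative

print(f"savings ratio: {quadratic_ops / linear_ops:,.0f}x")  # prints 1,000x
```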

Subquadratic also announced three products entering private beta: a full-context API for developers and enterprises, SubQ Code for repository-scale coding workflows, and SubQ Search for long-context AI-powered research and search applications.

According to benchmark results cited by the company, SubQ 1M-Preview achieved 95% accuracy on the RULER 128K benchmark and performed competitively against Anthropic’s Claude Opus, OpenAI’s GPT 5.5, and Google’s Gemini models on long-context reasoning evaluations.

Subquadratic was founded by Justin Dangel and Alex Whedon. Dangel is a five-time founder and CEO with experience across health technology, insurance technology, and consumer products. Whedon previously worked as a software engineer at Meta and later served as head of generative AI at TribeAI, where he led enterprise AI implementations.

The company said its broader research team includes PhD researchers and engineers from organizations including Meta, Google, Oxford, Cambridge, ByteDance, Adobe, and Microsoft.

According to the company, the architecture is designed to support AI systems capable of processing entire codebases, large document collections, spreadsheets, databases, and persistent long-running interaction histories within a single context window.

KEY QUOTES:

“Transformers defined the last decade of AI, yet one fundamental limitation has shaped everything built on top of them: compute requirements scale quadratically with context length.”

“SubQ 1M-Preview is the first LLM built on a fully subquadratic architecture, one where compute grows linearly with context length.”

“The most valuable applications of AI remain unbuilt because the existing architecture can’t support them. SubQ changes that and we’re at the beginning of understanding what becomes possible when the architecture stops getting in the way.”

Justin Dangel, Co-Founder & CEO, Subquadratic