
d-Matrix

About

d-Matrix is a semiconductor and computing platform company founded in 2019 and focused exclusively on generative AI inference. Rather than adapting existing GPU architectures, the company built its platform from scratch around a core technical bet: that the principal constraint in running large AI models at scale is not raw compute power, but the movement of data between memory and processors. Its answer is digital in-memory compute, a design approach that brings computation closer to where data is stored.

The company's flagship product, Corsair, is a computing platform engineered specifically for inference workloads (the process of running a trained AI model to generate outputs) rather than for training. Corsair is designed to deliver low latency and high throughput while keeping energy consumption and costs at levels that make large-scale generative AI commercially viable for a broader range of organisations, not only the largest technology companies.

d-Matrix has grown from a small founding team to more than 200 people. Its engineering work spans silicon design, software, and systems engineering, with a stated emphasis on first-principles thinking applied to hard problems at the intersection of all three disciplines. The company operates in the AI infrastructure space, targeting cloud and enterprise inference at scale.

Similar companies


Fractile

Fractile is a UK-based semiconductor company building AI acceleration hardware to radically improve frontier model inference performance.


Cerebras Systems

Cerebras Systems designs wafer-scale AI chips and supercomputers for high-speed machine learning training and inference, serving enterprises, national labs, and healthcare systems.


SambaNova Systems

SambaNova Systems builds full-stack AI infrastructure with custom chips and software, enabling enterprises and governments to deploy generative AI on-premises with full data control.


Fireworks AI

Fireworks AI is a generative AI inference platform optimised for high-speed serving of open-source LLMs and image models, enabling developers to build, fine-tune, and scale production-ready AI applications.


CoreWeave

CoreWeave is an AI-native cloud platform providing specialized GPU infrastructure for training and deploying AI workloads at scale.


VAST Data

VAST Data builds a unified AI Operating System integrating storage, database, and compute to support large-scale AI and deep learning workloads across edge and cloud environments.