What happened
Anthropic, the developer of Claude AI, is in early discussions with UK startup Fractile about its Memory Compute Fusion Architecture. Fractile's SRAM-based design claims to run AI inference up to 100 times faster and at a tenth of the cost by processing data inside the memory chip itself, reducing reliance on slow off-chip memory transfers. Anthropic currently sources chips from NVIDIA, Google, and Amazon; Fractile has not yet designed test chips.
Why it matters
This potential move signals a strategic shift towards specialised silicon for AI inference, where unit economics increasingly decide competitiveness. For infrastructure architects and procurement teams, Fractile's claimed 100x speed-up and 10x cost reduction would, if borne out, drastically alter compute strategy and reduce dependence on general-purpose GPUs. It also fits a broader industry trend of major AI labs pursuing specialised hardware to manage escalating compute demands and control soaring inference costs.