
The Serial Scaling Hypothesis

July 16, 2025
Authors: Yuxi Liu, Konpat Preechakul, Kananart Kuwaranancharoen, Yutong Bai
cs.AI

Abstract

While machine learning has advanced through massive parallelization, we identify a critical blind spot: some problems are fundamentally sequential. These "inherently serial" problems, ranging from mathematical reasoning to physical simulations to sequential decision-making, require dependent computational steps that cannot be parallelized. Drawing from complexity theory, we formalize this distinction and demonstrate that current parallel-centric architectures face fundamental limitations on such tasks. We argue that recognizing the serial nature of computation holds profound implications for machine learning, model design, and hardware development. As AI tackles increasingly complex reasoning, deliberately scaling serial computation, not just parallel computation, is essential for continued progress.
