

Janus: Disaggregating Attention and Experts for Scalable MoE Inference

December 15, 2025
Authors: Zhexiang Zhang, Ye Wang, Xiangyu Wang, Yumiao Zhao, Jingzhe Jiang, Qizhen Weng, Shaohuai Shi, Yin Chen, Minchen Yu
cs.AI

Abstract

Large Mixture-of-Experts (MoE) model inference is challenging due to high resource demands and dynamic workloads. Existing solutions often deploy the entire model as a single monolithic unit, which applies a unified resource configuration to both attention and expert modules despite their different requirements, leading to limited scalability and resource inefficiency. In this paper, we propose Janus, a scalable MoE inference system that disaggregates attention and experts on separate GPU sub-clusters, enabling each module to be managed and scaled independently. Janus incorporates three key designs for efficient, disaggregated MoE inference. First, it proposes an adaptive two-phase communication scheme that exploits intra- and inter-node bandwidth hierarchies for low-latency data exchange. Second, motivated by the memory-bound nature of MoE modules, Janus introduces a lightweight scheduler and implements it as a GPU kernel to balance the number of activated experts across GPUs at minimal overhead, thereby reducing inference latency. Third, Janus performs fine-grained resource management to dynamically adjust expert placement and independently scale attention and MoE resources to improve overall efficiency. Evaluation shows Janus achieves up to 3.9× higher per-GPU throughput than state-of-the-art systems while meeting per-token latency requirements.
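To make the second design point more concrete, the sketch below illustrates the general idea of balancing activated experts across GPUs. It is not the paper's GPU-kernel scheduler; it is a plain-Python approximation under assumed names (`balance_activated_experts`, `token_expert_ids`) that greedily spreads per-batch expert load across GPU ranks.

```python
# Illustrative sketch only (assumption): a greedy load balancer that assigns the
# experts activated by one batch of tokens to GPUs so each GPU carries a similar
# amount of expert work. The real Janus scheduler runs as a GPU kernel.
from collections import Counter


def balance_activated_experts(token_expert_ids, num_gpus):
    """Assign each activated expert to a GPU rank, balancing total token load.

    token_expert_ids: list of per-token expert lists chosen by the router.
    Returns a dict mapping expert_id -> gpu rank.
    """
    # Count how many tokens each expert must process in this batch.
    load = Counter(e for experts in token_expert_ids for e in experts)

    # Longest-processing-time placement: heaviest experts first,
    # each onto the currently least-loaded GPU.
    gpu_load = [0] * num_gpus
    placement = {}
    for expert, tokens in sorted(load.items(), key=lambda kv: -kv[1]):
        target = min(range(num_gpus), key=lambda g: gpu_load[g])
        placement[expert] = target
        gpu_load[target] += tokens
    return placement


if __name__ == "__main__":
    # Example: 6 tokens with top-2 routing over 8 experts, balanced across 4 GPUs.
    routed = [[0, 3], [0, 5], [1, 3], [2, 7], [0, 6], [3, 4]]
    print(balance_activated_experts(routed, num_gpus=4))
```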