RexBERT: Context Specialized Bidirectional Encoders for E-commerce

February 4, 2026
Authors: Rahul Bajaj, Anuj Garg
cs.AI

Abstract

Encoder-only transformers remain indispensable in retrieval, classification, and ranking systems where latency, stability, and cost are paramount. Most general-purpose encoders, however, are trained on generic corpora with limited coverage of specialized domains. We introduce RexBERT, a family of BERT-style encoders designed specifically for e-commerce semantics. We make three contributions. First, we release Ecom-niverse, a 350-billion-token corpus curated from diverse retail and shopping sources. We describe a modular pipeline that isolates and extracts e-commerce content from FineFineWeb and other open web resources, and characterize the resulting domain distribution. Second, we present a reproducible pretraining recipe building on ModernBERT's architectural advances. The recipe consists of three phases: general pretraining, context extension, and annealed domain specialization. Third, we train RexBERT models ranging from 17M to 400M parameters and evaluate them on token classification, semantic similarity, and general natural language understanding tasks using e-commerce datasets. Despite having 2-3x fewer parameters, RexBERT outperforms larger general-purpose encoders and matches or surpasses modern long-context models on domain-specific benchmarks. Our results demonstrate that high-quality in-domain data combined with a principled training approach provides a stronger foundation for e-commerce applications than indiscriminate scaling alone.
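
To make the semantic-similarity use case concrete, below is a minimal sketch of how an encoder of this kind could be applied to e-commerce query-product matching with the Hugging Face transformers library. The checkpoint name is a placeholder assumption (the paper does not specify released model identifiers here), and mean pooling over non-padding tokens is one common embedding choice, not necessarily the evaluation protocol used by the authors.

```python
# Sketch: query-product semantic similarity with a RexBERT-style encoder.
# MODEL_ID is hypothetical; substitute a real checkpoint once released.
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_ID = "your-org/rexbert-base"  # placeholder, not a confirmed identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

def embed(texts):
    """Mean-pool last hidden states over non-padding tokens, then L2-normalize."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state          # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()   # (B, T, 1)
    summed = (hidden * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)
    return torch.nn.functional.normalize(summed / counts, dim=-1)

queries = ["wireless noise cancelling headphones"]
products = [
    "Bluetooth over-ear headphones with active noise cancellation",
    "stainless steel kitchen knife set",
]

# Dot product of unit vectors = cosine similarity; higher means closer match.
scores = embed(queries) @ embed(products).T
print(scores)
```

Because the vectors are normalized, the dot product directly gives cosine similarity, which is the usual scoring function for retrieval-style evaluations of this kind.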