Tensor Logic: The Language of AI
October 14, 2025
Author: Pedro Domingos
cs.AI
Abstract
Progress in AI is hindered by the lack of a programming language with all the
requisite features. Libraries like PyTorch and TensorFlow provide automatic
differentiation and efficient GPU implementation, but are additions to Python,
which was never intended for AI. Their lack of support for automated reasoning
and knowledge acquisition has led to a long and costly series of hacky attempts
to tack them on. On the other hand, AI languages like LISP and Prolog lack
scalability and support for learning. This paper proposes tensor logic, a
language that solves these problems by unifying neural and symbolic AI at a
fundamental level. The sole construct in tensor logic is the tensor equation,
based on the observation that logical rules and Einstein summation are
essentially the same operation, and all else can be reduced to them. I show how
to elegantly implement key forms of neural, symbolic and statistical AI in
tensor logic, including transformers, formal reasoning, kernel machines and
graphical models. Most importantly, tensor logic makes new directions possible,
such as sound reasoning in embedding space. This combines the scalability and
learnability of neural networks with the reliability and transparency of
symbolic reasoning, and is potentially a basis for the wider adoption of AI.
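To make the abstract's central claim concrete, the following is a minimal sketch (mine, not the paper's code) of the observation that a logical rule and an Einstein summation are essentially the same operation: the Datalog rule Grandparent(x, z) :- Parent(x, y), Parent(y, z) becomes an einsum over the shared variable y applied to a Boolean 0/1 tensor, followed by a threshold. The toy Parent relation used here is hypothetical.

```python
# Sketch: a logical rule expressed as an Einstein summation over Boolean tensors.
# The Datalog rule
#     Grandparent(x, z) :- Parent(x, y), Parent(y, z)
# is a sum over the shared variable y, thresholded back to {0, 1}.
import numpy as np

N = 4  # number of entities in a hypothetical toy domain
parent = np.zeros((N, N), dtype=int)
parent[0, 1] = 1  # 0 is a parent of 1
parent[1, 2] = 1  # 1 is a parent of 2
parent[2, 3] = 1  # 2 is a parent of 3

# Einstein summation over y: counts[x, z] = sum_y Parent[x, y] * Parent[y, z]
counts = np.einsum('xy,yz->xz', parent, parent)

# Thresholding the counts recovers the existential OR over y, i.e. the rule's head.
grandparent = (counts > 0).astype(int)

print(grandparent)
# [[0 0 1 0]
#  [0 0 0 1]
#  [0 0 0 0]
#  [0 0 0 0]]
```

Keeping the raw counts instead of thresholding gives the number of derivations of each fact, which is one plausible way the same construct extends to the statistical and neural settings the abstract mentions; the details of how tensor logic does this are in the paper itself.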