AST-T5: Structure-Aware Pretraining for Code Generation and Understanding
January 5, 2024
Authors: Linyuan Gong, Mostafa Elhoushi, Alvin Cheung
cs.AI
Abstract
Large language models (LLMs) have made significant advancements in code-related tasks, yet many LLMs treat code as simple sequences, neglecting its structured nature. We introduce AST-T5, a novel pretraining paradigm that leverages the Abstract Syntax Tree (AST) for enhanced code generation, transpilation, and understanding. Using dynamic programming, our AST-Aware Segmentation retains code structure, while our AST-Aware Span Corruption objective equips the model to reconstruct various code structures. Unlike other models, AST-T5 avoids intricate program analyses or architectural changes, so it integrates seamlessly with any encoder-decoder Transformer. Evaluations show that AST-T5 consistently outperforms similar-sized LMs across various code-related tasks. Structure-awareness makes AST-T5 particularly powerful in code-to-code tasks, surpassing CodeT5 by 2 points in exact match score for the Bugs2Fix task and by 3 points in exact match score for Java-C# Transpilation in CodeXGLUE. Our code and model are publicly available at https://github.com/gonglinyuan/ast_t5.
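
The abstract names two algorithmic components without spelling them out, so the sketches below are illustrative reconstructions under stated assumptions, not the authors' implementation (their code is at the repository above). First, AST-Aware Segmentation: a minimal dynamic-programming sketch that chooses chunk boundaries to minimize how badly splits cut through AST subtrees. The `break_cost` array and the function name `ast_aware_segment` are hypothetical; one plausible cost is the number of AST subtrees spanning each boundary.

```python
def ast_aware_segment(break_cost, max_len):
    """Choose split points so every segment is at most max_len tokens long
    while minimizing the total penalty of the chosen splits.

    break_cost[i] is a hypothetical penalty for splitting between tokens
    i-1 and i (e.g., how many AST subtrees that boundary cuts through);
    it has n + 1 entries for n tokens, with break_cost[0] == break_cost[n] == 0.
    """
    n = len(break_cost) - 1
    best = [float("inf")] * (n + 1)  # best[i]: min cost to segment tokens[0:i]
    prev = [0] * (n + 1)             # prev[i]: start of the segment ending at i
    best[0] = 0
    for i in range(1, n + 1):
        for j in range(max(0, i - max_len), i):  # last segment is tokens[j:i]
            cost = best[j] + break_cost[i]
            if cost < best[i]:
                best[i], prev[i] = cost, j
    splits, i = [], n
    while i > 0:  # backtrack from the end to recover the chosen boundaries
        splits.append(i)
        i = prev[i]
    return sorted(splits)  # each entry is the exclusive end of one segment
```

Second, AST-Aware Span Corruption: T5-style denoising in which every masked span covers a whole AST subtree rather than a random token window. Here Python's built-in `ast` module stands in for a general multi-language parser, and the greedy subtree sampling is a simplification of the paper's selection procedure; `ast_aware_corrupt` and its parameters are likewise illustrative names.

```python
import ast
import random

def _line_starts(source):
    """Character offset at which each source line begins."""
    starts = [0]
    for line in source.splitlines(keepends=True):
        starts.append(starts[-1] + len(line))
    return starts

def _subtree_spans(source):
    """(start, end) character spans, one per located AST subtree."""
    starts = _line_starts(source)
    spans = []
    for node in ast.walk(ast.parse(source)):
        if getattr(node, "end_lineno", None) is None:
            continue  # e.g., the Module node carries no source location
        begin = starts[node.lineno - 1] + node.col_offset
        end = starts[node.end_lineno - 1] + node.end_col_offset
        if end > begin:
            spans.append((begin, end))
    return spans

def ast_aware_corrupt(source, mask_ratio=0.15, seed=0):
    """T5-style span corruption where each masked span is one AST subtree.

    Returns (corrupted_input, target) in the <extra_id_N> sentinel format.
    Greedily samples non-overlapping subtrees until about mask_ratio of the
    characters are masked -- a simplification, not the paper's exact recipe.
    """
    rng = random.Random(seed)
    spans = _subtree_spans(source)
    rng.shuffle(spans)

    chosen, budget = [], int(mask_ratio * len(source))
    for begin, end in spans:
        if end - begin > budget:
            continue  # subtree too large for the remaining mask budget
        if any(begin < e and b < end for b, e in chosen):
            continue  # overlaps a subtree we already chose
        chosen.append((begin, end))
        budget -= end - begin
        if budget <= 0:
            break

    chosen.sort()
    inputs, target, cursor = [], [], 0
    for i, (begin, end) in enumerate(chosen):
        sentinel = f"<extra_id_{i}>"
        inputs += [source[cursor:begin], sentinel]
        target += [sentinel, source[begin:end]]
        cursor = end
    inputs.append(source[cursor:])
    target.append(f"<extra_id_{len(chosen)}>")
    return "".join(inputs), " ".join(target)
```

On `"def add(a, b):\n    return a + b\n"` with `mask_ratio=0.4`, this might mask the return expression's subtree, giving an input like `def add(a, b):\n    return <extra_id_0>\n` and a target like `<extra_id_0> a + b <extra_id_1>`; because each masked span is a complete subtree, the decoder is trained to emit syntactically whole fragments rather than arbitrary token runs.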