

A Temporally Augmented Graph Attention Network for Affordance Classification

April 11, 2026
Authors: Ami Chopra, Supriya Bordoloi, Shyamanta M. Hazarika
cs.AI

Abstract

Graph attention networks (GATs) provide one of the strongest frameworks for learning node representations on relational data; however, existing variants such as the original GAT operate mainly on static graphs and rely on implicit temporal aggregation when applied to sequential data. In this paper, we introduce the Electroencephalography-temporal Graph Attention Network (EEG-tGAT), a temporally augmented formulation of GATv2 tailored to affordance classification from interaction sequences. The proposed model incorporates a temporal attention mechanism to modulate the contribution of different time segments and temporal dropout to regularize learning across temporally correlated observations. The design reflects the assumption that the temporal dimension of affordance data is not semantically uniform and that discriminative information may be unevenly distributed across time. Experimental results on affordance datasets show that EEG-tGAT achieves improved classification performance over GATv2. The observed gains suggest that explicitly encoding temporal importance and enforcing temporal robustness introduce inductive biases better aligned with the structure of affordance-driven interaction data. These findings indicate that modest architectural changes to graph attention models can yield consistent benefits when temporal relationships play a nontrivial role in the task.
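The two temporal components described in the abstract — attention over time segments and dropout of whole segments — can be sketched in a few lines of PyTorch. This is a minimal illustration under stated assumptions: the class name `TemporalAttentionPool`, the function `temporal_dropout`, and all tensor shapes and parameters are hypothetical and not taken from the paper's actual implementation.

```python
import torch
import torch.nn as nn


class TemporalAttentionPool(nn.Module):
    """Learned attention over time segments: scores each segment's
    embedding and returns an attention-weighted sum, so that more
    discriminative segments contribute more (illustrative sketch)."""

    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Linear(dim, 1)  # one scalar score per segment

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, T, dim) -- per-segment graph embeddings
        alpha = torch.softmax(self.score(h), dim=1)  # (batch, T, 1)
        return (alpha * h).sum(dim=1)                # (batch, dim)


def temporal_dropout(h: torch.Tensor, p: float = 0.2,
                     training: bool = True) -> torch.Tensor:
    """Zero out entire time segments with probability p (rescaling the
    survivors), discouraging reliance on any single segment."""
    if not training or p == 0.0:
        return h
    keep = (torch.rand(h.size(0), h.size(1), 1,
                       device=h.device) > p).float()
    return h * keep / (1.0 - p)
```

In a full model, `h` would come from applying a GATv2-style layer to each time segment's graph before pooling; segment-level dropout (rather than per-feature dropout) is what makes the regularization specifically temporal.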