

PhiloBERTA: A Transformer-Based Cross-Lingual Analysis of Greek and Latin Lexicons

March 7, 2025
Authors: Rumi A. Allbert, Makai L. Allbert
cs.AI

Abstract

We present PhiloBERTA, a cross-lingual transformer model that measures semantic relationships between ancient Greek and Latin lexicons. Through analysis of selected term pairs from classical texts, we use contextual embeddings and angular similarity metrics to identify precise semantic alignments. Our results show that etymologically related pairs demonstrate significantly higher similarity scores, particularly for abstract philosophical concepts such as epistēmē (scientia) and dikaiosynē (iustitia). Statistical analysis reveals consistent patterns in these relationships (p = 0.012), with etymologically related pairs showing remarkably stable semantic preservation compared to control pairs. These findings establish a quantitative framework for examining how philosophical concepts moved between Greek and Latin traditions, offering new methods for classical philological research.
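The angular similarity metric mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names are ours, and the authors would apply such a metric to contextual embedding vectors produced by the transformer. Angular similarity rescales the angle between two vectors into [0, 1], which behaves more linearly than raw cosine similarity for near-parallel embeddings.

```python
import math

def cosine_similarity(u, v):
    # Standard cosine similarity: dot product over the product of norms.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def angular_similarity(u, v):
    # Convert the angle between u and v into a [0, 1] score:
    # identical directions -> 1.0, orthogonal -> 0.5, opposite -> 0.0.
    cos = max(-1.0, min(1.0, cosine_similarity(u, v)))  # clamp for float safety
    return 1.0 - math.acos(cos) / math.pi
```

In the paper's setting, `u` and `v` would be contextual embeddings of a Greek term and its Latin counterpart (e.g., epistēmē and scientia), with higher scores indicating closer semantic alignment.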

