Project Details

MeTRapher: Learning to Translate Metaphors

Subject Area: Applied Linguistics, Computational Linguistics
Term: since 2023
Project identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 530853546
 
The interpretation and translation of metaphors is a challenging task which has received much attention in the cognitive science, linguistics, and translation communities. The natural language processing (NLP) community provides tools for monolingual metaphor detection and interpretation as well as excellent cross-lingual representations and machine translation systems, but little work addresses metaphors in machine translation. Metaphors are not only very flexible in their structures and meanings; they also strongly depend on the languages and cultures involved. Our long-term vision is to create a modular machine translation system across language pairs that induces and successfully integrates knowledge and conditions for metaphor translation.

Starting out with a structural combination that is particularly prone to metaphorical language use (verb–object pairs) and three languages (English, German, Czech), we will first create an extensively annotated cross-lingual database for the language pairs, in order to specify monolingual metaphor conditions and cross-lingual translation modes and to create seed training data across domains and registers. We will make a lasting contribution to NLP research on metaphors by making our tools and data as well as our annotation guidelines available.

A second part will address two important areas of effort in parallel: the local "micro-context", i.e., contextual features such as syntax-based dependencies and selectional preferences, and the global "macro-context", i.e., discourse conditions such as lexical and topical coherence features in metaphor contexts. For the micro-context, we will adapt the standard pipeline for tagging tasks by fine-tuning pretrained language models in a multi-task setup, before adding another layer to the multi-task learning framework that includes linearised dependency-parse tags, to strengthen the role of local syntactic knowledge in the model, as well as further linguistic knowledge such as selectional preference violations. For the macro-context, we will adapt and combine state-of-the-art sentence-level metaphor detection systems, while integrating degrees of discourse coherence regarding lexical-syntactic and lexical-semantic discourse structures, topic coherence, abstractness and affect, in interaction with degrees of metaphor conventionalisation. System combinations will be further enhanced by integrating contextual embedding representations adjusted to larger-scale discourses.

Finally, we will address the problem of combining micro-contextual and macro-contextual modelling of metaphors and integrating them within state-of-the-art neural machine translation systems. This will provide a first comprehensive metaphor translation system, framed as a well-defined end-to-end task, together with a novel evaluation protocol to evaluate our efforts on the processing of metaphors and to gain further insight into translation modes and fine-grained cross-lingual conditions.
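To make the micro-context plan more concrete, the sketch below shows one way such a multi-task tagging setup could look: a shared pretrained encoder with a token-classification head for metaphor tags and an auxiliary head for linearised dependency-parse tags, trained with a weighted joint loss. This is a minimal illustration under assumed choices (the xlm-roberta-base encoder, binary metaphor tags, tag inventory sizes, and the loss weighting are all placeholders), not the project's actual implementation.

```python
# Minimal sketch of multi-task token tagging for the micro-context idea:
# a shared pretrained encoder with two heads, one predicting metaphor tags
# and one predicting linearised dependency-parse tags as an auxiliary task.
# Encoder name, tag inventories, and loss weight are illustrative assumptions.
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class MultiTaskMetaphorTagger(nn.Module):
    def __init__(self, encoder_name="xlm-roberta-base",
                 n_metaphor_tags=2, n_dep_tags=40, dep_loss_weight=0.5):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        self.metaphor_head = nn.Linear(hidden, n_metaphor_tags)  # literal vs. metaphorical
        self.dep_head = nn.Linear(hidden, n_dep_tags)            # auxiliary: dependency labels
        self.dep_loss_weight = dep_loss_weight
        self.loss_fn = nn.CrossEntropyLoss(ignore_index=-100)    # -100 masks special tokens

    def forward(self, input_ids, attention_mask,
                metaphor_labels=None, dep_labels=None):
        states = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        metaphor_logits = self.metaphor_head(states)
        dep_logits = self.dep_head(states)
        loss = None
        if metaphor_labels is not None and dep_labels is not None:
            # Joint objective: main metaphor-tagging loss plus a down-weighted
            # auxiliary loss for the linearised dependency tags.
            loss = (self.loss_fn(metaphor_logits.flatten(0, 1), metaphor_labels.flatten())
                    + self.dep_loss_weight
                    * self.loss_fn(dep_logits.flatten(0, 1), dep_labels.flatten()))
        return {"loss": loss, "metaphor_logits": metaphor_logits, "dep_logits": dep_logits}


# Example forward pass with a cross-lingual encoder (hypothetical usage).
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = MultiTaskMetaphorTagger()
batch = tokenizer(["She devoured the novel in one sitting."],
                  return_tensors="pt", padding=True)
out = model(batch["input_ids"], batch["attention_mask"])
print(out["metaphor_logits"].shape)  # (batch, sequence_length, n_metaphor_tags)
```

In this kind of setup, the auxiliary dependency head can later be extended with further linguistic signals (e.g., selectional preference violations) as additional tasks sharing the same encoder.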
DFG Programme: Research Grants
 
 
