Transfer of Structural Knowledge from Synthetic Languages

This work explores transfer learning from several synthetic languages to English. We investigate the structure of the embeddings in the fine-tuned models, the information they contain, and the capabilities of the fine-tuned models on simple linguistic tasks. We also introduce a new synthetic language that leads to better transfer to English than the languages used in previous research. Finally, we introduce the Tiny-Cloze Benchmark, a new synthetic benchmark for natural language understanding that is more informative for less powerful models. We use the Tiny-Cloze Benchmark to evaluate fine-tuned models in several domains, demonstrating that fine-tuning on the new synthetic language yields better performance on a variety of tasks.
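The benchmark items themselves are described in the full paper; as a rough illustration of how a cloze-style evaluation for small models typically works, the sketch below scores each candidate completion by its log-likelihood under a causal language model and picks the highest-scoring one. The model name, prompt, and candidate set are illustrative placeholders, not the paper's actual benchmark data.

```python
# Minimal sketch of a cloze-style evaluation for small language models.
# Placeholder model and data; the paper evaluates its own fine-tuned models
# on the Tiny-Cloze Benchmark.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder small model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

def completion_logprob(prompt: str, completion: str) -> float:
    """Sum of token log-probabilities of `completion` given `prompt`."""
    prompt_ids = tokenizer(prompt, return_tensors="pt").input_ids
    full_ids = tokenizer(prompt + completion, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(full_ids).logits
    log_probs = torch.log_softmax(logits, dim=-1)
    # Score only the completion tokens; logits at position i predict token i+1.
    start = prompt_ids.shape[1]
    target = full_ids[0, start:]
    pred = log_probs[0, start - 1 : full_ids.shape[1] - 1]
    return pred.gather(1, target.unsqueeze(1)).sum().item()

# Illustrative cloze item: the model should prefer the plausible filler.
prompt = "The cat chased the"
candidates = [" mouse", " cloud", " theorem"]
scores = {c: completion_logprob(prompt, c) for c in candidates}
print(max(scores, key=scores.get), scores)
```

Because each candidate is scored rather than generated, this style of evaluation remains informative even for weak models whose free-form generations would be hard to grade.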

Metadata
Authors: Mikhail Budnikov, Ivan Yamshchikov
URN: urn:nbn:de:bvb:863-opus-62797
DOI: https://doi.org/10.18653/v1/2025.xllm-1.20
Publisher: Association for Computational Linguistics
Document Type: Conference Proceeding
Language: English
Year of Publication: 2025
Release Date: 2025/09/23
Tags: LLM pretraining; synthetic languages
Published in: Proceedings of the 1st Joint Workshop on Large Language Models and Structure Modeling (XLLM 2025)
First Page: 242
Last Page: 251
Institutes and Faculty: Institute / Center for Artificial Intelligence (CAIRO)