About
I am a PhD student at the University of Sheffield, UK. My supervisors are Prof. Nikos Aletras and Prof. Aline Villavicencio. Previously, I was a full-time NLP researcher at Hitachi, Ltd. in Japan. Before that, I was a master's student at the University of Sheffield, where I obtained an MSc in Computer Science with Speech and Language Processing (with Distinction).
I have over five years of academic and industrial experience in NLP, with a track record of publications at top-tier NLP conferences, including EMNLP, ACL, and EACL.
My curriculum vitae (CV) is available here (updated on September 25th, 2024).
Research Interests
Cross-lingual Transfer, Language Modelling, Natural Language Understanding, Natural Language Processing, Machine Learning
Recent Work
- Cross-lingual transfer for efficient generative large language model inference
The aim of this research is to explore inference-efficient methods for generative large language models under cross-lingual settings.
- Atsuki Yamaguchi, Aline Villavicencio and Nikolaos Aletras, “How Can We Effectively Expand the Vocabulary of LLMs with 0.01GB of Target Language Text?,” arXiv preprint: 2406.11477, June 2024. (Last updated in September 2024)
- Atsuki Yamaguchi, Aline Villavicencio and Nikolaos Aletras, “An Empirical Study on Cross-lingual Vocabulary Adaptation for Efficient Language Model Inference,” Findings of the Association for Computational Linguistics: EMNLP 2024, November 2024.
- Exploring simple pretraining alternatives for Transformer-based language representation models
The aim of this research is to investigate simple yet effective pretraining objectives.
- Atsuki Yamaguchi, Hiroaki Ozaki, Terufumi Morishita, Gaku Morio and Yasuhiro Sogawa, “How does the task complexity of masked pretraining objectives affect downstream performance?,” Findings of the Association for Computational Linguistics: ACL 2023, July 2023. (Short paper)
- Atsuki Yamaguchi, George Chrysostomou, Katerina Margatina and Nikolaos Aletras, “Frustratingly Simple Pretraining Alternatives to Masked Language Modeling,” The 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), Online, November 2021. (Short paper)
Contact
You can send me messages via this contact form.