Investigating the representation of backchannels and fillers in fine-tuned language models
Y. Wang, L. Lao, L. Huang, G. Skantze, Y. Xu, H. Buschmeier, n.d.
Conference Paper
| Accepted
| English
Author
Wang, Yu;
Lao, Leyi;
Huang, Langchu;
Skantze, Gabriel;
Xu, Yang;
Buschmeier, Hendrik
Abstract
Backchannels and fillers are important linguistic expressions in dialogue, but they are often treated as ‘noise’ to be bypassed in modern transformer-based language models. Our work studies their representation in language models using three fine-tuning strategies. The models are trained on three dialogue corpora in English and Japanese, in which backchannels and fillers are preserved and annotated, to investigate how fine-tuning can help LMs learn their representations. We first apply clustering analysis to the learnt representations of backchannels and fillers and find increased silhouette scores in representations from fine-tuned models, which suggests that fine-tuning enables LMs to distinguish the nuanced semantic variation across different backchannel and filler uses. We also use natural language generation (NLG) metrics and qualitative analysis to confirm that utterances generated by fine-tuned language models resemble human-produced utterances more closely. Our findings suggest the potential of transforming general LMs into conversational LMs that are more capable of producing human-like language adequately.
Conference
64th Annual Meeting of the Association for Computational Linguistics
Conference Location
San Diego, CA, USA
Conference Date
2026-07-02 – 2026-07-07
Cite this
Wang Y, Lao L, Huang L, Skantze G, Xu Y, Buschmeier H. Investigating the representation of backchannels and fillers in fine-tuned language models.
Wang, Y., Lao, L., Huang, L., Skantze, G., Xu, Y., & Buschmeier, H. (n.d.). Investigating the representation of backchannels and fillers in fine-tuned language models. 64th Annual Meeting of the Association for Computational Linguistics, San Diego, CA, USA.
@inproceedings{Wang_Lao_Huang_Skantze_Xu_Buschmeier, title={Investigating the representation of backchannels and fillers in fine-tuned language models}, author={Wang, Yu and Lao, Leyi and Huang, Langchu and Skantze, Gabriel and Xu, Yang and Buschmeier, Hendrik} }
Wang, Yu, Leyi Lao, Langchu Huang, Gabriel Skantze, Yang Xu, and Hendrik Buschmeier. “Investigating the Representation of Backchannels and Fillers in Fine-Tuned Language Models,” n.d.
Y. Wang, L. Lao, L. Huang, G. Skantze, Y. Xu, and H. Buschmeier, “Investigating the representation of backchannels and fillers in fine-tuned language models,” presented at the 64th Annual Meeting of the Association for Computational Linguistics, San Diego, CA, USA.
Wang, Yu, et al. Investigating the Representation of Backchannels and Fillers in Fine-Tuned Language Models.
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]
Link(s) to Main File(s)
Access Level
Closed Access
