Cross/Multi-Lingual and Code-Switched Speech Recognition

Mon-3-1-7 Improving Code-switching Language Modeling with Artificially Generated Texts using Cycle-consistent Adversarial Networks

Chia-Yu Li (Institute of Natural Language Processing, University of Stuttgart, Germany) and Ngoc Thang Vu (University of Stuttgart)
Abstract: This paper presents our latest effort on improving Code-switching language models that suffer from data scarcity. We investigate methods to augment Code-switching training text data by artificially generating it. Concretely, we propose a framework based on cycle-consistent adversarial networks to transfer monolingual text into Code-switching text, treating Code-switching as a speaking style. Our experimental results on the SEAME corpus reveal that utilizing artificially generated Code-switching text data consistently improves both language modeling and automatic speech recognition performance.
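The core idea named in the abstract is a cycle-consistency objective between a monolingual domain and a Code-switching domain. The sketch below is not the authors' implementation; it only illustrates, under hypothetical assumptions (toy dense sentence embeddings of size EMB_DIM, tiny MLP generators), how the round-trip cycle loss of a CycleGAN-style setup is formed. A full system would additionally train adversarial discriminators for each domain.

```python
# Minimal sketch (not the paper's code) of the cycle-consistency loss behind
# CycleGAN-style text transfer between a monolingual domain and a
# Code-switching domain. Sentence representations here are toy dense vectors.
import torch
import torch.nn as nn

EMB_DIM = 64  # hypothetical sentence-embedding size


def make_generator() -> nn.Module:
    """A tiny MLP standing in for a domain-transfer generator."""
    return nn.Sequential(
        nn.Linear(EMB_DIM, 128),
        nn.ReLU(),
        nn.Linear(128, EMB_DIM),
    )


G_mono_to_cs = make_generator()   # monolingual -> code-switching
F_cs_to_mono = make_generator()   # code-switching -> monolingual

cycle_criterion = nn.L1Loss()


def cycle_consistency_loss(mono_batch: torch.Tensor,
                           cs_batch: torch.Tensor) -> torch.Tensor:
    """L_cyc = ||F(G(x)) - x||_1 + ||G(F(y)) - y||_1 over both domains."""
    mono_roundtrip = F_cs_to_mono(G_mono_to_cs(mono_batch))
    cs_roundtrip = G_mono_to_cs(F_cs_to_mono(cs_batch))
    return (cycle_criterion(mono_roundtrip, mono_batch)
            + cycle_criterion(cs_roundtrip, cs_batch))


if __name__ == "__main__":
    mono = torch.randn(8, EMB_DIM)  # stand-in monolingual sentence embeddings
    cs = torch.randn(8, EMB_DIM)    # stand-in code-switching sentence embeddings
    print(float(cycle_consistency_loss(mono, cs)))
```

In the paper's setting, the generator mapping monolingual text to the Code-switching domain would then be used to produce artificial training text for the language model.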