Effects of Layer Freezing when Transferring DeepSpeech to New Languages

CoRR 2021 (modified: 14 Jan 2022)
Abstract: In this paper, we investigate the effect of layer freezing on the effectiveness of model transfer in the area of automatic speech recognition. We experiment with Mozilla's DeepSpeech architecture on German and Swiss German speech datasets and compare the results of training from scratch against those of transferring a pre-trained model. We compare different layer freezing schemes and find that freezing even a single layer already significantly improves results.
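For illustration, the sketch below shows one common way to set up layer freezing when fine-tuning a pre-trained model on a new language. The layer stack loosely mirrors DeepSpeech's shape (three dense layers, a recurrent layer, a dense layer, and an output projection), but the layer sizes, names, and checkpoint path are hypothetical, and the snippet uses the TensorFlow 2 Keras API rather than the actual DeepSpeech code base.

```python
import tensorflow as tf

def build_deepspeech_like_model(n_features=26, n_hidden=2048, n_labels=29):
    # Simplified DeepSpeech-style stack: three dense layers, one recurrent
    # layer, another dense layer, and the output projection. Sizes and
    # names are illustrative, not the exact published architecture.
    return tf.keras.Sequential([
        tf.keras.Input(shape=(None, n_features)),
        tf.keras.layers.Dense(n_hidden, activation="relu", name="dense_1"),
        tf.keras.layers.Dense(n_hidden, activation="relu", name="dense_2"),
        tf.keras.layers.Dense(n_hidden, activation="relu", name="dense_3"),
        tf.keras.layers.LSTM(n_hidden, return_sequences=True, name="rnn"),
        tf.keras.layers.Dense(n_hidden, activation="relu", name="dense_5"),
        tf.keras.layers.Dense(n_labels, name="output"),
    ])

def freeze_first_layers(model, n_frozen):
    # Freeze the first `n_frozen` layers so their pre-trained weights stay
    # fixed while the remaining layers adapt to the new language.
    for layer in model.layers[:n_frozen]:
        layer.trainable = False

model = build_deepspeech_like_model()
# model.load_weights("english_checkpoint.h5")  # hypothetical pre-trained checkpoint
freeze_first_layers(model, n_frozen=1)  # freezing even one layer already helped
# Real DeepSpeech training uses CTC loss; a placeholder loss keeps the
# snippet self-contained and compilable.
model.compile(optimizer="adam", loss="mse")
```

In the TensorFlow 1.x code that Mozilla's DeepSpeech is built on, one way to achieve the same effect is to pass only the non-frozen variables to the optimizer via `var_list`; the Keras `trainable` flag above is the equivalent idiom in TensorFlow 2.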