Národní úložiště šedé literatury (National Repository of Grey Literature)
Semi-Supervised Speech-to-Text Recognition with Text-to-Speech Critic
Baskar, Murali Karthick ; Manohar, Vimal (referee) ; Trmal, Jan (referee) ; Burget, Lukáš (supervisor)
Sequence-to-sequence automatic speech recognition (ASR) models require large quantities of training data to attain good performance. For this reason, unsupervised and semi-supervised training of seq2seq models has recently witnessed a surge in interest. This work builds upon recent results showing notable improvements in semi-supervised training using cycle-consistency and related techniques. Such techniques derive training procedures and losses able to leverage unpaired speech and/or text data by combining ASR with text-to-speech (TTS) models. This thesis first proposes a new semi-supervised modelling framework combining an end-to-end differentiable ASR->TTS loss with a TTS->ASR loss. The method is able to leverage unpaired speech and text data to outperform recently proposed related techniques in terms of word error rate (WER). We provide extensive results analysing the impact of data quantity as well as the contribution of the speech and text modalities in recovering errors, and show consistent gains across the WSJ and LibriSpeech corpora. The thesis also discusses the limitations of the ASR<->TTS model in out-of-domain data conditions. We propose an enhanced ASR<->TTS (EAT) model incorporating two main features: 1) the ASR->TTS pipeline is equipped with a language model reward to penalize the ASR hypotheses before forwarding them to TTS; and 2) a speech regularizer, trained in an unsupervised fashion, is introduced in TTS->ASR to correct the synthesized speech before sending it to the ASR model. Training strategies and the effectiveness of the EAT model are explored and compared with augmentation approaches. The results show that EAT reduces the performance gap between supervised and semi-supervised training, with absolute WER improvements of 2.6% and 2.7% on LibriSpeech and BABEL, respectively.
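The cycle-consistency idea described in the abstract can be illustrated with a minimal sketch: unpaired speech is decoded by ASR and re-synthesized by TTS (ASR->TTS loss), while unpaired text is synthesized by TTS and decoded by ASR (TTS->ASR loss). The `asr` and `tts` functions below are hypothetical toy stand-ins (invertible integer/character mappings), not the thesis's neural models; the function names, the `alpha` weight, and the loss definitions are illustrative assumptions only.

```python
def asr(speech):
    # Toy "ASR": map each acoustic frame (an int in 0..25) to a character.
    # Real systems use a sequence-to-sequence neural network.
    return "".join(chr(ord("a") + f) for f in speech)

def tts(text):
    # Toy "TTS": map each character back to an acoustic frame.
    return [ord(c) - ord("a") for c in text]

def asr_to_tts_loss(speech):
    # Unpaired speech: decode with ASR, re-synthesize with TTS,
    # score the reconstruction (mean absolute frame error).
    recon = tts(asr(speech))
    return sum(abs(a, ) if False else abs(a - b) for a, b in zip(speech, recon)) / len(speech)

def tts_to_asr_loss(text):
    # Unpaired text: synthesize with TTS, decode with ASR,
    # score the reconstruction (character mismatch rate).
    recon = asr(tts(text))
    return sum(c != r for c, r in zip(text, recon)) / len(text)

def semi_supervised_loss(speech_batch, text_batch, alpha=0.5):
    # Combined objective over both unpaired modalities,
    # weighted by a hypothetical interpolation factor alpha.
    l_speech = sum(asr_to_tts_loss(s) for s in speech_batch) / len(speech_batch)
    l_text = sum(tts_to_asr_loss(t) for t in text_batch) / len(text_batch)
    return alpha * l_speech + (1 - alpha) * l_text
```

Because the toy mappings are exact inverses, both cycle losses are zero here; in the real framework the ASR and TTS networks are imperfect, and minimizing these reconstruction losses is what drives learning from unpaired data.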
