National Repository of Grey Literature
New Techniques in Neural Networks Training - Connectionist Temporal Classification
Gajdár, Matúš ; Švec, Ján (referee) ; Karafiát, Martin (advisor)
This bachelor’s thesis deals with neural networks and their use in speech recognition. First, it presents the theory of speech recognition, followed by the theory of neural networks in connection with the connectionist temporal classification (CTC) method. The next chapter introduces the toolkits used for training the neural networks and the experiments performed with them to determine the impact of the connectionist temporal classification method on the precision of phoneme decoding. The last chapter summarizes the work and gives an overall evaluation of the experiments.
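As a rough illustration of the method this thesis investigates, the sketch below trains a small acoustic model with PyTorch's built-in CTC loss (torch.nn.CTCLoss). The network architecture, feature dimension, and phoneme inventory are illustrative assumptions, not the setup or toolkit used in the thesis.

```python
import torch
import torch.nn as nn

# Minimal sketch: a small recurrent acoustic model trained with CTC loss.
# Feature dimension, phoneme inventory, and layer sizes are placeholders,
# not values taken from the thesis.
NUM_PHONEMES = 40          # hypothetical phoneme inventory (blank added below)
FEATURE_DIM = 40           # e.g. filter-bank features per frame

class CTCAcousticModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.LSTM(FEATURE_DIM, 256, num_layers=2,
                           bidirectional=True, batch_first=True)
        # +1 output for the CTC blank symbol (index 0 by convention)
        self.proj = nn.Linear(2 * 256, NUM_PHONEMES + 1)

    def forward(self, feats):                  # feats: (batch, time, FEATURE_DIM)
        out, _ = self.rnn(feats)
        return self.proj(out).log_softmax(-1)  # log-probs over phonemes + blank

model = CTCAcousticModel()
ctc_loss = nn.CTCLoss(blank=0, zero_infinity=True)

# Dummy batch: 4 utterances of 100 frames, phoneme targets of length 20 each.
feats = torch.randn(4, 100, FEATURE_DIM)
targets = torch.randint(1, NUM_PHONEMES + 1, (4, 20))
input_lengths = torch.full((4,), 100, dtype=torch.long)
target_lengths = torch.full((4,), 20, dtype=torch.long)

log_probs = model(feats).transpose(0, 1)      # CTCLoss expects (time, batch, classes)
loss = ctc_loss(log_probs, targets, input_lengths, target_lengths)
loss.backward()
```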
Low-Dimensional Matrix Factorization in End-To-End Speech Recognition Systems
Gajdár, Matúš ; Grézl, František (referee) ; Karafiát, Martin (advisor)
The project covers automatic speech recognition with neural networks trained using low-dimensional matrix factorization. We describe time delay neural networks with factorization (TDNN-F) and without it (TDNN), implemented in PyTorch. We compare the PyTorch implementation with the Kaldi toolkit and obtain similar results in experiments with various network architectures. The last chapter describes the impact of low-dimensional matrix factorization on end-to-end speech recognition systems, as well as a modification of the system with TDNN(-F) networks. With specific network settings, the systems using factorization achieved better results. Additionally, the TDNN-F networks reduced the complexity of training by decreasing the number of network parameters.
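The following sketch illustrates the factorization idea summarized in the abstract: a single wide TDNN (1-D convolution) layer is replaced by two low-rank factors through a narrow bottleneck, which is where the parameter savings come from. Layer sizes are illustrative assumptions, the semi-orthogonality constraint used in Kaldi's TDNN-F recipe is omitted, and the parameter counts in the comments refer only to this toy layer, not to figures from the thesis.

```python
import torch
import torch.nn as nn

# Minimal sketch of a factored time-delay (TDNN-F) layer: one wide 1-D
# convolution is replaced by two low-rank convolutions through a small
# bottleneck dimension, which cuts the parameter count. All sizes are
# illustrative; Kaldi's semi-orthogonal constraint on the first factor
# is omitted for brevity.

class TDNNFLayer(nn.Module):
    def __init__(self, in_dim=512, out_dim=512, bottleneck=128, context=3):
        super().__init__()
        # Factor 1: project down to the bottleneck over the temporal context.
        self.linear_a = nn.Conv1d(in_dim, bottleneck, kernel_size=context,
                                  padding=context // 2, bias=False)
        # Factor 2: project back up. A plain TDNN layer would instead be a
        # single Conv1d(in_dim, out_dim, kernel_size=context).
        self.linear_b = nn.Conv1d(bottleneck, out_dim, kernel_size=1)
        self.relu = nn.ReLU()
        self.norm = nn.BatchNorm1d(out_dim)

    def forward(self, x):        # x: (batch, in_dim, time)
        return self.norm(self.relu(self.linear_b(self.linear_a(x))))

# Parameter comparison against the unfactored layer it replaces.
tdnn_f = TDNNFLayer()
tdnn = nn.Conv1d(512, 512, kernel_size=3, padding=1)
print(sum(p.numel() for p in tdnn_f.parameters()),   # ~0.26M parameters
      sum(p.numel() for p in tdnn.parameters()))      # ~0.79M parameters
```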
