National Repository of Grey Literature
Neural Network Letter Recognition
Kluknavský, František ; Hradiš, Michal (referee) ; Šilhavá, Jana (advisor)
This work uses handwritten character recognition as a model problem for applying a multilayer perceptron with the error backpropagation learning algorithm and for finding their optimal parameters: hidden layer size, learning rate, training length, and the ability to handle damaged data. Results were obtained by repeatedly training and testing the neural network on 52,152 English lowercase letters. The best results, with the smallest network and the shortest learning time, were achieved with 60 neurons in the hidden layer and a learning rate of 0.01. Larger networks achieved the same ability to recognize unknown patterns and higher robustness when processing heavily damaged data.
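As an illustration only, the following is a minimal sketch of the setup this abstract describes: a single-hidden-layer perceptron trained by error backpropagation, with the 60 hidden neurons and the 0.01 learning rate taken from the abstract. The input size (16x16-pixel letter images), the 26 output classes, the sigmoid activations, and the squared-error loss are assumptions, not details from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 256, 60, 26   # 16x16 inputs, 26 letters (assumed)
lr = 0.01                             # learning rate from the abstract

W1 = rng.normal(0, 0.1, (n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_out))
b2 = np.zeros(n_out)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, target):
    """One backpropagation update for a single sample (squared-error loss)."""
    global W1, b1, W2, b2
    # Forward pass through the hidden and output layers.
    h = sigmoid(x @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    # Backward pass: error deltas propagated through the sigmoids.
    delta_out = (y - target) * y * (1 - y)
    delta_hid = (delta_out @ W2.T) * h * (1 - h)
    # Gradient-descent weight updates.
    W2 -= lr * np.outer(h, delta_out)
    b2 -= lr * delta_out
    W1 -= lr * np.outer(x, delta_hid)
    b1 -= lr * delta_hid
    return 0.5 * np.sum((y - target) ** 2)

# Toy usage: random data stands in for the handwritten-letter dataset.
x = rng.random(n_in)
t = np.eye(n_out)[0]          # one-hot target for the first letter class
for _ in range(100):
    loss = train_step(x, t)
print(f"loss after 100 steps: {loss:.4f}")
```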
Music Source Separation
Holík, Viliam ; Veselý, Karel (referee) ; Mošner, Ladislav (advisor)
Neural networks are used for the problem of separating music sources from recordings. One such network is Conv-TasNet. The aim of this work is to experiment with an existing implementation of this network with a view to potential improvements. The models were trained on the MUSDB18 dataset. The experiments successively covered changes to the network structure, transforming signals from the time domain to the frequency domain for computing the loss function, replacing the original loss function with alternatives, finding the optimal learning rate for each loss function, and gradually decreasing the learning rate during training. The best experiments according to the SDR metric were training with the L1 and logarithmic L2 loss functions in the time domain, using a higher initial learning rate that was gradually decreased during training. Relative to the baseline, the best models achieve an improvement of more than 2.5%.
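For illustration, here is a sketch of the training ingredients the abstract names: a time-domain L1 loss, a logarithmic L2 loss, and a learning rate that starts higher and decays during training. This is not the thesis code; the exact log-L2 formulation, the optimizer, the schedule parameters, and the stand-in model are all assumptions.

```python
import torch

def l1_time_loss(estimate, target):
    """Mean absolute error between estimated and reference waveforms."""
    return torch.mean(torch.abs(estimate - target))

def log_l2_time_loss(estimate, target, eps=1e-8):
    """Logarithmic L2 loss in the time domain (assumed formulation).

    The abstract does not give the exact definition; taking the log of
    the mean squared error (plus eps for stability) is one common choice.
    """
    return torch.log(torch.mean((estimate - target) ** 2) + eps)

# Stand-in model and random data replace Conv-TasNet and MUSDB18 here.
model = torch.nn.Conv1d(1, 1, kernel_size=3, padding=1)
optimizer = torch.optim.Adam(model.parameters(), lr=3e-3)  # higher initial lr (assumed value)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.98)

for epoch in range(5):
    mixture = torch.randn(4, 1, 16000)   # batch of 1-second mixtures at 16 kHz
    source = torch.randn(4, 1, 16000)    # corresponding reference source
    estimate = model(mixture)
    loss = l1_time_loss(estimate, source)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()  # gradual learning-rate decrease during training
    print(f"epoch {epoch}: loss={loss.item():.4f}, lr={scheduler.get_last_lr()[0]:.5f}")
```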
