National Repository of Grey Literature
Active Learning with Neural Networks
Beneš, Štěpán ; Fajčík, Martin (referee) ; Hradiš, Michal (advisor)
The topic of this thesis is the combination of active learning strategies with deep convolutional networks in image recognition tasks. The goal is to observe the behaviour of selected active learning strategies under a wider array of conditions. The first section of the thesis is dedicated to the theory of active learning, followed by the motivation for and the challenges of combining it with convolutional neural networks. The goal is pursued through a series of experiments in which the behaviour of active learning strategies is tested for dependencies on the difficulty of the dataset, the quality of the learning model, the number of training epochs, the size of the batch of samples added in each iteration, the oracle's consistency, and the use of the pseudo-labeling technique. The results show that continuous active learning depends on the number of training epochs in each iteration and on the difficulty of the given dataset. The chosen strategies also appear somewhat resistant to the oracle's faults. The benefits of pseudo-labeling go hand in hand with the quality of the learning model. Finally, in some cases traditional active learning strategies have shown that they can keep pace with modern, tailored strategies.
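The pool-based query step at the core of such experiments can be sketched as follows. This is a minimal illustration of least-confidence sampling, one common uncertainty-based strategy; the function name, batch size, and probabilities are illustrative assumptions, not the thesis's actual implementation.

```python
# Illustrative sketch of one active learning query step (least-confidence
# sampling). All names and data here are hypothetical examples.
import numpy as np

def least_confidence_query(probs: np.ndarray, batch_size: int) -> np.ndarray:
    """Select the unlabeled samples whose top-class probability is lowest.

    probs: (n_samples, n_classes) softmax outputs over the unlabeled pool.
    Returns the indices of the `batch_size` least confident samples,
    which would then be sent to the oracle for labeling.
    """
    confidence = probs.max(axis=1)              # top-class probability per sample
    return np.argsort(confidence)[:batch_size]  # least confident first

# Example pool of 4 samples over 3 classes; sample 2 is nearly uniform,
# so it is the most uncertain and is queried first.
pool_probs = np.array([
    [0.90, 0.05, 0.05],
    [0.60, 0.30, 0.10],
    [0.34, 0.33, 0.33],
    [0.80, 0.10, 0.10],
])
print(least_confidence_query(pool_probs, 2))  # → [2 1]
```

In each active learning iteration, the queried samples are labeled (by the oracle, or by the model itself under pseudo-labeling) and added to the training set before the network is retrained.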