National Repository of Grey Literature
Gaussian Processes Based Hyper-Optimization of Neural Networks
Coufal, Martin ; Landini, Federico Nicolás (referee) ; Beneš, Karel (advisor)
The aim of this master's thesis is to create a tool for optimizing the hyper-parameters of artificial neural networks. The tool must be able to optimize multiple hyper-parameters, which may moreover be correlated. I solved this problem by implementing an optimizer that uses Gaussian processes to predict the influence of individual hyper-parameters on the resulting accuracy of the neural network. From experiments on several benchmark functions I found that the implemented tool achieves better results than optimizers based on random search, and thus reduces the average number of optimization steps required. Random-search optimization achieved better results only in the first optimization steps, before the Gaussian-process-based optimizer builds a sufficiently accurate model of the problem. However, almost all experiments on the MNIST dataset showed better results for the random-search optimizer. These differences between the experiments are probably due to the complexity of the chosen benchmark functions or to the chosen parameters of the implemented optimizer.
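A rough sketch of the approach described in the abstract, not the thesis's actual implementation: a Gaussian-process surrogate fitted to the configurations evaluated so far proposes the next hyper-parameter setting. The objective train_and_score, the two hyper-parameters (learning rate, hidden-layer width), their bounds, and the use of expected improvement as the acquisition criterion are all illustrative assumptions.

# Gaussian-process-guided hyper-parameter search (illustrative sketch).
# `train_and_score` stands in for "train the network with these hyper-parameters
# and return its validation accuracy"; here it is a cheap synthetic placeholder.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
bounds = np.array([[1e-4, 1e-1],    # learning rate (hypothetical range)
                   [16.0, 512.0]])  # hidden-layer width (hypothetical range)

def train_and_score(x):
    lr, width = x
    # Placeholder objective with a single optimum; replace with real training.
    return -((np.log10(lr) + 2.5) ** 2) - ((width - 200.0) / 300.0) ** 2

def expected_improvement(gp, candidates, y_best):
    mu, sigma = gp.predict(candidates, return_std=True)
    sigma = np.maximum(sigma, 1e-12)
    z = (mu - y_best) / sigma
    return (mu - y_best) * norm.cdf(z) + sigma * norm.pdf(z)

# A few random configurations to start with, then GP-guided proposals.
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(5, 2))
y = np.array([train_and_score(x) for x in X])

for _ in range(20):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    candidates = rng.uniform(bounds[:, 0], bounds[:, 1], size=(1000, 2))
    x_next = candidates[np.argmax(expected_improvement(gp, candidates, y.max()))]
    X = np.vstack([X, x_next])
    y = np.append(y, train_and_score(x_next))

print("best hyper-parameters:", X[np.argmax(y)], "best score:", y.max())

The abstract does not state which acquisition criterion the implemented optimizer uses; expected improvement is just one common choice.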
Modern regression methods in data mining
Kopal, Vojtěch ; Holeňa, Martin (advisor) ; Gemrot, Jakub (referee)
The thesis compares several non-linear regression methods on synthetic data sets generated using standard benchmarks for continuous black-box optimization. For the comparison, we have chosen the following regression methods: radial basis function networks, Gaussian processes, support vector regression, and random forests. We have also included polynomial regression, which we use to explain the basic principles of regression. The comparison of these methods is discussed in the context of black-box optimization problems, where the selected methods can be applied as surrogate models. The methods are evaluated based on their mean-squared error and on Kendall's rank correlation coefficient between the ordering of function values according to the model and according to the function used to generate the data.
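A minimal sketch of the evaluation protocol the abstract describes, assuming a Gaussian-process regressor and the sphere function as the benchmark (the thesis compares several methods on standard black-box benchmarks; the specific model, benchmark, and sample sizes below are assumptions).

# Fit a surrogate on samples of a black-box benchmark, then report mean-squared
# error and Kendall's rank correlation between true and predicted function values.
import numpy as np
from scipy.stats import kendalltau
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.metrics import mean_squared_error

def sphere(X):
    return np.sum(X ** 2, axis=1)

rng = np.random.default_rng(1)
X_train = rng.uniform(-5, 5, size=(100, 5))
X_test = rng.uniform(-5, 5, size=(200, 5))
y_train, y_test = sphere(X_train), sphere(X_test)

model = GaussianProcessRegressor(normalize_y=True).fit(X_train, y_train)
y_pred = model.predict(X_test)

mse = mean_squared_error(y_test, y_pred)
tau, _ = kendalltau(y_test, y_pred)  # agreement between the two orderings
print(f"MSE = {mse:.3g}, Kendall tau = {tau:.3f}")

Kendall's tau is relevant for surrogate modelling because a rank-based optimizer only needs the model to preserve the ordering of function values, not their exact magnitudes.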
Model-based evolutionary optimization methods
Bajer, Lukáš ; Holeňa, Martin (advisor) ; Brockhoff, Dimo (referee) ; Pošík, Petr (referee)
Model-based black-box optimization is a topic that has been intensively studied both in academia and industry. Real-world optimization tasks in particular are often characterized by expensive or time-demanding objective functions, for which statistical models can save resources or speed up the optimization. Each of the three parts of the thesis concerns one such model: first, copulas are used instead of a graphical model in estimation of distribution algorithms; second, RBF networks serve as surrogate models in mixed-variable genetic algorithms; and third, Gaussian processes are employed in Bayesian optimization algorithms as a sampling model and in the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) as a surrogate model. The last combination, described in the core part of the thesis, resulted in the Doubly Trained Surrogate CMA-ES (DTS-CMA-ES). This algorithm uses the uncertainty prediction of a Gaussian process to select only a part of the CMA-ES population for evaluation with the expensive objective function, while the mean prediction is used for the rest. The DTS-CMA-ES improves upon state-of-the-art surrogate continuous optimizers in several benchmark tests.
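The evaluation step the abstract attributes to DTS-CMA-ES can be sketched as follows; the choice of predictive standard deviation as the selection criterion, the archive handling, and the number of points evaluated exactly are simplifying assumptions, not the exact algorithm settings.

# One "doubly trained" evaluation of a CMA-ES population: a first GP picks the
# candidates to evaluate with the expensive objective, a second GP (retrained on
# the enlarged archive) predicts the fitness of the remaining candidates.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def dts_evaluate(f_expensive, population, X_archive, y_archive, n_true=3):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X_archive, y_archive)
    mean, std = gp.predict(population, return_std=True)

    # Evaluate the most uncertain candidates with the true objective.
    chosen = np.argsort(std)[-n_true:]
    y = mean.copy()
    y[chosen] = [f_expensive(x) for x in population[chosen]]

    # Retrain on the enlarged archive and re-predict the remaining candidates.
    X_archive = np.vstack([X_archive, population[chosen]])
    y_archive = np.concatenate([y_archive, y[chosen]])
    gp.fit(X_archive, y_archive)
    rest = np.setdiff1d(np.arange(len(population)), chosen)
    y[rest] = gp.predict(population[rest])
    return y, X_archive, y_archive

The returned fitness vector would then stand in for fully evaluated fitness values in the CMA-ES ranking and update step.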
Filtering for Stochastic Evolution Equations
Kubelka, Vít ; Maslowski, Bohdan (advisor)
The linear filtering problem for infinite-dimensional Gaussian processes is studied, the observation process being finite-dimensional. Integral equations for the filter and for the covariance of the error are derived. The general results are applied to linear SPDEs driven by a Gauss-Volterra process observed at finitely many points of the domain and to delayed SPDEs driven by white noise. Subsequently, the continuous dependence of the filter and of the observation error on parameters, which may be present both in the signal and the observation process, is proved. These results are applied to signals governed by stochastic heat equations driven by distributed or pointwise fractional noise. The observation process may be a noisy observation of the signal at given points in the domain, the position of which may depend on the parameter.
Filtering for Stochastic Evolution Equations
Kubelka, Vít ; Maslowski, Bohdan (advisor) ; Tudor, Ciprian (referee) ; Klebanov, Lev (referee)
The linear filtering problem for infinite-dimensional Gaussian processes is studied, the observation process being finite-dimensional. Integral equations for the filter and for the covariance of the error are derived. The general results are applied to linear SPDEs driven by a Gauss-Volterra process observed at finitely many points of the domain and to delayed SPDEs driven by white noise. Subsequently, the continuous dependence of the filter and of the observation error on parameters, which may be present both in the signal and the observation process, is proved. These results are applied to signals governed by stochastic heat equations driven by distributed or pointwise fractional noise. The observation process may be a noisy observation of the signal at given points in the domain, the position of which may depend on the parameter.
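For orientation only: the integral equations for the filter and the error covariance mentioned in the abstract generalize the classical Kalman-Bucy filter. In the finite-dimensional setting with unit noise intensities (an assumption made purely for this illustration), the corresponding equations read:

% Signal, observation, filter, and Riccati equation (finite-dimensional analogue).
\begin{align*}
  \mathrm{d}X_t &= A X_t\,\mathrm{d}t + B\,\mathrm{d}W_t,
  & \mathrm{d}Y_t &= C X_t\,\mathrm{d}t + \mathrm{d}V_t,\\
  \mathrm{d}\widehat{X}_t &= A \widehat{X}_t\,\mathrm{d}t
    + P_t C^{*}\bigl(\mathrm{d}Y_t - C \widehat{X}_t\,\mathrm{d}t\bigr),
  & \dot{P}_t &= A P_t + P_t A^{*} + B B^{*} - P_t C^{*} C P_t,
\end{align*}
where $\widehat{X}_t = \mathbb{E}[X_t \mid \mathcal{F}^Y_t]$ is the filter and $P_t$ is the covariance of the error $X_t - \widehat{X}_t$. For the Gauss-Volterra and fractional-noise signals treated in the thesis, these become integral equations rather than the differential equations above.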
