National Repository of Grey Literature: 7 records found
Meta-Parameters of Kernel Methods and Their Optimization
Vidnerová, Petra ; Neruda, Roman
In this work we address the problem of metalearning for kernel-based methods. Among kernel methods we focus on the support vector machine (SVM), which has become a method of choice in a wide range of practical applications, and on the regularization network (RN), which has a sound background in approximation theory. We discuss the role of the kernel function in learning, and we explain several search methods for kernel function optimization, including grid search, genetic search, and simulated annealing. The proposed methodology is demonstrated in experiments on benchmark data sets.
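As an illustration of the grid-search option named in the abstract, here is a minimal sketch using scikit-learn (my example, not the authors' code; the data set and parameter ranges are arbitrary assumptions):

    # Grid search over SVM kernel meta-parameters (illustrative sketch).
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)

    # Candidate kernels and their meta-parameters; the paper also considers
    # genetic search and simulated annealing over such search spaces.
    param_grid = [
        {"kernel": ["rbf"], "gamma": [1e-3, 1e-2, 1e-1], "C": [1, 10, 100]},
        {"kernel": ["poly"], "degree": [2, 3, 4], "C": [1, 10, 100]},
    ]

    search = GridSearchCV(SVC(), param_grid, cv=5)
    search.fit(X, y)
    print(search.best_params_, search.best_score_)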
Representations of Boolean Functions by Perceptron Networks
Kůrková, Věra
Limitations of the capabilities of shallow perceptron networks are investigated. Lower bounds are derived for the growth of the number of units and the sizes of output weights in networks representing Boolean functions of d variables. It is shown that for large d, almost any randomly chosen Boolean function cannot be tractably represented by shallow perceptron networks, i.e., each of its representations requires a network whose number of units or sizes of output weights depend exponentially on d.
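The abstract's claim can be stated more formally; the following LaTeX sketch uses my own notation (n units, output weights w_i, signum activations), which is an assumption rather than the paper's exact setting:

    \[
      f(x) \;=\; \sum_{i=1}^{n} w_i \,\operatorname{sgn}(v_i \cdot x + b_i),
      \qquad x \in \{0,1\}^d.
    \]
    % Claim, as stated in the abstract: for large $d$, almost every Boolean
    % function $f\colon \{0,1\}^d \to \{-1,1\}$ has no representation of this
    % form unless $n$ or $\sum_{i=1}^{n} |w_i|$ grows exponentially with $d$.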
Kernel density estimates in particle filter
Coufal, David
Nature-inspired search algorithms and their applications
Neruda, Roman
The basic principles of evolutionary algorithms and genetic search of parameter spaces are described in this paper. We explain the approaches common to genetic algorithms, evolutionary strategies, evolutionary programming, genetic programming, swarm algorithms, and neuroevolution. Published in the proceedings Analýza dat 2013. Statistické metody pro technologii a výzkum. Pardubice: TriloByte Statistical Software, 2013, p. 69-80. ISSN 1805-6903. Presented as an invited talk at the conference Analýza dat 2013.
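To make the genetic-search principle concrete, here is a minimal genetic algorithm sketch in Python (my illustration, not from the paper; the all-ones bit-string target and the operator choices are arbitrary assumptions):

    # Minimal genetic algorithm: evolve a bit string toward all ones
    # via truncation selection, one-point crossover, and bit-flip mutation.
    import random

    LENGTH, POP_SIZE = 20, 50

    def fitness(bits):
        return sum(bits)  # number of ones

    def crossover(a, b):
        cut = random.randrange(1, LENGTH)  # one-point crossover
        return a[:cut] + b[cut:]

    def mutate(bits, rate=0.02):
        return [1 - b if random.random() < rate else b for b in bits]

    population = [[random.randint(0, 1) for _ in range(LENGTH)]
                  for _ in range(POP_SIZE)]

    for generation in range(200):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == LENGTH:
            break  # optimum reached
        parents = population[:POP_SIZE // 2]  # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        population = parents + children

    print(generation, fitness(population[0]))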
Capabilities of Radial and Kernel Networks
Kůrková, Věra
Originally, artificial neural networks were built from biologically inspired units called perceptrons. Later, other types of units became popular in neurocomputing due to their good mathematical properties; among them, radial-basis-function (RBF) units and kernel units became the most popular. The talk will discuss the advantages and limitations of networks with these two types of computational units. The higher flexibility in the choice of free parameters in RBF networks will be compared with the benefits of the geometrical properties of kernel models, which allow applications of maximal-margin classification algorithms, modelling of generalization in learning from data in terms of regularization, and characterization of optimal solutions of learning tasks. The critical influence of input dimension on the behavior of these two types of networks will be described. General results will be illustrated by the paradigmatic examples of Gaussian kernel and radial networks.
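As a small illustration of the two uses of Gaussian kernel units mentioned above, the following sketch (my example, not from the talk; the data and hyperparameters are arbitrary assumptions) fits a maximal-margin classifier and a regularized kernel regressor with the same kernel:

    # Gaussian kernel units in two roles: max-margin classification (SVC)
    # and regularized regression (kernel ridge, a regularization-network flavour).
    import numpy as np
    from sklearn.kernel_ridge import KernelRidge
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y_reg = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)  # noisy targets
    y_cls = (y_reg > 0).astype(int)                             # sign labels

    reg = KernelRidge(kernel="rbf", gamma=1.0, alpha=0.1).fit(X, y_reg)
    cls = SVC(kernel="rbf", gamma=1.0, C=1.0).fit(X, y_cls)

    print(reg.score(X, y_reg), cls.score(X, y_cls))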
