National Repository of Grey Literature : 39 records found (showing records 20 - 29)
Regularization of Illegal Migration as a Policy-making Process
Dumont, Anna ; Novotný, Vilém (advisor) ; Jelínková, Marie (referee)
'Regularization of Illegal Migration as a Policy-making Process' deals with the social problem of the high number of irregular migrants in the Czech Republic and with regularization as a tool that could help reduce it. Regularization is treated as a policy process and described theoretically using the Advocacy Coalition Framework. Rather than describing the problem itself, the thesis seeks normative definitions of two coalitions that hold different beliefs and represent two different points of view. The work is designed partly as a case study in which the theory is applied to the issue of regularization; it also explains regularization itself as well as the Advocacy Coalition Framework. The thesis defines two coalitions within the subsystem: a pro-regularization and an anti-regularization coalition. Each coalition has deep core and policy core beliefs that determine its relationship to the topic as well as the relationship between the coalitions themselves. In conclusion, the author summarizes the information about the coalitions and their beliefs in three comparative tables in which their approaches can be compared. The last part also contains a chapter on the changes of beliefs and policies, which introduces two policies: the system of voluntary return and that...
Numerical Methods of Image Processing (Numerické metody zpracování obrazu)
Tóthová, Katarína ; Hnětynková, Iveta (advisor) ; Zítko, Jan (referee)
The aim of this thesis is to provide a concise overview of numerical techniques in digital image processing, specifically to discuss the construction, properties and methods of solving image deblurring problems modelled by a linear system Ax = b. These problems often fall within the group of ill-posed problems with a severely ill-conditioned matrix A and hence require special numerical treatment. We provide a brief overview of selected regularization methods that can be used in this situation, including direct ones (TSVD, Tikhonov regularization) and iterative ones (CGLS, LSQR), together with the pertinent parameter-choice methods: the L-curve, GCV and the discrepancy principle. The theoretical discussion is supplemented by numerical experiments with real-life image data.
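The two direct methods named in the abstract can be sketched in a few lines of NumPy. The following is an illustrative sketch, not the thesis's implementation; the Hilbert test matrix, noise level, and parameter values are assumptions chosen to show the idea:

```python
import numpy as np

def tikhonov(A, b, lam):
    # Tikhonov regularization: solve min ||A x - b||^2 + lam^2 ||x||^2
    # via the regularized normal equations.
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

def tsvd(A, b, k):
    # Truncated SVD: keep only the k largest singular components.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])

# Severely ill-conditioned example: the Hilbert matrix.
n = 8
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
x_true = np.ones(n)
b = A @ x_true + 1e-8 * np.random.default_rng(0).standard_normal(n)

x_tik = tikhonov(A, b, lam=1e-4)
x_tsvd = tsvd(A, b, k=4)
```

In practice the regularization parameter lam and the truncation index k would be chosen by one of the parameter-choice methods mentioned above (L-curve, GCV, or the discrepancy principle).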
Linear Algebraic Modelling of Problems with Inexact Data (Lineární algebraické modelování úloh s nepřesnými daty)
Vasilík, Kamil ; Hnětynková, Iveta (advisor) ; Janovský, Vladimír (referee)
In this thesis we consider problems Ax ≈ b arising from the discretization of ill-posed problems, where the right-hand side b is polluted by (unknown) noise. It was shown in [29] that, under some natural assumptions, the noise level in the data can be estimated at a negligible cost using the Golub-Kahan iterative bidiagonalization. Such information can further be used in solving ill-posed problems. Here we suggest criteria for detecting the noise-revealing iteration in the Golub-Kahan iterative bidiagonalization. We discuss the presence of noise of different colors and study how the loss of orthogonality affects the noise-revealing property of the bidiagonalization.
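The Golub-Kahan iterative bidiagonalization at the core of the abstract can be sketched as follows. This is a minimal version without reorthogonalization; the matrix sizes, noise level, and number of steps are illustrative assumptions, and the noise-revealing criteria themselves are not reproduced here:

```python
import numpy as np

def golub_kahan(A, b, k):
    # k steps of Golub-Kahan iterative bidiagonalization started from b:
    #   beta_1 u_1 = b,
    #   alpha_j v_j = A^T u_j - beta_j v_{j-1},
    #   beta_{j+1} u_{j+1} = A v_j - alpha_j u_j.
    m, n = A.shape
    U = np.zeros((m, k + 1))
    V = np.zeros((n, k))
    alpha = np.zeros(k)
    beta = np.zeros(k + 1)
    beta[0] = np.linalg.norm(b)
    U[:, 0] = b / beta[0]
    for j in range(k):
        v = A.T @ U[:, j]
        if j > 0:
            v -= beta[j] * V[:, j - 1]
        alpha[j] = np.linalg.norm(v)
        V[:, j] = v / alpha[j]
        u = A @ V[:, j] - alpha[j] * U[:, j]
        beta[j + 1] = np.linalg.norm(u)
        U[:, j + 1] = u / beta[j + 1]
    return U, V, alpha, beta

rng = np.random.default_rng(1)
A = rng.standard_normal((12, 8))
b = A @ np.ones(8) + 1e-4 * rng.standard_normal(12)   # noisy right-hand side
U, V, alpha, beta = golub_kahan(A, b, k=5)
```

The scalars alpha and beta form the bidiagonal matrix B_k satisfying A V_k = U_{k+1} B_k; in the noise-revealing setting, monitoring quantities derived from these coefficients is what allows the noise level to be estimated cheaply.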
Robust Regularized Discriminant Analysis Based on Implicit Weighting
Kalina, Jan ; Hlinka, Jaroslav
In bioinformatics, regularized linear discriminant analysis is commonly used as a tool for supervised classification problems, tailor-made for high-dimensional data with the number of variables exceeding the number of observations. However, its various available versions are too vulnerable to the presence of outlying measurements in the data. In this paper, we exploit principles of robust statistics to propose new versions of regularized linear discriminant analysis suitable for high-dimensional data contaminated by (more or less) severe outliers. The work exploits a regularized version of the minimum weighted covariance determinant estimator, which is one of the highly robust estimators of multivariate location and scatter. The performance of the novel classification methods is illustrated on real data sets with a detailed analysis of data from brain activity research.
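The baseline that the paper robustifies, regularized LDA for more variables than observations, can be sketched as follows. This is not the authors' MWCD-based method, only the standard shrinkage idea it builds on; the shrinkage target, weight lam, and the synthetic data are illustrative assumptions:

```python
import numpy as np

def regularized_lda_fit(X0, X1, lam=0.5):
    # Pooled within-class covariance is singular when p > n, so it is
    # shrunk toward a scaled identity before inverting.
    p = X0.shape[1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Xc = np.vstack([X0 - m0, X1 - m1])
    S = Xc.T @ Xc / (len(Xc) - 2)
    mu = np.trace(S) / p
    S_reg = (1 - lam) * S + lam * mu * np.eye(p)
    w = np.linalg.solve(S_reg, m1 - m0)   # discriminant direction
    c = w @ (m0 + m1) / 2                 # decision threshold at the midpoint
    return w, c

def regularized_lda_predict(X, w, c):
    # Label 1 for points on class 1's side of the separating hyperplane.
    return (X @ w > c).astype(int)

# p = 30 variables, only 10 observations per class.
rng = np.random.default_rng(0)
p = 30
X0 = rng.standard_normal((10, p))
X1 = rng.standard_normal((10, p))
X1[:, 0] += 6.0                           # classes differ in the first variable
w, c = regularized_lda_fit(X0, X1)
```

The robust versions proposed in the paper replace the classical means and covariance in this scheme with highly robust counterparts, so that a few outlying rows of X0 or X1 cannot dominate the fit.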
Some Robust Distances for Multivariate Data
Kalina, Jan ; Peštová, Barbora
Numerous methods of multivariate statistics and data mining suffer from the presence of outlying measurements in the data. This paper presents new distance measures suitable for continuous data. First, we consider a Mahalanobis distance suitable for high-dimensional data with the number of variables (largely) exceeding the number of observations. We propose its doubly regularized version, which combines a regularization of the covariance matrix with replacing the means of the multivariate data by their regularized counterparts. We formulate explicit expressions for some versions of the regularization of the means, which can be interpreted as denoised (i.e. robust) versions of the standard means. Further, we propose a robust cosine similarity measure based on implicit weighting of individual observations. We derive properties of the newly proposed robust cosine similarity, including a proof of its high robustness in terms of the breakdown point.
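The first ingredient, a Mahalanobis distance that stays well defined when the number of variables exceeds the number of observations, can be sketched with a simple shrinkage regularization of the covariance matrix. This is an illustrative simplification, not the paper's doubly regularized version (in particular, the means are not regularized here):

```python
import numpy as np

def regularized_mahalanobis(x, y, S, lam=0.5):
    # Mahalanobis-type distance with the covariance matrix shrunk toward
    # a scaled identity so that it remains invertible when p exceeds n.
    p = len(x)
    mu = np.trace(S) / p
    S_reg = (1 - lam) * S + lam * mu * np.eye(p)
    d = x - y
    return float(np.sqrt(d @ np.linalg.solve(S_reg, d)))

# p = 20 variables observed on only n = 5 points: the sample covariance
# is singular, but the regularized distance is still well defined.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 20))
S = np.cov(X, rowvar=False)
d01 = regularized_mahalanobis(X[0], X[1], S)
```

The paper's second ingredient, the robust cosine similarity, would additionally downweight observations implicitly so that outliers cannot break the similarity; that weighting scheme is not reproduced in this sketch.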
Implementation of restoring method for reading bar code
Kadlčík, Libor ; Bartušek, Karel (referee) ; Mikulka, Jan (advisor)
A bar code stores information in the form of a series of bars and gaps of various widths, and can therefore be considered an example of a bilevel (square) signal. Magnetic bar codes are created by applying a slightly ferromagnetic material to a substrate. Sensing is done by a reading oscillator whose frequency is modulated by the presence of the ferromagnetic material; the signal from the oscillator is then subjected to frequency demodulation. Due to temperature drift of the reading oscillator, the demodulated signal is accompanied by a DC drift. A method for removing the drift is introduced, and a drift-insensitive detection of the presence of a bar code is described. Reading bar codes is complicated by convolutional distortion, which results from the spatially dispersed sensitivity of the sensor. The effect of the convolutional distortion is analogous to low-pass filtering: edges are smoothed and overlapped, making their detection difficult. The characteristics of the convolutional distortion can be summarized in a point-spread function (PSF). In the case of magnetic bar codes, the shape of the PSF can be known in advance, but not its width or DC transfer. Methods for estimating these parameters are discussed. The signal needs to be reconstructed (into its original bilevel form) before decoding can take place. Variational methods provide an effective way to do this: their core idea is to reformulate the reconstruction as an optimization problem of functional minimization. The functional can be extended by other functionals (regularizations) to considerably improve the results of the reconstruction. The principle of variational methods is shown, including examples of the use of various regularizations. All algorithms and methods (including the frequency demodulation of the signal from the reading oscillator) are digital.
They are implemented as a program for a microcontroller from the PIC32 family, which offers enough computing power that even blind deconvolution (when the real PSF also needs to be estimated) can be finished in a few seconds. The microcontroller is part of a magnetic bar code reader whose hardware allows the read information to be transferred to a personal computer via the PS/2 interface or USB (by emulating key presses on a virtual keyboard), or shown on a display.
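A toy version of the variational approach can be sketched in one dimension: a bilevel signal is blurred by a Gaussian PSF (playing the role of the convolutional distortion) and then reconstructed by minimizing a Tikhonov-type functional with a smoothness regularization. The signal layout, PSF width, noise level, and regularization weight are illustrative assumptions, not the thesis's values:

```python
import numpy as np

# Synthetic bilevel "bar code" signal.
n = 60
x_true = np.zeros(n)
x_true[10:18] = 1.0
x_true[24:28] = 1.0
x_true[40:52] = 1.0

# Gaussian point-spread function modelling the convolutional distortion.
t = np.arange(-4, 5)
psf = np.exp(-t**2 / (2 * 1.5**2))
psf /= psf.sum()

# Convolution matrix H (columns are shifted copies of the PSF).
H = np.array([np.convolve(e, psf, mode="same") for e in np.eye(n)]).T

rng = np.random.default_rng(0)
y = H @ x_true + 0.01 * rng.standard_normal(n)      # blurred, noisy reading

# Variational reconstruction: minimize ||H x - y||^2 + lam ||D x||^2,
# where D is the first-difference operator (a smoothness regularization).
D = np.diff(np.eye(n), axis=0)
lam = 1e-2
x_rec = np.linalg.solve(H.T @ H + lam * D.T @ D, H.T @ y)
x_bilevel = (x_rec > 0.5).astype(float)             # back to bilevel form
```

The thesis goes further than this sketch in two respects: the PSF parameters (and, in blind deconvolution, the PSF itself) are estimated rather than assumed, and the regularization functionals are chosen to favor bilevel solutions rather than merely smooth ones.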
Napoleon's theorem
MRÁZ, Luděk
The aim of this diploma thesis, 'Napoleon's Theorem', is a detailed study of the theorem, in which the process of so-called 'regularization' is described. In investigating Napoleon's theorem, the thesis presents a number of proofs and properties and then their generalizations in the plane and in space. Pictures that can help the reader understand the problem are included.
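The theorem itself is easy to verify numerically: erecting an equilateral triangle on each side of an arbitrary triangle (consistently outward or consistently inward) and connecting the centroids of those triangles always yields an equilateral triangle. A small sketch using complex numbers, where the sample triangle is an arbitrary choice:

```python
import math

def napoleon_centers(a, b, c):
    # Centroid of the equilateral triangle erected on each side,
    # consistently on the same side of every directed edge. The centroid
    # lies at distance |q - p| / (2*sqrt(3)) from the side's midpoint.
    def center(p, q):
        return (p + q) / 2 - 1j * (q - p) / (2 * math.sqrt(3))
    return center(a, b), center(b, c), center(c, a)

# An arbitrary scalene triangle in the complex plane.
n1, n2, n3 = napoleon_centers(0 + 0j, 4 + 0j, 1 + 3j)
```

The resulting Napoleon triangle also shares its centroid with the original triangle, one of the properties typically covered alongside the theorem.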
Economics of Biased Estimation
Drvoštěp, Tomáš ; Špecián, Petr (advisor) ; Tříska, Dušan (referee)
This thesis investigates the optimality of heuristic forecasting. According to Goldstein and Gigerenzer (2009), heuristics can be viewed as predictive models whose simplicity exploits the bias-variance trade-off. Economic agents learning in the context of rational expectations (Marcet and Sargent 1989), on the contrary, employ complex models of the whole economy. Both approaches can be perceived as an optimal response to the complexity of the prediction task and the availability of observations. This work introduces a straightforward extension of the standard model of decision making under uncertainty, in which agents' utility depends on the accuracy of their predictions and model complexity is moderated by a regularization parameter. Results of Monte Carlo simulations reveal that in complicated environments where few observations are at one's disposal, it is beneficial to construct simple models resembling heuristics. Unbiased models are preferred in more convenient conditions.
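The bias-variance mechanism behind that result can be illustrated with a small Monte Carlo sketch: an unbiased but complex model (ordinary least squares) against a simpler, biased one (ridge regression, where the regularization parameter controls complexity) when observations are scarce. All parameters here are assumptions for illustration, not the thesis's simulation design:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n_train, n_test, sims = 10, 12, 200, 200
beta = np.full(p, 0.1)          # weak true effects, so shrinkage helps
lam = 2.0                       # regularization parameter (fixed here)

ols_mse, ridge_mse = 0.0, 0.0
for _ in range(sims):
    X = rng.standard_normal((n_train, p))
    y = X @ beta + rng.standard_normal(n_train)
    Xt = rng.standard_normal((n_test, p))
    yt = Xt @ beta + rng.standard_normal(n_test)
    # Unbiased, high-variance fit: plain least squares.
    b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    # Biased, low-variance fit: ridge regression.
    b_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
    ols_mse += np.mean((Xt @ b_ols - yt) ** 2) / sims
    ridge_mse += np.mean((Xt @ b_ridge - yt) ** 2) / sims
```

With only 12 observations for 10 parameters, the unbiased fit has enormous variance, so the deliberately biased model predicts better out of sample, which is the sense in which a simple heuristic can be the optimal response to a complex environment with little data.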
Robust Regularized Cluster Analysis for High-Dimensional Data
Kalina, Jan ; Vlčková, Katarína
This paper presents new approaches to hierarchical agglomerative cluster analysis for high-dimensional data. First, we propose a regularized version of hierarchical cluster analysis for categorical data with a large number of categories. It exploits a regularized version of various test statistics of homogeneity in contingency tables as the measure of distance between two clusters. Further, we aim at cluster analysis of continuous data with a large number of variables. Various regularization techniques tailor-made for high-dimensional data have been proposed, which, however, have turned out to suffer from a high sensitivity to the presence of outlying measurements in the data. As a robust solution, we recommend combining two newly proposed methods, namely a regularized version of robust principal component analysis and a regularized Mahalanobis distance, which is based on an asymptotically optimal regularization of the covariance matrix. We bring arguments in favor of the newly proposed methods.
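For the continuous-data part, the combination described above can be imitated in simplified form: a Mahalanobis distance with a shrinkage-regularized covariance between observations, fed into plain average-linkage agglomerative clustering. The shrinkage scheme and all parameters here are illustrative assumptions, not the paper's asymptotically optimal regularization:

```python
import numpy as np

def regularized_distances(X, lam=0.5):
    # Pairwise Mahalanobis-type distances with the covariance matrix
    # shrunk toward a scaled identity (invertible even when p > n).
    n, p = X.shape
    S = np.cov(X, rowvar=False)
    mu = np.trace(S) / p
    Si = np.linalg.inv((1 - lam) * S + lam * mu * np.eye(p))
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d = X[i] - X[j]
            D[i, j] = D[j, i] = np.sqrt(d @ Si @ d)
    return D

def agglomerate(D, k):
    # Average-linkage agglomeration: repeatedly merge the two clusters
    # with the smallest mean pairwise distance until k clusters remain.
    clusters = [[i] for i in range(len(D))]
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = np.mean([D[i, j] for i in clusters[a] for j in clusters[b]])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] += clusters.pop(b)
    return clusters

# Two well-separated groups of 5 observations in 4 variables.
rng = np.random.default_rng(0)
X = np.vstack([rng.standard_normal((5, 4)),
               rng.standard_normal((5, 4)) + [10, 0, 0, 0]])
clusters = agglomerate(regularized_distances(X), 2)
```

The paper's robust pipeline differs in both ingredients: the distance uses an asymptotically optimal covariance regularization, and a regularized robust PCA step protects the whole procedure against outlying observations.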
