National Repository of Grey Literature : 26 records found (1 - 10). Search took 0.00 seconds.
Increasing Resolution in Perfusion Magnetic Resonance Imaging Using Compressed Sensing
Mangová, Marie ; Polec, Jaroslav (referee) ; Šmídl, Václav (referee) ; Rajmic, Pavel (advisor)
Perfusion magnetic resonance imaging is a medical diagnostic method that requires high spatial and temporal resolution simultaneously in order to capture the dynamics of an intravenous contrast agent used for perfusion measurement. However, magnetic resonance imaging has physical limits that prevent achieving both resolutions at once. This thesis deals with compressed sensing, which enables the reconstruction of measured data from relatively few acquired samples (below the Nyquist rate) while the resolution required for perfusion analysis is increased. This goal is achieved through suitably proposed a priori information about the sensed data and an appropriate model. The reconstruction is then posed as an optimization problem. The doctoral thesis introduces several new reconstruction models, further proposes a method to debias these estimates, and examines the influence of compressed sensing on the perfusion parameters. The thesis concludes with an extension of compressed sensing to three-dimensional data, where the influence of the reconstruction on the perfusion parameters is also described. In summary, the thesis shows that with compressed sensing, temporal resolution can be increased at a fixed spatial resolution, or spatial resolution can be increased at a fixed temporal resolution.
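The reconstruction-as-optimization idea from the abstract can be illustrated with a minimal sketch (generic random sensing and an l1 prior, not the specific models proposed in the thesis): a sparse signal is recovered from fewer measurements than unknowns by iterative soft-thresholding (ISTA).

```python
import numpy as np

# Minimal compressed-sensing sketch (hypothetical sizes and a generic random
# sensing matrix, not the thesis models): solve
#   min_x 0.5*||A x - y||^2 + lam*||x||_1
# with ISTA, recovering a sparse signal from sub-Nyquist measurements.

rng = np.random.default_rng(0)
n, m, k = 200, 80, 5                 # signal length, measurements (m < n), sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)

A = rng.normal(0, 1.0 / np.sqrt(m), (m, n))  # random sensing matrix
y = A @ x_true                               # relatively few acquired samples

lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2       # 1/L, L = Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(500):                         # ISTA iterations
    g = A.T @ (A @ x - y)                    # gradient of the data-fidelity term
    z = x - step * g
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {rel_err:.3f}")
```

The a priori information here is plain sparsity; the thesis uses richer priors tailored to perfusion data, but the optimization structure is analogous.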
Exploiting Uncertainty Information in Speaker Verification and Diarization
Silnova, Anna ; Šmídl, Václav (referee) ; Villalba Lopez, Jesus Antonio (referee) ; Burget, Lukáš (advisor)
This thesis considers two models that allow uncertainty information to be exploited in the tasks of automatic speaker verification and speaker diarization. The first model we consider is a modification of the widely used Gaussian probabilistic linear discriminant analysis (G-PLDA), which models the distribution of vector representations of utterances called embeddings. G-PLDA assumes that an embedding is generated by adding a noise vector sampled from a Gaussian distribution to a vector representing the speaker. We show that if the noise is instead assumed to be sampled from a Student's t-distribution, the PLDA model (we call this version heavy-tailed PLDA, HT-PLDA) can exploit uncertainty information when making a verification decision. Our model is conceptually similar to the HT-PLDA model defined by Kenny et al. in 2010, but, as we show in this work, it allows fast scoring, whereas the original HT-PLDA definition is considerably time- and computation-intensive. We present an algorithm for training our version of HT-PLDA as a generative model and also consider various strategies for discriminative training of its parameters. We test the generatively and discriminatively trained HT-PLDA on the speaker verification task. The results indicate that HT-PLDA performs comparably to standard G-PLDA while having the advantage of robustness to changes in data preprocessing. Speaker diarization experiments show that HT-PLDA not only provides better results than the G-PLDA baseline, but the log-likelihood ratio (LLR) scores produced by this model are also better calibrated. In the second model, unlike HT-PLDA, we do not treat the embeddings as observed data. Instead, the embeddings are normally distributed hidden variables in this model. The precision of an embedding carries information about the quality of the speech segment: for clean long segments the precision should be high, and for short and noisy utterances it should be low.
We show how such probabilistic embeddings can be incorporated into G-PLDA-based scoring and how the parameters of the hidden embedding affect its influence when computing likelihoods with this model. In experiments, we demonstrate how an existing neural-network (NN) embedding extractor can be used to produce not an embedding but the parameters of the embedding's probability distribution. We test the probabilistic embeddings on the speaker diarization task. The results show that this model provides well-calibrated LLR scores enabling better diarization when no development dataset is available for tuning the clustering algorithm.
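The G-PLDA scoring that both models build on can be sketched for the simplest case (a scalar two-covariance model with assumed between- and within-speaker variances B and W per dimension; not the thesis implementation): the verification score is the log-likelihood ratio between the same-speaker and different-speaker hypotheses for a pair of embeddings.

```python
import numpy as np

# Hedged sketch of Gaussian PLDA verification scoring under a scalar
# two-covariance model (B, W are assumed per-dimension between- and
# within-speaker variances; embeddings e ~ N(s, W*I), speaker s ~ N(0, B*I)).

def gplda_llr(e1, e2, B=1.0, W=0.5):
    """log p(e1,e2 | same speaker) - log p(e1,e2 | different speakers)."""
    d = e1.size
    T = B + W
    # same-speaker: per-dimension 2x2 covariance [[T, B], [B, T]]
    # different-speaker: per-dimension covariance diag(T, T)
    det_same = T * T - B * B
    a = T / det_same                    # entries of the 2x2 inverse
    b = -B / det_same
    q_same = a * (e1 @ e1 + e2 @ e2) + 2 * b * (e1 @ e2)
    q_diff = (e1 @ e1 + e2 @ e2) / T
    log_det_same = d * np.log(det_same)
    log_det_diff = d * np.log(T * T)
    return 0.5 * (q_diff - q_same) + 0.5 * (log_det_diff - log_det_same)

rng = np.random.default_rng(1)
spk = rng.normal(0, 1.0, 64)                       # latent speaker vector
same_pair = (spk + rng.normal(0, np.sqrt(0.5), 64),
             spk + rng.normal(0, np.sqrt(0.5), 64))
diff_pair = (spk + rng.normal(0, np.sqrt(0.5), 64),
             rng.normal(0, 1.0, 64) + rng.normal(0, np.sqrt(0.5), 64))

llr_same = gplda_llr(*same_pair)
llr_diff = gplda_llr(*diff_pair)
print(llr_same, llr_diff)
```

The heavy-tailed and probabilistic-embedding variants in the thesis change the distributional assumptions behind this LLR, not the hypothesis-test structure itself.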
Software environment for data assimilation in radiation protection
Majer, Peter ; Šmídl, Václav (advisor) ; Hofman, Radek (referee)
In this work, we apply data assimilation to the meteorological model WRF on a local domain. We use Bayesian statistics, namely the sequential Monte Carlo method combined with particle filtering. Only surface wind data are considered. An application written in the Python programming language is also part of this work. This application forms an interface with WRF, performs the data assimilation, and provides a set of charts as its output. Under stable wind conditions, wind predictions of the assimilated WRF are significantly closer to the measured data than those of the non-assimilated WRF. Under such conditions, the assimilated model can be used for more accurate short-term local weather predictions.
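The particle-filtering step can be illustrated with a minimal bootstrap particle filter (an illustrative one-dimensional state and a random-walk model, not the WRF interface from the thesis): particles are propagated, weighted by the likelihood of a noisy observation, and resampled.

```python
import numpy as np

# Minimal bootstrap particle filter sketch (hypothetical 1-D "wind" state,
# random-walk dynamics): assimilate noisy scalar observations.

rng = np.random.default_rng(2)
T, N = 50, 500                        # time steps, particles
q, r = 0.1, 0.5                       # process / observation noise std

# simulate a slowly varying true state and noisy observations of it
truth = np.cumsum(rng.normal(0, q, T)) + 5.0
obs = truth + rng.normal(0, r, T)

particles = rng.normal(5.0, 1.0, N)
estimates = np.empty(T)
for t in range(T):
    particles += rng.normal(0, q, N)                     # propagate
    w = np.exp(-0.5 * ((obs[t] - particles) / r) ** 2)   # likelihood weights
    w /= w.sum()
    estimates[t] = w @ particles                         # posterior mean
    idx = rng.choice(N, N, p=w)                          # multinomial resampling
    particles = particles[idx]

rmse = np.sqrt(np.mean((estimates - truth) ** 2))
obs_rmse = np.sqrt(np.mean((obs - truth) ** 2))
print(f"assimilated RMSE: {rmse:.3f}, raw observation RMSE: {obs_rmse:.3f}")
```

The filtered estimate averages information over time, which is why it tracks the truth more closely than the raw observations, mirroring the assimilated-vs-non-assimilated comparison in the abstract.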
DEnFi: Deep Ensemble Filter for Active Learning
Ulrych, Lukáš ; Šmídl, Václav
Deep ensembles have proved to be one of the most accurate representations of uncertainty for deep neural networks. Their accuracy is beneficial in active learning, where unknown samples are selected for labeling based on the uncertainty of their prediction. Underestimating the predictive uncertainty leads to poor exploration. The main issue with deep ensembles is their computational cost, since multiple complex networks have to be trained in parallel. In this paper, we propose to address this issue by exploiting the recursive nature of active learning. Specifically, we propose several methods for generating initial values of an ensemble based on the previous ensemble. We compare the proposed strategies with existing methods on benchmark problems from Bayesian optimization and active classification. The practical benefit of the approach is demonstrated on the example of learning the ID of an IoT device from structured data using deep-set-based networks.
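The ensemble-uncertainty acquisition loop described above can be sketched with tiny stand-in models (bootstrap-trained polynomial regressors in place of deep networks; the acquisition rule is the same): query the candidate where ensemble predictions disagree most.

```python
import numpy as np

# Hedged sketch of ensemble-based active learning. Polynomial least-squares
# models stand in for deep networks; the target function f is illustrative.

rng = np.random.default_rng(3)
f = lambda x: np.sin(3 * x)                  # unknown target function
pool = np.linspace(-2, 2, 201)               # candidate query points

X = list(rng.uniform(-2, 2, 4))              # small initial labeled set
y = [f(x) for x in X]

def fit_ensemble(X, y, size=10, deg=7):
    """Bootstrap-train `size` polynomial models (stand-ins for NN ensemble members)."""
    models = []
    for _ in range(size):
        idx = rng.choice(len(X), len(X), replace=True)
        models.append(np.polyfit(np.array(X)[idx], np.array(y)[idx], deg))
    return models

for step in range(15):                       # active-learning loop
    models = fit_ensemble(X, y)
    preds = np.array([np.polyval(m, pool) for m in models])
    var = preds.var(axis=0)                  # ensemble disagreement = uncertainty
    x_next = pool[np.argmax(var)]            # most uncertain candidate
    X.append(x_next)
    y.append(f(x_next))                      # "label" the queried sample

models = fit_ensemble(X, y)
mean_pred = np.mean([np.polyval(m, pool) for m in models], axis=0)
rmse = np.sqrt(np.mean((mean_pred - f(pool)) ** 2))
print(f"RMSE after active learning: {rmse:.3f}")
```

The paper's contribution concerns warm-starting each new ensemble from the previous one rather than retraining from scratch; this sketch retrains fully and only shows the acquisition side.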
Optimization using derivative-free and metaheuristic methods
Márová, Kateřina ; Tichý, Petr (advisor) ; Šmídl, Václav (referee)
Evolutionary algorithms have proved useful for tackling many practical black-box optimization problems. In this thesis, we describe one of the most powerful evolutionary algorithms of today, CMA-ES, and apply it in a novel way to the problem of tuning multiple coupled PID controllers in combustion engine models.
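The black-box controller-tuning setup can be sketched with a deliberately simplified (mu, lambda) evolution strategy (not full CMA-ES, which additionally adapts a covariance matrix) on a toy first-order plant rather than the engine models from the thesis:

```python
import numpy as np

# Simplified (5, 20) evolution strategy sketch (NOT full CMA-ES) tuning a
# single PID controller on an assumed toy plant dx/dt = -x + u.

def step_cost(gains, dt=0.01, T=5.0):
    """Integrated squared tracking error of a PID loop on a unit step."""
    Kp, Ki, Kd = gains
    x, integ, prev_e, cost = 0.0, 0.0, 1.0, 0.0
    for _ in range(int(T / dt)):
        e = 1.0 - x                          # unit step reference
        integ += e * dt
        u = Kp * e + Ki * integ + Kd * (e - prev_e) / dt
        prev_e = e
        x += (-x + u) * dt                   # explicit Euler plant update
        if not np.isfinite(x):
            return np.inf                    # unstable gains
        cost += e * e * dt
    return cost

rng = np.random.default_rng(4)
mean, sigma = np.array([1.0, 0.0, 0.0]), 0.5
for gen in range(40):
    pop = mean + sigma * rng.normal(size=(20, 3))        # sample offspring
    costs = np.array([step_cost(np.maximum(p, 0)) for p in pop])
    elite = pop[np.argsort(costs)[:5]]                   # select best 5
    mean = elite.mean(axis=0)                            # recombine
    sigma *= 0.95                                        # simple step-size decay

best_gains = np.maximum(mean, 0)
best_cost = step_cost(best_gains)
print(f"tuned gains: {best_gains}, cost: {best_cost:.4f}")
```

CMA-ES replaces the fixed isotropic sampling and decay schedule here with adaptive covariance and step-size updates, which is what makes it effective on coupled, ill-conditioned problems such as multiple interacting PID loops.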
Extending Horizon of Finite Control Set MPC of PMSM Drive with Input LC Filter using LQ Lookahead
Šmídl, Václav ; Janouš, Š. ; Peroutka, Z.
Finite control set model predictive control (FS-MPC) has been shown to be a very effective approach to the control of PMSM drives. FS-MPC is a very flexible tool, since it can evaluate an arbitrary loss function. However, designing an appropriate loss function can be a challenge, especially when the effect of the control input is visible only over a long horizon. An example where this problem becomes apparent is the main propulsion drive of a traction vehicle fed from a DC catenary. Specifically, the catenary voltage is subject to short circuits, fast changes, harmonics, and other disturbances that can vary over a very wide range. Therefore, the drive is equipped with a trolley-wire input LC filter. The filter is almost undamped by design in order to achieve maximum efficiency, and the control strategy needs to provide active damping of the filter to guarantee drive stability. While it is possible to introduce active damping terms into the loss function, it is hard to predict their properties.
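The core FS-MPC mechanism, exhaustive one-step-ahead evaluation of a loss over the finite set of inverter voltage vectors, can be sketched on a toy RL load (assumed parameters, not the PMSM with input LC filter from the paper):

```python
import numpy as np

# Hedged FS-MPC sketch: at each step, every admissible voltage vector of a
# two-level inverter is simulated one step ahead and the vector minimizing a
# current-tracking loss is applied. Toy RL load with assumed parameters.

R, L, dt = 0.5, 1e-3, 1e-5            # load resistance, inductance, sample period
Vdc = 100.0
# the 8 voltage vectors of a two-level inverter in the alpha-beta plane
vecs = [np.zeros(2)] + [
    (2 / 3) * Vdc * np.array([np.cos(k * np.pi / 3), np.sin(k * np.pi / 3)])
    for k in range(6)
]

def predict(i, v):
    """One-step Euler prediction of the RL-load current: di/dt = (v - R*i)/L."""
    return i + dt * (v - R * i) / L

i = np.zeros(2)
i_ref = np.array([5.0, 0.0])          # current reference (alpha, beta)
for _ in range(200):                  # FS-MPC loop
    losses = [np.sum((predict(i, v) - i_ref) ** 2) for v in vecs]
    v_best = vecs[int(np.argmin(losses))]   # exhaustive search over the set
    i = predict(i, v_best)            # apply the winning vector

err = np.linalg.norm(i - i_ref)
print(f"tracking error after 200 steps: {err:.3f}")
```

Because any term can be added to `losses`, the loss function is arbitrary, which is the flexibility the abstract refers to; the paper's difficulty is that active-damping terms for the LC filter only pay off over horizons much longer than this one-step lookahead.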
Data assimilation methods used in the ASIM module
Šmídl, Václav ; Hofman, Radek ; Pecha, Petr
A comprehensive review of data assimilation methods for improving model predictions of the consequences of a radiation accident. Methods based on numerical optimization techniques and sequential Monte Carlo techniques are implemented. Applications in the field of consequence assessment are presented.
Convolution Model of Time-activity Curves in Blind Source Separation
Tichý, Ondřej ; Šmídl, Václav
Availability of the input and organ functions is a prerequisite for the analysis of dynamic image sequences in scintigraphy and positron emission tomography (PET) via kinetic models. In PET, the input function can be measured directly by sampling the arterial blood. This invasive procedure can be replaced by extracting the input function from the observed images. The standard procedure for the extraction is based on manual selection of a region of interest (ROI), which is user-dependent and inaccurate. The aim of our contribution is to demonstrate a new procedure for the simultaneous estimation of the input and organ functions from the observed image sequence. We design a mathematical model that integrates all common assumptions of the domain, including the convolution of the input function with tissue-specific kernels. The input function as well as the kernel parameters are considered unknown. They are estimated from the observed images using the Variational Bayes method.
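The forward model assumed by such methods can be sketched as follows (a generic single-exponential tissue kernel and an illustrative gamma-variate-like input function; the Variational Bayes estimation itself is not reproduced here): each organ's time-activity curve is the input function convolved with a tissue-specific kernel.

```python
import numpy as np

# Hedged sketch of the convolution model of time-activity curves (TACs).
# Illustrative input function and exponential kernels with assumed rates.

dt = 1.0                              # frame duration (assumed, seconds)
t = np.arange(0, 60, dt)

# gamma-variate-like input function (arterial blood activity, illustrative)
b = (t / 10.0) ** 2 * np.exp(-t / 5.0)

def tac(b, rate, dt):
    """Time-activity curve: discrete convolution of input b with exp(-rate*t)."""
    kernel = np.exp(-rate * t)
    return np.convolve(b, kernel)[: t.size] * dt

tac_fast = tac(b, rate=0.5, dt=dt)    # fast-washout tissue
tac_slow = tac(b, rate=0.05, dt=dt)   # slow-washout tissue

# slower washout accumulates more activity late in the sequence
print(tac_slow[-1], tac_fast[-1])
```

In the blind-source-separation setting of the paper, both `b` and the kernel rates are unknown and are inferred jointly from the observed image sequence.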
