National Repository of Grey Literature: 13 records found (showing 1 - 10)
Stochastic Catastrophe Model Cusp
Voříšek, Jan ; Vošvrda, Miloslav (advisor) ; Lukáš, Ladislav (referee) ; Lachout, Petr (referee)
Title: Stochastic Catastrophe Model Cusp Author: Jan Voříšek Department: Department of Probability and Mathematical Statistics Supervisor: Prof. Ing. Miloslav Vošvrda, CSc., Czech Academy of Sciences, Institute of Information Theory and Automation Abstract: The goal of this thesis is to analyze the stochastic cusp model. The task is divided into two main topics. The first concentrates on the stationary density of the cusp model and statistical testing of its bimodality, where the power and size of the proposed tests are simulated and compared with the dip test of unimodality. The second main topic deals with the transition density of the stochastic cusp model. A comparison of the approximate maximum likelihood approach with traditional finite-difference and numerical-simulation methods indicates its advantage in terms of estimation speed. An approximate Fisher information matrix of a general stochastic process is derived. An application of the cusp model with time-varying parameters to exchange-rate data is estimated, an extension of the cusp model into a stochastic bimodality model is proposed, and a measure of the probability of an intrinsic crash of the cusp model is suggested. Keywords: stochastic cusp model, bimodality testing, transition density approximation
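The stationary density referred to in this abstract has a closed form under the usual parameterization of the stochastic cusp SDE, dX_t = (alpha + beta*X_t - X_t^3) dt + sigma dW_t. The sketch below evaluates that (unnormalized) density and checks bimodality by counting modes on a grid; it is a minimal illustration of the standard Cobb-style cusp density, not the thesis's own estimation code, and the parameter values and grid are assumptions.

```python
import numpy as np

def cusp_stationary_density(x, alpha, beta, sigma=1.0):
    """Unnormalized stationary density of dX = (alpha + beta*X - X**3) dt + sigma dW.

    For this drift the stationary density is proportional to
    exp(2 * (alpha*x + beta*x**2/2 - x**4/4) / sigma**2).
    """
    potential = alpha * x + 0.5 * beta * x**2 - 0.25 * x**4
    return np.exp(2.0 * potential / sigma**2)

def is_bimodal(alpha, beta, sigma=1.0, grid=None):
    """Crude bimodality check: count interior local maxima of the density on a grid."""
    if grid is None:
        grid = np.linspace(-3.0, 3.0, 2001)
    f = cusp_stationary_density(grid, alpha, beta, sigma)
    interior_maxima = (f[1:-1] > f[:-2]) & (f[1:-1] > f[2:])
    return int(interior_maxima.sum()) >= 2

if __name__ == "__main__":
    # Inside the cusp region (beta large, alpha small) the density has two modes.
    print(is_bimodal(alpha=0.0, beta=2.0))   # True  (bimodal)
    print(is_bimodal(alpha=0.0, beta=-1.0))  # False (unimodal)
```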
Statistical Approaches to Short-term Electricity Forecasting
Kellová, Andrea ; Kočenda, Evžen (advisor) ; Vošvrda, Miloslav (referee) ; Lukáš, Ladislav (referee)
Andrea Kellová, Statistical Approaches to Short-term Electricity Forecasting. Abstract in Czech: document not found.
Pricing and modeling credit risk
Kolman, Marek ; Witzany, Jiří (advisor) ; Kodera, Jan (referee) ; Lukáš, Ladislav (referee)
The thesis covers a wide range of topics in credit risk modeling, with the emphasis on the pricing of claims subject to default risk. Starting with a separate, general contingent claim pricing framework, the key topics are classified into three fundamental parts: firm-value models, reduced-form models and portfolio problems, with a finer sub-classification where appropriate. Every part provides a theoretical discussion, proposals of self-developed methodologies and related applications designed to be close to real-world problems. The text also presents several new findings from various fields of credit risk modeling. In particular, it is shown (i) that the stock option market is a good source of credit information, (ii) how the reduced-form modeling framework can be extended to capture more complicated problems, and (iii) that the double-t copula, together with a self-developed portfolio modeling framework, outperforms the classical Gaussian copula approaches. Many other partial findings are presented in the relevant chapters, and some further results are discussed in the Appendix.
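As a rough illustration of the copula comparison mentioned in the abstract, the sketch below simulates portfolio default counts under a one-factor Gaussian copula and a one-factor double-t copula (Student-t factor and idiosyncratic terms). It is a generic textbook setup with made-up parameters, not the author's self-developed portfolio framework.

```python
import numpy as np
from scipy import stats

def simulate_default_counts(n_obligors=100, pd=0.02, rho=0.2, n_sims=20000,
                            copula="gaussian", df=4, seed=0):
    """Simulate portfolio default counts under a one-factor copula.

    Latent variable: X_i = sqrt(rho)*M + sqrt(1-rho)*Z_i; obligor i defaults
    when X_i falls below the quantile matching its unconditional PD.
    """
    rng = np.random.default_rng(seed)
    if copula == "gaussian":
        m = rng.standard_normal((n_sims, 1))
        z = rng.standard_normal((n_sims, n_obligors))
        x = np.sqrt(rho) * m + np.sqrt(1 - rho) * z
        threshold = stats.norm.ppf(pd)
    elif copula == "double-t":
        scale = np.sqrt((df - 2) / df)               # unit-variance t factors
        m = scale * rng.standard_t(df, (n_sims, 1))
        z = scale * rng.standard_t(df, (n_sims, n_obligors))
        x = np.sqrt(rho) * m + np.sqrt(1 - rho) * z
        # The marginal of X is not Student-t, so take the threshold from the
        # simulated marginal to keep the unconditional PD at the target level.
        threshold = np.quantile(x, pd)
    else:
        raise ValueError("unknown copula")
    return (x < threshold).sum(axis=1)

if __name__ == "__main__":
    for cop in ("gaussian", "double-t"):
        counts = simulate_default_counts(copula=cop)
        print(cop, "99.9% default count:", np.quantile(counts, 0.999))
```

The heavier joint tails of the double-t factor structure typically show up as a fatter right tail of the default-count distribution, which is the kind of difference such a comparison is meant to expose.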
Consequences of assumption violations of selected statistical methods
Marcinko, Tomáš ; Blatná, Dagmar (advisor) ; Malá, Ivana (referee) ; Lukáš, Ladislav (referee)
Classical parametric methods of statistical inference and hypothesis testing are derived under fundamental theoretical assumptions, which may or may not be met in real-world applications. These methods are nevertheless routinely used even when their underlying assumptions are violated, with the argument that they are quite insensitive to such violations. Moreover, alternative nonparametric or rank tests are often overlooked, mostly because they are deemed to be less powerful than parametric methods. The aim of the dissertation is therefore to describe the consequences of assumption violations for classical one-sample and two-sample statistical methods and to provide a consistent and comprehensive comparison of parametric, nonparametric and robust statistical techniques, based on an extensive simulation study and focused mostly on violations of the normality assumption and the presence of heteroscedasticity. The results of the simulation study confirmed that the classical parametric methods are relatively robust, with some reservations in the case of outlying observations, where traditional methods may fail. On the other hand, the study clearly showed that the classical parametric methods lose their optimal properties when the underlying assumptions are violated. In many cases of non-normality the appropriate nonparametric and rank-based methods are more powerful, so the claim that these methods are unproductive because of a lack of power can be considered a crucial mistake. The choice of the most appropriate distribution-free method, however, generally depends on the particular form of the underlying distribution.
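The kind of power comparison described here can be reproduced on a toy scale. The sketch below estimates, by Monte Carlo, the power of the two-sample t-test and the Wilcoxon rank-sum (Mann-Whitney) test under normal and heavy-tailed data; the sample sizes, effect size and distributions are arbitrary illustrations, not the dissertation's simulation design.

```python
import numpy as np
from scipy import stats

def empirical_power(sampler, shift=0.5, n=30, n_sims=2000, alpha=0.05, seed=1):
    """Monte Carlo power of the two-sample t-test and the Wilcoxon rank-sum test."""
    rng = np.random.default_rng(seed)
    reject_t = reject_w = 0
    for _ in range(n_sims):
        x = sampler(rng, n)
        y = sampler(rng, n) + shift              # location-shift alternative
        if stats.ttest_ind(x, y).pvalue < alpha:
            reject_t += 1
        if stats.mannwhitneyu(x, y, alternative="two-sided").pvalue < alpha:
            reject_w += 1
    return reject_t / n_sims, reject_w / n_sims

if __name__ == "__main__":
    normal = lambda rng, n: rng.standard_normal(n)
    heavy = lambda rng, n: rng.standard_t(3, n)  # heavy-tailed data
    print("normal data (t-test, Wilcoxon):", empirical_power(normal))
    print("t(3) data   (t-test, Wilcoxon):", empirical_power(heavy))
```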
Search of the most suitable method of estimation of output gap for the Czech economy
Kloudová, Dana ; Brožová, Dagmar (advisor) ; Mirvald, Michal (referee) ; Lukáš, Ladislav (referee)
In their monetary policy decisions, central banks use the output gap to keep macroeconomic variables at their natural levels. A substantial disadvantage of this variable is that it is unobservable and very difficult to measure, although it can be estimated with various methods. This thesis aims to find the most suitable estimation method for the Czech economy. Thirteen methods were chosen for this purpose: linear trend, quadratic trend, HP filter, band-pass filters, robust trend, univariate unobserved component model, two types of production function, two SVAR models, multivariate HP filter and multivariate unobserved component model. The estimations showed that the estimated trajectories of the unobservable states were not identical. To select the most suitable method, quantitative criteria (the ability to forecast inflation and output growth, and data revisions by selected national and international organisations) and qualitative criteria (properties of the estimation methods, transparency and ease of application) were used, with emphasis placed on the quantitative criteria. The results of the thesis show that the most suitable method for estimating the output gap for the Czech economy is the multivariate unobserved component model.
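For instance, the simplest of the listed detrending methods, the HP filter, takes only a few lines. The sketch below applies statsmodels' hpfilter to a synthetic quarterly log-GDP series to obtain a trend and a cyclical component (the output-gap proxy); the series is made up and the smoothing parameter 1600 is the conventional quarterly choice, not a value taken from the thesis.

```python
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

# Synthetic quarterly log-GDP: trend growth plus a persistent cyclical component.
rng = np.random.default_rng(42)
t = np.arange(120)
log_gdp = 0.005 * t + 0.02 * np.sin(2 * np.pi * t / 32) + 0.003 * rng.standard_normal(120)

# HP filter with the standard quarterly smoothing parameter lambda = 1600.
cycle, trend = hpfilter(log_gdp, lamb=1600)

# The cyclical component (in log points) serves as a simple output-gap estimate.
print("estimated output gap, last 4 quarters (%):", np.round(100 * cycle[-4:], 2))
```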
Oceňování derivátů v postkrizovém období / Post-crisis Valuation of Derivatives
Baran, Jaroslav ; Witzany, Jiří (advisor) ; Mandel, Martin (referee) ; Lukáš, Ladislav (referee)
In this study we analyse the relationship between the classical approach to the valuation of linear interest rate derivatives and the post-crisis approach, in which the valuation better reflects credit and liquidity risk and the economic costs of the transaction on top of the risk-free rate. We discuss collateralization as a method of diminishing counterparty credit risk, its impact on derivatives pricing, and how overnight indexed swap (OIS) rates became the market standard for discounting future derivatives' cash flows. We show that using one yield curve both to estimate the forward rates and to discount the expected future cash flows is no longer possible in an arbitrage-free market. We review in detail three fundamental interest rate derivatives (interest rate swap, basis swap and cross-currency swap) and derive the discount factors used for calculating the present value of expected future cash flows consistently with market quotes. We also investigate the drivers behind basis spreads, in particular credit and liquidity risk and supply and demand forces, and show how they affect the valuation of derivatives. We analyse Czech swap rates and propose an estimation of the CZK OIS curve and of approximate discount rates in the case of cross-currency swaps. Finally, we discuss inflation markets and the consistent valuation of inflation swaps.
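The discount factors referred to here can be bootstrapped from quoted par OIS rates. The sketch below does this for annual-pay fixed legs under simplifying assumptions (annual cash flows, unit year fractions, no payment lag); the quoted rates are made up for illustration and the derivation is the generic textbook bootstrap, not the study's own curve construction.

```python
def bootstrap_ois_discount_factors(par_rates):
    """Bootstrap discount factors from annual-pay par OIS swap rates.

    For a par swap of maturity n years with fixed rate s_n:
        s_n * sum_{i=1..n} D_i = 1 - D_n,
    which gives D_n = (1 - s_n * sum_{i<n} D_i) / (1 + s_n).
    Assumes annual fixed payments, unit year fractions and no payment lag.
    """
    discount_factors = []
    for s_n in par_rates:
        annuity = sum(discount_factors)          # sum of D_1 .. D_{n-1}
        d_n = (1.0 - s_n * annuity) / (1.0 + s_n)
        discount_factors.append(d_n)
    return discount_factors

if __name__ == "__main__":
    # Hypothetical 1Y-5Y OIS par rates (illustrative, not market data).
    quotes = [0.010, 0.012, 0.014, 0.016, 0.018]
    for year, df in enumerate(bootstrap_ois_discount_factors(quotes), start=1):
        print(f"{year}Y discount factor: {df:.6f}")
```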
Bayesovský odhad DSGE modelů / Bayesian Estimation of DSGE Models
Bouda, Milan ; Pánková, Václava (advisor) ; Kodera, Jan (referee) ; Lukáš, Ladislav (referee)
The thesis is dedicated to the Bayesian estimation of DSGE models. Firstly, the history of DSGE modeling is outlined, as well as the development of this macroeconometric field in the Czech Republic and in the rest of the world. Secondly, a comprehensive DSGE framework is described in detail, so that an arbitrary DSGE model can be specified and estimated within it. The thesis contains two empirical studies. The first describes the derivation of a New Keynesian DSGE model and its estimation using Bayesian techniques; the model is estimated with three different Taylor rules and the best-performing rule is identified using Bayesian model comparison. The second study deals with the development of a small open economy model with a housing sector. It builds on a previous study that specifies the model as a closed economy; I extend it with open-economy features and a government sector. The Czech Republic is generally considered a small open economy, and these extensions make the model more applicable to it. The model contains two types of households. The first type has access to capital markets and can smooth consumption over time by buying or selling financial assets; these households follow the permanent income hypothesis (PIH). The second type uses rule-of-thumb (ROT) consumption, spending all of its income on consumption. The other agents in the economy are specified in the standard way. The outcomes of the study focus mainly on the behaviour of house prices: the main outputs, such as Bayesian impulse response functions, Bayesian predictions and the shock decomposition, concentrate on this variable. At the end of the study a macro-prudential experiment is performed, answering the question of whether a higher or lower loan-to-value (LTV) ratio is better for the Czech Republic. The experiment is conclusive and shows that the level of the LTV ratio does not affect GDP, whereas house prices are very sensitive to it. The recommendation for the Czech National Bank can therefore be summarized as follows: in order to keep house prices less volatile, implement a lower rather than a higher LTV ratio.
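The Bayesian estimation described here typically relies on a random-walk Metropolis-Hastings sampler over the DSGE parameters, with the likelihood evaluated by the Kalman filter on the linearized model. The sketch below shows only the generic sampler applied to a toy log-posterior; the target, proposal scale and parameter names are placeholders, not the thesis's model or tuning.

```python
import numpy as np

def random_walk_metropolis(log_posterior, theta0, n_draws=5000, step=0.1, seed=0):
    """Generic random-walk Metropolis sampler for a log-posterior density."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    current_lp = log_posterior(theta)
    draws = np.empty((n_draws, theta.size))
    accepted = 0
    for i in range(n_draws):
        proposal = theta + step * rng.standard_normal(theta.size)
        proposal_lp = log_posterior(proposal)
        # Accept with probability min(1, exp(proposal_lp - current_lp)).
        if np.log(rng.uniform()) < proposal_lp - current_lp:
            theta, current_lp = proposal, proposal_lp
            accepted += 1
        draws[i] = theta
    return draws, accepted / n_draws

if __name__ == "__main__":
    # Toy target: standard bivariate normal posterior (stand-in for a DSGE posterior).
    log_post = lambda th: -0.5 * np.dot(th, th)
    draws, acceptance = random_walk_metropolis(log_post, theta0=[0.0, 0.0], step=0.5)
    print("acceptance rate:", round(acceptance, 2))
    print("posterior mean estimate:", np.round(draws[2000:].mean(axis=0), 2))
```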
Capital Asset Price Modelling: Concept VAPM
Kuklik, Robert G. ; Janda, Karel (advisor) ; Kodera, Jan (referee) ; Lukáš, Ladislav (referee)
The key objective of this thesis is to outline an alternative capital market modeling framework, the Volatility Asset Pricing Model (VAPM), inspired by the innovative dual approach of Mandelbrot and Hudson and based on the synthesis of two seemingly antagonistic factors: the volatility of market prices and their serial dependence, which together determine the dynamics of capital markets. Pilot tests of this model over various periods, using a market index as well as a portfolio of selected securities, delivered generally satisfactory results. Firstly, the work gives a brief recapitulation of the concepts of consumer/investor choice under the general conditions of hypothetical certainty. Secondly, this outline is followed by a description of the "classical" methodologies for the risky environment of uncertainty, with an assessment of their key models, i.e. the CAPM, SIM, MIM, APTM, etc., together with the results of the related testing approaches. Thirdly, this assessment is based on an evaluation of the underlying doctrine of the Efficient Market Hypothesis in relation to the so-called Random Walk Model. Fourthly, in this context the work also offers a brief exposure to a few selected tests of these controversial concepts. Fifthly, the main points of contemporary approaches such as the fractal dimension and the Hurst exponent, set in the dynamic framework of information entropy, are described as the theoretical tools leading to the development of the VAPM. The major contribution of this thesis is its attempt to apply these concepts in practice, with the intention of inspiring further analytical research.
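The Hurst exponent mentioned in the abstract can be estimated with classical rescaled-range (R/S) analysis. The sketch below is a bare-bones version applied to simulated returns, with assumed window sizes, and is meant only to illustrate the statistic, not the VAPM methodology itself.

```python
import numpy as np

def hurst_rs(returns, window_sizes=(8, 16, 32, 64, 128)):
    """Estimate the Hurst exponent of a return series by rescaled-range analysis.

    For each window size n, the series is cut into blocks of length n; in each
    block R/S = (range of cumulative mean-adjusted sums) / (standard deviation).
    The Hurst exponent is the slope of log(mean R/S) against log(n).
    """
    returns = np.asarray(returns, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_values = []
        for start in range(0, len(returns) - n + 1, n):
            block = returns[start:start + n]
            deviations = np.cumsum(block - block.mean())
            block_range = deviations.max() - deviations.min()
            block_std = block.std(ddof=1)
            if block_std > 0:
                rs_values.append(block_range / block_std)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_values)))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    iid_returns = rng.standard_normal(4096)
    # For i.i.d. returns the estimate should be near 0.5, up to small-sample bias.
    print("Hurst exponent of i.i.d. returns:", round(hurst_rs(iid_returns), 2))
```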
Alternativní přístup k řízení rizik v bankovnictví za použití analýzy obalu dat / An Alternative Approach to Risk Management in Banking Using Data Envelopment Analysis
Fialová, Zuzana ; Jablonský, Josef (advisor) ; Lukáš, Ladislav (referee) ; Brezina, Ivan (referee)
The implementation of the Basel II capital adequacy framework promoted internally modelled risk parameters and allowed banks to build their own models. The recent crisis exposed gaps in the Basel II Accord, with banks struggling to cope with a lack of liquidity and higher default rates. The minimum regulatory capital held by the banks turned out to be insufficient, and banks started looking for other techniques to better quantify the risks they are exposed to. Model accuracy is a key objective in meeting the capital adequacy requirements under severe economic conditions. The purpose of this thesis is to suggest a new approach to credit modelling. Data envelopment analysis (DEA) can overcome some of the difficulties that banks deal with. The key advantage of DEA and its modifications is that the method does not require prior information about the classification of units as good or bad; it only requires financial and other data about the client in question. The thesis analyses the performance of DEA applied to a real-world portfolio of corporate loans and compares it with two standard methods used in the banking sector. Logistic regression is the most popular method, having few restrictions and providing output in the form of a probability of default. The second method is discriminant analysis, which gives results similar to logistic regression but is based on more assumptions. The model is validated by comparing the model output with the actual default status, and its predictive power is evaluated.
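For reference, the basic input-oriented CCR DEA efficiency score used in this kind of analysis can be computed as a small linear program. The sketch below uses scipy's linprog on a tiny made-up dataset of clients (inputs and outputs chosen arbitrarily); it is the generic envelopment-form DEA model, not the thesis's modified approach.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, unit):
    """Input-oriented CCR efficiency of one decision-making unit (envelopment form).

    minimize theta  subject to
        sum_j lambda_j * x_j <= theta * x_unit   (inputs)
        sum_j lambda_j * y_j >= y_unit           (outputs)
        lambda_j >= 0
    Decision vector: [theta, lambda_1, ..., lambda_n].
    """
    x = np.asarray(inputs, dtype=float)    # shape (n_units, n_inputs)
    y = np.asarray(outputs, dtype=float)   # shape (n_units, n_outputs)
    n_units = x.shape[0]
    c = np.r_[1.0, np.zeros(n_units)]      # objective: minimize theta
    # Input constraints: lambda' x_i - theta * x_unit,i <= 0
    a_in = np.hstack([-x[unit].reshape(-1, 1), x.T])
    b_in = np.zeros(x.shape[1])
    # Output constraints: -lambda' y_r <= -y_unit,r
    a_out = np.hstack([np.zeros((y.shape[1], 1)), -y.T])
    b_out = -y[unit]
    res = linprog(c, A_ub=np.vstack([a_in, a_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n_units, method="highs")
    return res.fun

if __name__ == "__main__":
    # Hypothetical clients: inputs (debt, costs), output (revenue); illustrative only.
    X = [[20, 30], [40, 15], [30, 30], [50, 40]]
    Y = [[100], [90], [80], [70]]
    for k in range(len(X)):
        print(f"client {k}: CCR efficiency = {ccr_efficiency(X, Y, k):.3f}")
```

A score of 1 marks a client on the efficient frontier; scores below 1 indicate how much the inputs could be proportionally reduced while keeping the outputs, which is the signal such credit applications of DEA interpret.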

See also: similar author names
2 Lukáš, Lubomír