National Repository of Grey Literature: 26 records found, displaying records 17-26.
The use of coherent risk measures in operational risk modeling
Lebovič, Michal ; Teplý, Petr (advisor) ; Doležel, Pavel (referee)
The debate on quantitative operational risk modeling only started at the beginning of the last decade, and best practices are still far from established. Estimation of capital requirements for operational risk under the Advanced Measurement Approaches of Basel II is critically dependent on the choice of risk measure, which quantifies the risk exposure based on the underlying simulated distribution of losses. Despite its well-known caveats, Value-at-Risk remains the predominant risk measure used in the context of operational risk management. We describe several serious drawbacks of Value-at-Risk and explain why it can lead to misleading conclusions. As a remedy we suggest the use of coherent risk measures, specifically the statistic known as Expected Shortfall, as a suitable alternative or complement for quantification of operational risk exposure. We demonstrate that the application of Expected Shortfall in operational loss modeling is feasible and produces reasonable and consistent results. We also consider a variety of statistical techniques for modeling the underlying loss distribution and find the extreme value theory framework the most suitable for this purpose. Using stress tests we further compare the robustness and consistency of selected models and their implied risk capital estimates...
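The contrast between the two risk measures discussed in this abstract can be illustrated with a short simulation. This is a generic sketch on toy lognormal losses, not the thesis's actual model; the 99.9% level and the lognormal parameters are illustrative assumptions.

```python
import random


def var_es(losses, alpha=0.999):
    """Empirical Value-at-Risk (quantile) and Expected Shortfall
    (average loss beyond that quantile) at confidence level alpha."""
    xs = sorted(losses)
    k = int(alpha * len(xs))      # index of the alpha-quantile
    var = xs[k]
    tail = xs[k:]                 # losses at or beyond VaR
    es = sum(tail) / len(tail)    # ES averages the whole tail, so it
    return var, es                # reacts to tail severity, unlike VaR


random.seed(1)
# toy heavy-tailed operational losses (lognormal severities)
losses = [random.lognormvariate(10, 2) for _ in range(50_000)]
var, es = var_es(losses)
```

By construction ES is at least as large as VaR at the same level, and unlike VaR it is subadditive, which is the coherence property the abstract appeals to.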
Unemployment Duration in the Czech Republic Through the Lens of Survival Analysis
Čabla, Adam ; Malá, Ivana (advisor) ; Komárková, Lenka (referee) ; Popelka, Jan (referee)
The aim of the presented thesis is to apply methods of survival analysis to data from the Labour Force Survey, which are interval-censored. For this type of data I use methods specifically designed to handle them, in particular the Turnbull estimate, the weighted log-rank test and the AFT model. Another objective of the work is the design, application and interpretation of a methodology for modelling unemployment duration as a function of the available factors. A further aim is to evaluate the evolution of the probability distribution of unemployment duration, and, last but not least, to obtain a more accurate estimate of the tail using extreme value theory. The main contributions of the thesis include a methodology, based on standard techniques of survival analysis, for examining data from the Labour Force Survey. Since the data are internationally comparable, the methodology is applicable across European Union countries and several others. Further contributions of this work are the estimation of the parameters of the generalized Pareto distribution from interval-censored data and the creation and comparison of models with piecewise-connected distribution functions, including a solution of the connection problem. The work also produced empirical results, the most important of which are the comparison of three different data approaches and the specific relationships between selected factors and the time to find a job, i.e. the unemployment spell.
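The Turnbull estimate mentioned in the abstract is a self-consistency (EM) iteration for interval-censored observations. The following is a minimal pure-Python sketch on toy unemployment spells; the simplification of placing probability mass only on observed right endpoints is ours, not the thesis's.

```python
def turnbull(intervals, n_iter=200):
    """Self-consistency (EM) estimate of an event-time distribution
    from interval-censored data (l, r]; simplified to put mass on
    the observed right endpoints only."""
    support = sorted({r for _, r in intervals})
    p = [1.0 / len(support)] * len(support)
    n = len(intervals)
    for _ in range(n_iter):
        new = [0.0] * len(support)
        for l, r in intervals:
            # spread this observation's mass over admissible support points
            idx = [j for j, s in enumerate(support) if l < s <= r]
            tot = sum(p[j] for j in idx)
            for j in idx:
                new[j] += p[j] / (tot * n)
        p = new
    return support, p


# toy interval-censored unemployment spells (months between survey waves)
spells = [(0, 3), (0, 3), (3, 6), (3, 12), (6, 12), (0, 6)]
support, p = turnbull(spells)
```

Each iteration redistributes every observation's unit mass over the support points its interval admits, in proportion to the current estimate; the fixed point is the nonparametric MLE on this support.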
Pricing of Excess-of-Loss Reinsurance in Non-life Insurance
Hrevuš, Jan ; Marek, Luboš (advisor) ; Cipra, Tomáš (referee) ; Zimmermann, Pavel (referee)
Probably the most frequent definition of reinsurance is that it is insurance for insurance companies: through reinsurance the cedant (insurance company) cedes part of its risk to the reinsurer. Reinsurance nowadays plays a crucial role in the insurance industry, as it not only reduces the reinsured's exposure but can also significantly reduce the required solvency capital. In the past few decades various approaches to actuarial reinsurance modelling have been published, and many actuaries now specialize exclusively in reinsurance. The thesis provides an overview of the actuarial aspects of modelling a non-life excess-of-loss reinsurance structure, per risk and, for motor third-party liability, per event; to the author's knowledge no study of such wide scope exists, and the various aspects must otherwise be pieced together from fragmented articles published worldwide. The thesis is based on recent industry literature describing the latest trends and methodologies, and the theory is compared with practice, as the author has working experience from underwriting at a CEE reinsurer and actuarial reinsurance modelling at a global reinsurance broker. The sequence of topics corresponds to the steps taken by an actuary modelling reinsurance, and each step is discussed in detail. Starting with data preparation, loss inflation and several individual claims development methods are introduced, and the author's own probabilistic model is constructed. Next, burning cost analysis and probabilistic rating focused on heavy-tailed distributions are discussed. Special attention is given to exposure rating, a discipline not commonly known among actuaries outside the reinsurance industry, and different methodologies for property and casualty exposure modelling are introduced, including many best-practice suggestions.
All the main approaches to reinsurance modelling are also illustrated on either real or realistic-looking data, similar to those provided by European insurance companies to their reinsurers during renewal periods.
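The burning cost analysis mentioned above can be sketched in a few lines. This is the generic textbook calculation on invented figures, not data from the thesis; layer notation "1m xs 1m" means a limit of 1m above a 1m deductible.

```python
def layer_loss(x, deductible, limit):
    """Loss ceded to an excess-of-loss layer 'limit xs deductible'."""
    return min(max(x - deductible, 0.0), limit)


def burning_cost(yearly_losses, deductible, limit, premiums):
    """Historical (burning-cost) rate: ceded losses over subject premium."""
    ceded = sum(layer_loss(x, deductible, limit)
                for year in yearly_losses for x in year)
    return ceded / sum(premiums)


# toy experience: individual large losses per year and subject premiums
years = [[1.2e6, 0.4e6], [2.5e6], [0.8e6, 3.0e6, 0.2e6]]
rate = burning_cost(years, deductible=1.0e6, limit=1.0e6,
                    premiums=[10e6, 11e6, 12e6])
```

In practice the historical losses would first be indexed for inflation and developed to ultimate, which is exactly the sequence of preparatory steps the abstract describes.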
Modelling of Extreme Values
Shykhmanter, Dmytro ; Malá, Ivana (advisor) ; Luknár, Ivan (referee)
Modeling of extreme events is a challenging statistical task. Firstly, there is always a limited number of observations, and secondly, there is therefore little experience against which to back-test the results. One way of estimating higher quantiles is to fit a theoretical distribution to the data and extrapolate into the tail. The shortcoming of this approach is that the tail estimate is then driven by the observations in the center of the distribution. An alternative approach is based on the idea of splitting the data into two sub-populations and modeling the body of the distribution separately from the tail. This methodology is applied to non-life insurance losses, where extremes are particularly important for risk management. Nevertheless, even this approach is not a conclusive solution to heavy-tail modeling. In either case, the estimated 99.5th percentiles have such high standard errors that their reliability is very low. On the other hand, the approach is theoretically sound and deserves to be considered as one of the possible methods of extreme value analysis.
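The body-tail split described above is usually implemented as a spliced density: a truncated distribution below a threshold u and a generalized Pareto tail above it. A minimal sketch with a lognormal body follows; the body weight w and all parameter values used below are illustrative assumptions, not the thesis's fitted values.

```python
import math


def lognorm_pdf(x, mu, sigma):
    return math.exp(-(math.log(x) - mu) ** 2 / (2 * sigma ** 2)) \
        / (x * sigma * math.sqrt(2 * math.pi))


def lognorm_cdf(x, mu, sigma):
    return 0.5 * (1 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2))))


def gpd_pdf(y, xi, beta):
    """Generalized Pareto density for the excess y = x - u (xi != 0)."""
    return (1 / beta) * (1 + xi * y / beta) ** (-1 / xi - 1)


def spliced_pdf(x, u, w, mu, sigma, xi, beta):
    """Lognormal body truncated at u (total weight w),
    GPD tail above u (weight 1 - w)."""
    if x <= u:
        return w * lognorm_pdf(x, mu, sigma) / lognorm_cdf(u, mu, sigma)
    return (1 - w) * gpd_pdf(x - u, xi, beta)
```

The weight w would typically be set to the empirical proportion of losses below u, and continuity of the density at u can be imposed as an additional constraint.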
A Study of the Theoretical Predictability of Extremal Distributions for Natural Disasters
Sabolová, Radka ; Zimmermann, Pavel (advisor) ; Kladívko, Kamil (referee)
The thesis deals with natural disasters from the statistical point of view and treats them as extremal observations. The basics of classical extreme value theory are summarized and a new approach based on the maximum entropy principle is proposed. Both methods are used to analyze real discharge data observed on the Vltava river.
Exchange Market Pressure: Measurement Using Extreme Value Theory
Zuzáková, Barbora ; Mandel, Martin (advisor) ; Benecká, Soňa (referee)
This thesis discusses the phenomenon of currency crises; in particular, it is devoted to the empirical identification of crisis periods. As a crisis indicator we utilize the exchange market pressure index, which has proved a very powerful tool for quantifying pressure on the exchange market. Since the construction of the index is crucial for the subsequent analysis, we pay special attention to the different approaches to building it. In the majority of the existing literature on exchange market pressure models, a currency crisis is defined as a period in which the index exceeds a predetermined level. In contrast, we adopt a probabilistic approach using extreme value theory. Our goal is to show that stochastic methods are more accurate, i.e. more reliable, instruments for crisis identification. We illustrate the application of the proposed method on a sample of four Central European countries, namely the Czech Republic, Hungary, Poland and Slovakia, over the period 1993-2012, or 1993-2008 respectively. The choice of sample is motivated by the fact that these countries underwent transition reforms to market economies at the beginning of the 1990s and could therefore have been exposed to speculative attacks on their newly established currencies. They are often assumed to form a relatively homogeneous group of countries at a similar stage of the integration process, so a resembling development of exchange market pressure, particularly during the last third of the estimation period, would not be surprising.
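A common construction of the exchange market pressure index, the precision-weighted sum of depreciation, reserve losses and interest-rate changes, can be sketched as follows; the weighting scheme is one of several found in the literature, and the series are invented for illustration.

```python
import statistics


def emp_index(d_fx, d_res, d_ir):
    """Exchange market pressure: components weighted by the inverse of
    their standard deviations; reserve gains reduce pressure."""
    w_fx, w_res, w_ir = (1 / statistics.stdev(s) for s in (d_fx, d_res, d_ir))
    return [w_fx * e - w_res * r + w_ir * i
            for e, r, i in zip(d_fx, d_res, d_ir)]


# toy monthly series with a speculative-attack episode in the last period
d_fx = [0.00, 0.01, -0.01, 0.00, 0.01, 0.12]    # relative depreciation
d_res = [0.01, 0.00, 0.02, -0.01, 0.00, -0.25]  # relative reserve change
d_ir = [0.00, 0.00, -0.01, 0.00, 0.01, 0.06]    # interest-rate change
emp = emp_index(d_fx, d_res, d_ir)
```

The threshold-based definition the abstract contrasts itself with would flag a crisis whenever the index exceeds, say, its mean plus two standard deviations; the EVT alternative instead models the upper tail of the index directly.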
Electricity Markets and Modelling in Risk Management
Paholok, Igor ; Málek, Jiří (advisor) ; Kodera, Jan (referee) ; Budinský, Petr (referee)
The main aim of this thesis is to summarize and explain the specifics of power markets and to test the application of models that can be used especially in the risk management area. The thesis starts with a definition of market participants, a typology of traded contracts and a description of market development, with a focus on the Czech Republic. It continues with theoretical concepts of short-term/spot electricity markets and the potential link between spot and forward electricity markets. After deriving these microeconomic fundamental models, we continue with stochastic models (a jump-diffusion mean-reverting process and extreme value theory) in order to depict the volatility patterns of spot and forward power contract prices. The last chapter deals with the credit risk specifics of power trading and develops a model (using the concept known as Credit Valuation Adjustment) to compare the economic efficiency of OTC and exchange power trading. The models developed and described are tested on selected power markets, again with a focus on the Czech power market data set.
Identification of Asset Price Misalignments on Financial Markets With Extreme Value Theory
Kadlčáková, Narcisa ; Komárek, Luboš ; Komárková, Zlatuše ; Hlaváček, Michal
This paper examines the potential for the concurrence of crises in the foreign exchange, stock, and government bond markets and identifies asset price misalignments from equilibrium for three Central European countries and the euro area. Concurrence is understood as the joint occurrence of extreme asset changes in different countries and is assessed with a measure of the asymptotic tail dependence among the distributions studied. The main aim of the paper, however, is to examine the potential for concurrence of misalignments from equilibrium among financial markets. To this end, representative assets are linked to their fundamentals using a cointegration approach. Next, the extreme values of the differences between the actual daily exchange rates and their monthly equilibrium values determine the episodes associated with large departures from equilibrium. Using tools from extreme value theory, we analyze the transmission of both standard crises and misalignment-from-equilibrium events in the foreign exchange, stock, and government bond markets examined. The results reveal significant potential for the co-alignment of extreme events in these markets in Central Europe. The evidence for co-movements is very weak for the exchange rates but stronger for the stock and bond markets in some periods.
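The asymptotic tail dependence measure used above has a simple finite-sample counterpart: the probability that one series is extreme given that the other is. A generic empirical sketch follows; the 95% threshold is an illustrative choice, not the paper's.

```python
def tail_dependence(x, y, q=0.95):
    """Empirical upper-tail dependence: share of observations where y
    exceeds its q-quantile among those where x exceeds its own."""
    n = len(x)
    ux = sorted(x)[int(q * n)]   # marginal q-quantile of x
    uy = sorted(y)[int(q * n)]   # marginal q-quantile of y
    joint = sum(1 for a, b in zip(x, y) if a > ux and b > uy)
    marg = sum(1 for a in x if a > ux)
    return joint / marg


# perfectly co-moving series are fully tail-dependent
series = [float(i) for i in range(1000)]
lam_upper = tail_dependence(series, series)
```

Values near one indicate that extremes tend to occur jointly (the "concurrence" of the abstract), while values near zero indicate asymptotically independent tails even when the series are correlated in the body.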
Extreme Value Theory in Operational Risk Management
Vojtěch, Jan ; Kahounová, Jana (advisor) ; Řezanková, Hana (referee) ; Orsáková, Martina (referee)
Currently, financial institutions are required to analyze and quantify a new type of banking risk, known as operational risk, to which they are exposed in their everyday activities. The main objective of this work is to construct an acceptable statistical model for computing the capital requirement. Such a model must respect the specificity of losses arising from operational risk events. The fundamental task is the search for a suitable distribution describing the probabilistic behavior of losses arising from this type of risk. The work makes heavy use of the Pickands-Balkema-de Haan theorem from extreme value theory: roughly speaking, the distribution of a random variable's exceedances over a given high threshold converges to a generalized Pareto distribution. The theorem is subsequently used in estimating a high percentile from a simulated distribution. The simulated distribution is a compound model for the aggregate loss random variable, constructed as a combination of a frequency distribution for the number of losses and a so-called severity distribution for the individual loss. The proposed model is then used to estimate a final quantile, which represents the sought amount of capital requirement. This capital requirement constitutes the amount of funds the bank is supposed to retain in order to make up for a projected lack of funds, with a given, typically quite small, probability that the capital charge will be exceeded. Although the combination of a frequency distribution and a severity distribution is the common way to deal with the described problem, the final application is often problematic. In general, the severity distribution is itself a combination of two or three distributions, for instance lognormal distributions with different location and scale parameters.
Models like these usually lack any theoretical background, and in particular the connecting of the distribution functions is often not conducted properly. In this work we deal with both problems. In addition, we derive maximum likelihood estimates for a lognormal distribution for which F_LN(u) = p holds, where u and p are given. The results achieved can be used in the everyday practice of financial institutions for operational risk quantification. They can also be used for the analysis of a variety of sample data with so-called heavy tails, where standard distributions do not offer any help. As an integral part of this work, a CD with the source code of each function used in the model is included. All of these functions were written in the statistical programming language of the S-PLUS software. The fourth annex contains a complete description of each function, its purpose, and the general syntax for possible use in solving other kinds of problems.
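The compound frequency-severity construction described in the abstract can be sketched by straightforward Monte Carlo. The Poisson and lognormal parameters below are illustrative, and a production model would replace the plain lognormal tail with the generalized Pareto tail the abstract advocates.

```python
import math
import random


def poisson(lam):
    """Knuth's multiplication method for a Poisson draw (moderate lam)."""
    limit, k, prod = math.exp(-lam), 0, random.random()
    while prod > limit:
        k += 1
        prod *= random.random()
    return k


def aggregate_quantile(lam, mu, sigma, alpha=0.999, n_sims=20_000):
    """Monte Carlo quantile of annual aggregate loss: Poisson(lam)
    frequency compounded with lognormal(mu, sigma) severities."""
    totals = sorted(
        sum(random.lognormvariate(mu, sigma) for _ in range(poisson(lam)))
        for _ in range(n_sims))
    return totals[int(alpha * n_sims)]


random.seed(42)
capital = aggregate_quantile(lam=25, mu=8, sigma=1.5)
```

The simulated high percentile is the capital charge; the abstract's point is that the tail of the severity distribution drives this number, which is why the threshold-exceedance refinement matters.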
Application of isobars to stock market indices
Ivanková, Kristýna
Isobar surfaces, a method for describing the overall shape of multidimensional data, are estimated by nonparametric regression and used to evaluate the efficiency of selected markets based on returns of their stock market indices.
