2019-10-19 16:35
Discrete Dynamic Endogenous Growth Model: Derivation, Calibration and Simulation
Kodera, J. ; Van Tran, Q. ; Vošvrda, Miloslav
Endogenous economic growth models were developed to improve traditional growth models with exogenous technological change. There are several approaches to incorporating technological progress into a growth model. Romer was the first author to introduce it by expanding the variety of intermediate goods. Growth models are usually formulated in continuous time. In our paper we formulate a discrete version of Romer's model with endogenous technological change based on an expanding variety of intermediates, both in the final-good sector and in the research and development sector, where the objective is to maximize the present value of the returns from discovering intermediate goods, which should exceed the costs of their introduction. This discrete version is then calibrated by a numerical example. Our aim is to find the solution and analyse the development of economic variables with respect to external changes.
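A discrete expanding-variety growth recursion can be sketched minimally as follows; all parameter names and values here are hypothetical illustrations, not the paper's calibration.

```python
# Toy discrete-time expanding-variety growth model (illustrative only).
# alpha: output elasticity, delta: R&D productivity, L_Y/L_A: labour shares.
alpha, delta, L_Y, L_A = 0.3, 0.05, 0.8, 0.2  # hypothetical parameters

def simulate(T, A0=1.0):
    """Output path when the variety of intermediates A_t grows through
    R&D labour: A_{t+1} = A_t * (1 + delta * L_A)."""
    A, path = A0, []
    for _ in range(T):
        Y = A * L_Y ** (1 - alpha)   # symmetric intermediates folded into A
        path.append(Y)
        A *= 1 + delta * L_A         # endogenous technological change
    return path

path = simulate(50)
```

In this sketch output grows at the constant gross rate 1 + delta * L_A, which is the discrete analogue of the balanced growth path of expanding-variety models.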

2019-08-26 09:04
How to downweight observations in robust regression: A metalearning study
Kalina, Jan ; Pitra, Zbyněk
Metalearning is becoming an increasingly important methodology for transferring knowledge from a database of available training data sets to a new (independent) data set. The concept of metalearning is becoming popular in statistical learning, and there is an increasing number of metalearning applications in the analysis of economic data sets as well. Still, not much attention has been paid to its limitations and disadvantages. To investigate these, we apply various linear regression estimators (including highly robust ones) to a set of 30 data sets with an economic background and perform a metalearning study over them, as well as over the same data sets after an artificial contamination.

2019-08-26 09:04
Nonparametric Bootstrap Techniques for Implicitly Weighted Robust Estimators
Kalina, Jan
The paper is devoted to highly robust statistical estimators based on implicit weighting, which have a potential to find econometric applications. Two particular methods are a robust correlation coefficient based on the least weighted squares regression and the minimum weighted covariance determinant estimator, where the latter allows one to estimate the mean and covariance matrix of multivariate data. New tools are proposed for testing hypotheses about these robust estimators or for estimating their variance. The techniques considered in the paper include resampling approaches with or without replacement, i.e. permutation tests, bootstrap variance estimation, and bootstrap confidence intervals. The performance of the newly described tools is illustrated on numerical examples. They reveal the suitability of the robust procedures also for non-contaminated data, as their confidence intervals are not much wider than those for standard maximum likelihood estimators. While resampling without replacement turns out to be more suitable for hypothesis testing, bootstrapping with replacement yields reliable confidence intervals but not corresponding hypothesis tests.
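Bootstrap variance estimation for a robust estimator follows a generic resampling-with-replacement scheme; a minimal sketch, using the sample median as a stand-in robust estimator (the paper's least weighted squares and minimum weighted covariance determinant estimators are not implemented here):

```python
import random

def bootstrap_variance(data, estimator, B=2000, seed=1):
    """Bootstrap (resampling with replacement) estimate of an
    estimator's variance over B resamples."""
    rng = random.Random(seed)
    n = len(data)
    stats = []
    for _ in range(B):
        sample = [data[rng.randrange(n)] for _ in range(n)]
        stats.append(estimator(sample))
    mean = sum(stats) / B
    return sum((s - mean) ** 2 for s in stats) / (B - 1)

def median(xs):
    """A simple robust location estimator used for illustration."""
    s = sorted(xs)
    m = len(s) // 2
    return s[m] if len(s) % 2 else 0.5 * (s[m - 1] + s[m])

data = [0.9, 1.1, 1.0, 1.2, 0.8, 1.05, 0.95, 5.0]  # one gross outlier
var_est = bootstrap_variance(data, median)
```

The same loop, applied to a robust estimator, yields the bootstrap variance estimates and (via resample quantiles) the bootstrap confidence intervals discussed in the abstract.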

2019-08-26 09:04
Experimental comparison of traffic flow models on traffic data
Přikryl, Jan ; Horňák, Ivan
Despite their deficiencies, continuous second-order traffic flow models are still commonly used to derive discrete-time models that help traffic engineers to model and predict traffic flow behaviour on highways. We briefly overview the development of traffic flow theory based on continuous flow–density models of the Lighthill–Whitham–Richards (LWR) type, which lead to the second-order model of Aw–Rascle. We then concentrate on the widely adopted discrete approximation to the LWR model, Daganzo's Cell Transmission Model. The behaviour of the discussed models is demonstrated by comparing the traffic flow prediction based on these models with real traffic data from the southern highway ring of Prague.
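One update step of a Cell Transmission Model with a triangular fundamental diagram can be sketched as follows; the free-flow speed, congestion wave speed, capacity and jam density below are illustrative, not fitted to the Prague ring data.

```python
# Minimal Cell Transmission Model step (triangular fundamental diagram).
V, W, Q_MAX, RHO_JAM = 100.0, 25.0, 2000.0, 100.0  # km/h, km/h, veh/h, veh/km
DT, DX = 0.001, 0.5  # h, km (DT <= DX / V for numerical stability)

def ctm_step(rho, inflow=0.0):
    """Advance cell densities one time step; inflow enters the first cell."""
    n = len(rho)
    # sending (demand) and receiving (supply) functions of each cell
    S = [min(V * r, Q_MAX) for r in rho]
    R = [min(Q_MAX, W * (RHO_JAM - r)) for r in rho]
    # flow across each boundary = min(upstream demand, downstream supply)
    flows = [min(inflow, R[0])] + [min(S[i], R[i + 1]) for i in range(n - 1)]
    flows.append(S[-1])  # unrestricted outflow from the last cell
    # conservation update: density change = (flow in - flow out) * DT / DX
    return [rho[i] + DT / DX * (flows[i] - flows[i + 1]) for i in range(n)]

rho = [20.0] * 5          # uniform free-flow density, veh/km
rho = ctm_step(rho, inflow=1000.0)
```

With uniform free-flow densities, only the first cell changes here: its inflow (1000 veh/h) is below its outflow (2000 veh/h), so its density drops while the interior cells stay balanced.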

2019-08-26 09:04
Question Selection Methods for Adaptive Testing with Bayesian Networks
Plajner, Martin ; Magauina, A. ; Vomlel, Jiří
The performance of Computerized Adaptive Testing systems, which are used for testing human knowledge, relies heavily on the methods that select appropriate questions for tested students. In this article we propose three different methods for selecting questions, with Bayesian networks as student models. We present the motivation for these methods and their mathematical description. Two empirical data sets, paper tests of specific topics in mathematics and in Czech language for foreigners, were collected for the purpose of testing the methods. All three methods were tested using a simulated testing procedure and the results are compared across the individual methods. A comparison is also made with the sequential selection of questions, to provide a relation to the classical way of testing. The proposed methods perform much better than the sequential selection, which confirms the need for a better selection method. Individually, our methods behave differently, i.e., they select different questions, but the success rate of the model's predictions is very similar for all of them. This motivates further research on this topic to find an ordering between the methods and to find the method which would provide the best possible selections in computerized adaptive tests.
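A common family of such selection criteria picks the question that is expected to be most informative about the student's skill. A minimal sketch of one such criterion, expected posterior entropy, with the Bayesian network reduced to a single binary skill variable (this is an illustration of the general idea, not necessarily one of the paper's three methods):

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def posterior(prior, p_correct, answer):
    """Bayes update of P(skill) after one answer (True = correct)."""
    like = [p if answer else 1 - p for p in p_correct]
    joint = [pr * l for pr, l in zip(prior, like)]
    z = sum(joint)
    return [j / z for j in joint]

def pick_question(prior, questions):
    """Choose the question minimising the expected posterior entropy."""
    best, best_h = None, float("inf")
    for q, p_correct in questions.items():
        p_ans = sum(pr * p for pr, p in zip(prior, p_correct))
        h = (p_ans * entropy(posterior(prior, p_correct, True))
             + (1 - p_ans) * entropy(posterior(prior, p_correct, False)))
        if h < best_h:
            best, best_h = q, h
    return best

# hypothetical items: P(correct | skill=low), P(correct | skill=high)
questions = {"easy": [0.9, 0.95], "hard": [0.2, 0.8]}
prior = [0.5, 0.5]
```

With a flat prior, the criterion prefers the discriminative "hard" item, since the "easy" item is answered correctly regardless of skill and therefore reveals little.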

2019-08-26 09:04
A Note on Optimal Value of Loans
Kaňková, Vlasta
People have tried in recent decades to acquire their own residence (a flat or a small house). Since young people do not possess the necessary financial resources, the banking sector offers them a mortgage. Of course, the aim of any bank is to profit from such a transaction. Therefore, according to their possibilities, banks employ excellent experts to analyze the financial situation of potential clients. Consequently, the banks know the maximal size of the loan (depending on the debtor's position, salary and age) and a reasonable size of the installments. The aim of this contribution is to analyze the situation from the other side. In particular, the aim is to investigate the possibilities of the debtors in dependence not only on their present-day situation, but also on their future private and subjective decisions and on possible "unpleasant" events. Moreover, according to these indicators, the aim of this contribution is to suggest a method for recognizing a "safe" loan and simultaneously to offer tactics for establishing a suitable environment for the future. The theory of stochastic programming is employed to this end.

2019-08-26 09:04
Bimodality testing of the stochastic cusp model
Voříšek, Jan
Multimodal distributions are popular in many areas: biology (fish and shark populations), engineering (material collapse under pressure, stability of ships), psychology (attitude transitions), physics (freezing of water), etc. There have been a few attempts to utilize multimodal distributions in financial mathematics as well. Cobb et al. described a class of multimodal distributions belonging to the exponential family, which have unique maximum likelihood estimators, and showed a connection to the stationary distribution of the stochastic cusp catastrophe model. Moreover, it was shown how to identify bimodality for given parameters of the stochastic cusp model using the sign of Cardan's discriminant. A statistical test for bimodality of the stochastic cusp model using maximum likelihood estimates is proposed in the paper, as well as a necessary condition for bimodality which can be used as a simplified test to reject bimodality. The proposed methods are used to test the bimodality of the exchange rate between USD and GBP over periods within the years 1975–2014.
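The sign check via Cardan's discriminant can be made concrete. Assuming the common cusp parametrisation in which the equilibria solve the cubic alpha + beta*x - x**3 = 0 (an assumption here, not a detail taken from the paper), the model is bimodal exactly when this cubic has three distinct real roots:

```python
def is_bimodal(alpha, beta):
    """Bimodality of the cusp model via the sign of Cardan's discriminant.

    The equilibrium cubic alpha + beta*x - x**3 = 0 has three distinct
    real roots (two stable equilibria, one unstable, hence a bimodal
    stationary density) iff its discriminant is positive.
    """
    delta = 4 * beta ** 3 - 27 * alpha ** 2
    return delta > 0
```

For instance, alpha = 0, beta = 3 gives equilibria at 0 and ±√3 (bimodal), while any negative beta makes the cubic monotone and the density unimodal.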

2019-08-26 09:04
Approximate Transition Density Estimation of the Stochastic Cusp Model
Voříšek, Jan
The stochastic cusp model is defined by a stochastic differential equation with a cubic drift. Its stationary density allows for skewness, different tail shapes and bimodality. In the bimodal case there are two stable equilibria, and movement from one equilibrium to the other is interpreted as a crash. Qualitative properties of the cusp model were employed to model crashes on financial markets; however, practical applications of the model employed the stationary distribution, which does not take into account the serial dependence between observations. Because a closed-form solution for the transition density is not known, one has to use approximate techniques to estimate it. This paper extends the approximate maximum likelihood method, which relies on a closed-form expansion of the transition density, to allow time-varying parameters of the drift function driven by market fundamentals. A measure for predicting endogenous crashes of the model is proposed using the transition density estimates. An empirical example estimates the depreciation of the Icelandic Króna with respect to the British Pound in the year 2001, using the differential of interbank interest rates as a market fundamental.
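The dynamics behind the transition density can be illustrated by simulating the SDE with cubic drift, dx = (alpha + beta*x - x**3) dt + sigma dW, by a plain Euler–Maruyama discretisation; this is a generic simulation sketch, not the closed-form expansion used in the paper.

```python
import math, random

def simulate_cusp(alpha, beta, sigma, x0, dt, n, seed=0):
    """Euler-Maruyama path of dx = (alpha + beta*x - x**3) dt + sigma dW."""
    rng = random.Random(seed)
    x, path = x0, [x0]
    for _ in range(n):
        drift = alpha + beta * x - x ** 3
        x += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

# bimodal regime: stable equilibria at x = -1 and x = +1
path = simulate_cusp(alpha=0.0, beta=1.0, sigma=0.1, x0=1.0, dt=0.01, n=1000)
```

In the bimodal regime with small noise the path lingers near one equilibrium; with larger sigma it occasionally crosses to the other basin, which is the "crash" event the paper's measure aims to predict.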

2019-08-26 09:04
Capital market efficiency in the Ising model environment: Local and global effects
Krištoufek, Ladislav ; Vošvrda, Miloslav
The financial Ising model is one of the simplest agent-based models (building on a parallel between capital markets and the Ising model of ferromagnetism) mimicking the most important stylized facts of financial returns, such as no serial correlation, fat tails, volatility clustering and volatility persistence on the verge of non-stationarity. We present the results of a Monte Carlo simulation study investigating the relationship between the parameters of the model (related to herding and minority game behaviors) and crucial characteristics of capital market efficiency (with respect to the efficient market hypothesis). We find a strongly non-linear relationship between these, which opens possibilities for further research. Specifically, the existence of both herding and minority game behavior among market participants is necessary for attaining an efficient market in the sense of the efficient market hypothesis.
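The Monte Carlo machinery underneath such studies is the standard single-spin-flip Metropolis update; a minimal textbook sketch on a 2D torus (the financial specification with herding and minority-game terms in the local field is not reproduced here):

```python
import math, random

def metropolis_sweep(spins, beta, rng):
    """One sweep of single-spin-flip Metropolis updates on an n x n torus."""
    n = len(spins)
    for _ in range(n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        # sum of the four nearest neighbours with periodic boundaries
        nb = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
              + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
        dE = 2 * spins[i][j] * nb  # energy change of flipping spin (i, j)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i][j] *= -1      # accept the flip
    return spins

rng = random.Random(42)
n = 16
spins = [[rng.choice([-1, 1]) for _ in range(n)] for _ in range(n)]
for _ in range(20):
    metropolis_sweep(spins, beta=0.6, rng=rng)
m = abs(sum(sum(row) for row in spins)) / n ** 2  # magnetisation in [0, 1]
```

In a market reading, a spin is an agent's buy/sell position and the magnetisation corresponds to aggregate demand; the stylized facts arise from how returns are mapped onto such aggregate quantities.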

2019-08-26 09:04
Causality and Intervention in Business Process Management
Bína, V. ; Jiroušek, Radim
The paper presents an algebraic approach to the modeling of causality in systems of stochastic variables. The methodology is based on an operator of composition that makes it possible to compose a multidimensional distribution from low-dimensional building blocks, taking advantage of the dependence structure of the problem variables. The authors formally define, and demonstrate on a hypothetical example, a surprisingly elegant unifying approach to conditioning by a single variable and to the evaluation of the effect of an intervention. Both operations are realized by composition with a degenerate distribution and differ only in the sequence in which the operator of composition is applied.
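For discrete distributions stored as dicts, the operator of composition can be sketched as follows, assuming the usual definition pi ▷ kappa = pi * kappa / kappa↓(shared variables); whether this matches the paper's exact formulation is an assumption of this sketch.

```python
def marginalise(dist, vars_, keep):
    """Marginal of a dict-valued distribution onto the variables in keep."""
    out = {}
    for assign, p in dist.items():
        key = tuple(v for v, name in zip(assign, vars_) if name in keep)
        out[key] = out.get(key, 0.0) + p
    return out

def compose(pi, pi_vars, kappa, kappa_vars):
    """Compose pi(X) with kappa(Y) into a distribution over X union Y,
    keeping pi's marginal on X: (pi ▷ kappa) = pi * kappa / kappa↓(X∩Y)."""
    shared = [v for v in kappa_vars if v in pi_vars]
    extra = [v for v in kappa_vars if v not in pi_vars]
    kap_sh = marginalise(kappa, kappa_vars, shared)
    out_vars, out = list(pi_vars) + extra, {}
    for a, p in pi.items():
        sh = tuple(x for x, v in zip(a, pi_vars) if v in shared)
        denom = kap_sh.get(sh, 0.0)
        if denom == 0.0:
            continue  # kappa carries no information for this configuration
        for b, q in kappa.items():
            if tuple(x for x, v in zip(b, kappa_vars) if v in shared) != sh:
                continue
            ext = tuple(x for x, v in zip(b, kappa_vars) if v in extra)
            out[a + ext] = out.get(a + ext, 0.0) + p * q / denom
    return out_vars, out

pi = {(0,): 0.3, (1,): 0.7}                                  # pi(X)
kappa = {(0, 0): 0.2, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.3}  # kappa(X, Y)
out_vars, joint = compose(pi, ["X"], kappa, ["X", "Y"])
```

Composing with a degenerate (point-mass) distribution in place of pi or kappa then yields, depending on the order of composition, either ordinary conditioning or the effect of an intervention, which is the unification the abstract describes.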
