National Repository of Grey Literature : 154 records found (showing records 11 - 20)
From John Graunt to Adolphe Quetelet: on the Origins Of Demography
Kalina, Jan
John Graunt (1620-1674) and Adolphe Quetelet (1796-1874) were two important personalities who contributed to the origins of demography. As both developed statistical techniques for the analysis of demographic data, they are also important from the point of view of the history of statistics. This paper recalls the contributions of Graunt and Quetelet, especially to the development of mortality tables and models. As early as the 17th century, the available mortality tables were exploited for computing life annuities. The contributions of selected personalities inspired by Graunt are also recalled here; the work of Christian Huygens, Jacob Bernoulli, and Abraham de Moivre is discussed to document that the historical development of statistics and probability theory was connected with the development of demography.
The 2022 Election in the United States: Reliability of a Linear Regression Model
Kalina, Jan ; Vidnerová, Petra ; Večeř, M.
In this paper, the 2022 United States election to the House of Representatives is analyzed by means of a linear regression model. After the election process is explained, the popular vote is modeled as a response of 8 predictors (demographic characteristics) on the state-wide level. The main focus is on verifying the reliability of two obtained regression models, namely the full model with all predictors and the most relevant submodel found by hypothesis testing (with 4 relevant predictors). Aspects of reliability considered in this study include confidence intervals for predictions, multicollinearity, and outlier detection. While the predictions in the submodel that includes only relevant predictors are very similar to those in the full model, it turns out that the submodel has better reliability properties than the full model, especially in terms of narrower confidence intervals for the values of the popular vote.
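As an aside, the confidence intervals for predictions mentioned in the abstract can be sketched for an ordinary least squares fit. The data below are synthetic stand-ins; the paper's election data and its 8 predictors are not reproduced here:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 50                                      # hypothetical sample size
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
beta_true = np.array([2.0, 1.0, -0.5, 0.3])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Ordinary least squares fit
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
df = n - X.shape[1]
s2 = resid @ resid / df                     # residual variance estimate
XtX_inv = np.linalg.inv(X.T @ X)

# 95% confidence interval for the mean response at a new point x0
x0 = np.array([1.0, 0.2, -0.1, 0.4])
se_mean = np.sqrt(s2 * x0 @ XtX_inv @ x0)
t = stats.t.ppf(0.975, df=df)
lo, hi = x0 @ beta_hat - t * se_mean, x0 @ beta_hat + t * se_mean
```

A wider interval signals a less reliable prediction, which is exactly the criterion by which the abstract compares the full model and the submodel.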
Some Robust Approaches to Reducing the Complexity of Economic Data
Kalina, Jan
The recent advent of complex (and potentially big) data in economics requires modern and effective tools for its analysis, including tools for reducing the dimensionality (complexity) of the given data. This paper starts by recalling the importance of Big Data in economics and by characterizing the main categories of dimension reduction techniques. While numerous dimensionality reduction techniques are already available, this work is interested in methods that are robust to the presence of outlying measurements (outliers) in economic data. In particular, methods based on implicit weights assigned to individual observations are developed in this paper. As its main contribution, this paper proposes three novel robust methods of dimension reduction. One method performs dimension reduction within robust regularized linear regression, namely a sparse version of the least weighted squares estimator. The other two methods are robust versions of feature extraction methods popular in econometrics: robust principal component analysis and robust factor analysis.
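One way to picture the implicit-weighting idea behind robust principal component analysis is a weighted covariance matrix in which observations with large robust distances receive zero weight. The following is only an illustrative sketch of that general principle, not the estimator proposed in the paper:

```python
import numpy as np

def weighted_pca(Z, trim=0.8, n_iter=10, k=2):
    """PCA from a weighted covariance matrix: at each iteration, the
    (1 - trim) share of observations with the largest Mahalanobis-type
    distances gets weight 0, so outliers lose their influence."""
    n, p = Z.shape
    w = np.ones(n)
    for _ in range(n_iter):
        mu = (w[:, None] * Z).sum(axis=0) / w.sum()
        D = Z - mu
        C = (w[:, None] * D).T @ D / w.sum()
        d2 = np.einsum('ij,jk,ik->i', D, np.linalg.inv(C), D)
        ranks = np.argsort(np.argsort(d2))       # rank 0 = closest to center
        w = np.where(ranks < int(trim * n), 1.0, 0.0)
    vals, vecs = np.linalg.eigh(C)
    return vecs[:, ::-1][:, :k], w               # leading k components, final weights

# Demo: data concentrated along the first axis, plus 10% gross outliers
rng = np.random.default_rng(4)
Z = np.outer(rng.normal(scale=3.0, size=200), [1.0, 0.0, 0.0]) \
    + rng.normal(scale=0.1, size=(200, 3))
Z[:20] = rng.normal(loc=8.0, scale=0.5, size=(20, 3))
comps, w = weighted_pca(Z)
```

With classical PCA the leading component would be pulled toward the outlier cluster; here the trimmed weights keep it aligned with the bulk of the data.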
Kelly criterion and Bayesian statistics
Pardubický, Štěpán ; Večeř, Jan (advisor) ; Kalina, Jan (referee)
The classic problem of the investor is the search for profitable investment opportunities. But how should an investor behave if he finds such an opportunity? The Kelly criterion, named after the American scientist J. L. Kelly, answers this question. The criterion maximises the asymptotic exponential growth rate of capital in repeated bets, which it achieves by maximising the expected value of the logarithmic utility function. The criterion assumes a fixed investor's view of the true probability distribution. In practice, however, it is not clear how this opinion should be formed. In this paper, we combine the Kelly criterion with a Bayesian approach that allows multiple opinions to be considered instead of a single fixed one and lets them be validated by the evolution of capital. Finally, we apply the findings to the investor's situation in the binomial market.
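For a simple binary bet, the Kelly fraction has a closed form; the numeric check below confirms that it maximises the expected logarithmic growth rate (an illustration only, not taken from the thesis):

```python
import numpy as np

def kelly_fraction(p, b):
    """Kelly fraction for a binary bet: win b per unit staked with
    probability p, lose the stake with probability q = 1 - p.
    Maximises E[log wealth] = p*log(1 + f*b) + q*log(1 - f)."""
    return (b * p - (1.0 - p)) / b

p, b = 0.6, 1.0
f_star = kelly_fraction(p, b)                       # closed form: 0.2 here

# Numeric check: grid search over the same expected log-growth objective
fs = np.linspace(0.0, 0.99, 1000)
growth = p * np.log(1 + fs * b) + (1 - p) * np.log(1 - fs)
f_num = fs[np.argmax(growth)]                       # agrees with f_star
```

A Bayesian variant, as in the thesis, would replace the fixed p with a posterior distribution that is updated as capital evolves.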
Robust regularized regression
Krett, Jakub ; Kalina, Jan (advisor) ; Maciak, Matúš (referee)
This thesis is devoted to introducing various types of robust and regularized regression estimates. The aim of the thesis is to present a new LWS-lasso estimate that combines robustness and regularization at the same time. Firstly, basic concepts of linear regression and modifications of the least squares are explained. Then, various robust and regularized estimates are introduced along with the new LWS-lasso estimate and its software implementation. Subsequently, selected estimates are compared to each other on real data and in a simulation study.
Robust regression and robust neural networks
Janáček, Patrik ; Kalina, Jan (advisor) ; Maciak, Matúš (referee)
The classical least squares approach in linear regression is prone to the presence of outliers in the data. The aim of this thesis is to present several robust alternatives to the least squares method in the linear regression framework and discuss their properties. Then robust neural networks based on these estimators are introduced and compared in a simulation study. In particular, the least weighted squares method with adaptive weights seems promising, as it is able to combine high robustness with efficiency in the absence of contamination in the data.
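The least weighted squares (LWS) idea mentioned above can be approximated by a simple iterative reweighting scheme: weights are assigned to observations according to the ranks of their squared residuals. This is a rough sketch of the principle, not the exact algorithm of the thesis:

```python
import numpy as np

def least_weighted_squares(X, y, weights, n_iter=20):
    """Approximate LWS fit: the observation with the i-th smallest
    squared residual receives the i-th entry of `weights`."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # start from plain OLS
    for _ in range(n_iter):
        r2 = (y - X @ beta) ** 2
        ranks = np.argsort(np.argsort(r2))          # rank 0 = smallest residual
        w = weights[ranks]
        Xw = X * w[:, None]
        beta = np.linalg.solve(X.T @ Xw, Xw.T @ y)  # weighted least squares step
    return beta

# Demo: 10% gross outliers; trimmed (LTS-like) weights ignore the worst quarter
rng = np.random.default_rng(1)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.1, size=n)
y[:10] += 20.0
w_scheme = np.where(np.arange(n) < 75, 1.0, 0.0)
beta_lws = least_weighted_squares(X, y, w_scheme)   # close to (1.0, 2.0)
```

A plain OLS fit on the same data would have its intercept pulled upward by the contaminated responses; the rank-based weights suppress exactly those observations.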
A Bootstrap Comparison of Robust Regression Estimators
Kalina, Jan ; Janáček, Patrik
The ordinary least squares estimator in linear regression is well known to be highly vulnerable to the presence of outliers in the data, and available robust statistical estimators represent preferable alternatives. It has been repeatedly recommended to use least squares together with a robust estimator, where the latter is understood as a diagnostic tool for the former. In other words, only if the robust estimator yields a very different result should the user investigate the dataset more closely and search for explanations. For this purpose, a hypothesis test of equality of the means of two alternative linear regression estimators, based on the nonparametric bootstrap, is proposed here. The performance of the test is presented on three real economic datasets with small samples. Robust estimates turn out not to be significantly different from non-robust estimates in the selected datasets. Still, robust estimation is beneficial in these datasets, and the experiments illustrate one possible way of exploiting bootstrap methodology in regression modeling. The bootstrap test could be easily extended to nonlinear regression models.
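The bootstrap comparison described in the abstract can be illustrated roughly as follows, with the Theil-Sen estimator standing in as the robust alternative for simple regression (the paper's choice of estimators, data, and test statistic may differ):

```python
import numpy as np
from scipy.stats import theilslopes

rng = np.random.default_rng(2)
n = 40                                     # small sample, as in the paper's datasets
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.standard_t(df=3, size=n)   # heavy-tailed errors

def slope_ols(x, y):
    return np.polyfit(x, y, 1)[0]

def slope_robust(x, y):
    return theilslopes(y, x)[0]

# Nonparametric bootstrap of the difference between the two slope estimates
B = 2000
diffs = np.empty(B)
for i in range(B):
    idx = rng.integers(0, n, n)            # resample (x_j, y_j) pairs with replacement
    diffs[i] = slope_ols(x[idx], y[idx]) - slope_robust(x[idx], y[idx])

# Reject H0 (the estimators have equal means) if the bootstrap
# 95% interval for the difference excludes zero
lo, hi = np.percentile(diffs, [2.5, 97.5])
reject = not (lo <= 0.0 <= hi)
```

When the interval covers zero, as the abstract reports for its datasets, the robust and non-robust estimates do not differ significantly.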
Recent Trends in Machine Learning with a Focus on Applications in Finance
Kalina, Jan ; Neoral, Aleš
Machine learning methods are penetrating applications in the analysis of financial data, particularly supervised learning tasks including regression and classification. Other approaches, such as reinforcement learning or automated machine learning, are not yet so well known in the context of finance. In this paper, we discuss the advantages of automated data analysis, which is beneficial especially if a larger number of datasets should be analyzed under time pressure. Important types of learning include reinforcement learning, automated machine learning, and metalearning. This paper overviews their principles and recalls some of their inspiring applications. We include a discussion of the importance of the concept of information and of the search for the most relevant information in the field of mathematical finance. We come to the conclusion that a statistical interpretation of the results of automated machine learning remains crucial for a proper understanding of the knowledge acquired by the analysis of the given (financial) data.
On kernel-based nonlinear regression estimation
Kalina, Jan ; Vidnerová, P.
This paper is devoted to two important kernel-based tools of nonlinear regression: the Nadaraya-Watson estimator, which can be characterized as a successful statistical method in various econometric applications, and regularization networks, which represent machine learning tools very rarely used in econometric modeling. This paper recalls both approaches and describes their common features as well as their differences. For the Nadaraya-Watson estimator, we explain its connection to the conditional expectation of the response variable. Our main contribution is a numerical analysis of suitable data with an economic motivation and a comparison of the two nonlinear regression tools. Our computations reveal some tools for the Nadaraya-Watson estimator in R software to be unreliable and others not prepared for routine usage. On the other hand, regression modeling by means of regularization networks is much simpler and also turns out to be more reliable in our examples. The examples also bring unique evidence revealing the need for a careful choice of the parameters of regularization networks.
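The Nadaraya-Watson estimator itself is compact: a kernel-weighted average of the responses that estimates the conditional expectation E[Y | X = x]. A minimal Gaussian-kernel version on synthetic data (not the paper's implementation or data):

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, bandwidth):
    """Kernel-weighted average of the responses at each evaluation point."""
    u = (x_eval[:, None] - x_train[None, :]) / bandwidth
    K = np.exp(-0.5 * u ** 2)                # Gaussian kernel weights
    return (K @ y_train) / K.sum(axis=1)

# Demo on a noisy sine curve
rng = np.random.default_rng(3)
x = rng.uniform(-2, 2, 200)
y = np.sin(x) + rng.normal(scale=0.1, size=200)
grid = np.linspace(-1.5, 1.5, 7)
fit = nadaraya_watson(x, y, grid, bandwidth=0.2)   # tracks sin(x) closely
```

The bandwidth plays the same role here as the parameters of regularization networks in the paper: too small overfits the noise, too large oversmooths the curve.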

See also: similar author names
75 KALINA, Jan
2 Kalina, Jakub
75 Kalina, Jan
2 Kalina, Jaroslav
4 Kalina, Jiří
4 Kalina, Josef