National Repository of Grey Literature: 4 records found
Variable selection based on penalized likelihood
Chlubnová, Tereza ; Kulich, Michal (advisor) ; Maciak, Matúš (referee)
Selection of variables and estimation of regression coefficients in datasets where the number of variables exceeds the number of observations is an often-discussed topic in modern statistics. Today this problem is addressed by the maximum penalized likelihood method, with an appropriately chosen function of the parameter serving as the penalty. The penalty should weigh the benefit of each variable and, where appropriate, shrink or zero out the respective regression coefficient. The SCAD and LASSO penalty functions are popular for their ability to select suitable regressors and, at the same time, estimate the parameters of the model. This thesis presents an overview of up-to-date results on the properties of estimates obtained by these two methods, both for a small number of regressors and for high-dimensional datasets, in the normal linear model. Because the amount of penalization, and therefore also the choice of the model, depends heavily on the tuning parameter, the thesis further discusses its selection. The behavior of the LASSO and SCAD penalty functions for different values of the tuning parameter, and options for selecting it, is examined on simulated datasets with various numbers of regressors.
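Not part of the thesis itself, but the setting the abstract describes (LASSO variable selection on simulated data with more variables than observations, across several tuning-parameter values) can be sketched as follows; the coordinate-descent implementation and the simulated design are illustrative assumptions, not the thesis's code.

```python
import numpy as np

def soft_threshold(z, gamma):
    # Soft-thresholding operator: the closed-form solution of the
    # one-dimensional LASSO subproblem.
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with coordinate j left out.
            r_j = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r_j / n
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
    return beta

rng = np.random.default_rng(0)
n, p = 50, 100                      # more variables than observations
X = rng.standard_normal((n, p))
true_beta = np.zeros(p)
true_beta[:3] = [3.0, -2.0, 1.5]    # only three active regressors
y = X @ true_beta + rng.standard_normal(n)

# Larger tuning parameter -> heavier penalization -> sparser model.
for lam in [0.05, 0.2, 0.8]:
    beta_hat = lasso_cd(X, y, lam)
    print(lam, np.count_nonzero(np.abs(beta_hat) > 1e-8))
```

As the loop illustrates, the tuning parameter directly controls which variables survive, which is why its selection (e.g. by cross-validation or an information criterion) matters as much as the penalty itself.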
Maximum likelihood methods; selected problems
Chlubnová, Tereza ; Hlubinka, Daniel (advisor) ; Hlávka, Zdeněk (referee)
Maximum likelihood estimation is one of the standard statistical methods for estimating an unknown parameter. It is widely used both because the estimator is straightforward to compute and because of the properties the method guarantees under certain conditions. In the thesis we prove consistency of the estimator under regularity conditions and uniqueness of the root of the likelihood equation. Under additional assumptions we show its asymptotic normality, and we extend this result from a one-dimensional to a multi-dimensional parameter. The main contribution of the thesis lies in the exercises, in which the maximum likelihood estimator cannot be expressed in closed form in general, yet its existence, uniqueness, and asymptotic normality can be shown. Moreover, we demonstrate how the asymptotic normality of the estimator is used for asymptotic hypothesis tests and confidence intervals for the parameter.
