National Repository of Grey Literature: 3 records found
Multivariate goodness-of-fit tests
Kuc, Petr ; Hlávka, Zdeněk (advisor) ; Antoch, Jaromír (referee)
In this thesis we introduce, implement and compare several multivariate goodness-of-fit tests. First, we focus on universal multivariate tests that place no assumptions on parametric families of null distributions. We then turn to testing multivariate normality and, using Monte Carlo simulations, compare the power of five different tests of bivariate normality against several alternatives. Next, we describe the multivariate skew-normal distribution and propose a new test of multivariate skew-normality based on empirical moment generating functions. Finally, we compare its power with that of other tests of multivariate skew-normality.
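The Monte Carlo power comparison described in this abstract is straightforward to sketch. The record does not name the five bivariate normality tests, so the Python snippet below uses Mardia's skewness test and an exponential alternative purely as stand-ins: a minimal illustration of estimating a test's size under the null and its power against one alternative, not the author's actual study design.

import numpy as np
from scipy import stats

def mardia_skewness(x):
    # Mardia's multivariate skewness b_{1,p} for an (n, p) sample.
    n, p = x.shape
    xc = x - x.mean(axis=0)
    s_inv = np.linalg.inv(np.cov(x, rowvar=False, bias=True))
    g = xc @ s_inv @ xc.T            # matrix of Mahalanobis cross-products
    return (g ** 3).sum() / n ** 2

def mc_rejection_rate(sampler, n=50, reps=2000, alpha=0.05, p=2, seed=0):
    # Monte Carlo estimate of how often n * b_{1,p} / 6 exceeds the asymptotic
    # chi-squared critical value with p(p+1)(p+2)/6 degrees of freedom.
    rng = np.random.default_rng(seed)
    crit = stats.chi2.ppf(1 - alpha, p * (p + 1) * (p + 2) / 6)
    hits = sum(n * mardia_skewness(sampler(rng, n)) / 6 > crit
               for _ in range(reps))
    return hits / reps

normal = lambda rng, n: rng.standard_normal((n, 2))      # null: bivariate normal
skewed = lambda rng, n: rng.exponential(size=(n, 2))     # a skewed alternative
print("size under H0:", mc_rejection_rate(normal))       # should be near 0.05
print("power:        ", mc_rejection_rate(skewed))       # should be near 1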
Entropy and discrete distributions
Kuc, Petr ; Jurečková, Jana (advisor) ; Hlubinka, Daniel (referee)
The Shannon entropy of a probability distribution is the expected value of the information gained by observing a random variable with that distribution. In this thesis we introduce a more general concept of information entropy and present Shannon entropy as an important special case. We then compute the Shannon entropy of several specific probability distributions, show which distributions have maximal entropy under given constraints, and introduce the principle of maximum entropy as a useful tool for estimating probability models. We also introduce the principle of minimum divergence, by which an unknown probability distribution of a given random variable can be estimated to arbitrary accuracy given a sufficiently long random sample. Finally, we prove that the binomial distribution converges to the Poisson distribution in the Shannon divergence.
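The closing claim, convergence of the binomial to the Poisson distribution in divergence, can be checked numerically. The sketch below assumes that the "Shannon divergence" of the abstract refers to the Kullback-Leibler divergence (relative entropy); it also computes a truncated-support Shannon entropy, matching the abstract's opening definition. Both are illustrations, not the thesis's proofs.

import numpy as np
from scipy import stats

def shannon_entropy(p):
    # H(p) = -sum p_k log p_k in nats, over the support of p.
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def kl_divergence(p, q):
    # D(p || q) = sum p_k log(p_k / q_k), over the support of p.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

lam = 3.0
# Entropy of Poisson(3), truncated at k = 60 (the omitted tail is negligible).
print("H(Poisson(3)) ~", shannon_entropy(stats.poisson.pmf(np.arange(60), lam)))

# Binomial(n, lam/n) should approach Poisson(lam): the divergence shrinks with n.
for n in (10, 100, 1000):
    k = np.arange(n + 1)
    d = kl_divergence(stats.binom.pmf(k, n, lam / n), stats.poisson.pmf(k, lam))
    print(f"n={n:5d}  D(Binom || Poisson) = {d:.6f}")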
