National Repository of Grey Literature: 2 records found.
Entropy and discrete distributions
Kuc, Petr ; Jurečková, Jana (advisor) ; Hlubinka, Daniel (referee)
The Shannon entropy of a probability distribution is the expected value of the information gained by observing a random variable with that distribution. In this thesis we introduce a more general concept of information entropy and present Shannon entropy as an important special case. We then compute the Shannon entropy of several specific probability distributions, show which distributions have maximal entropy under given constraints, and introduce the principle of maximum entropy as a useful tool for estimating probability models. A further topic of this thesis is the principle of minimum divergence, by which an unknown probability distribution of a given random variable can be estimated to arbitrary accuracy, provided a sufficiently long random sample is available. Finally, we prove that the binomial distribution converges to the Poisson distribution in the Shannon divergence.
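As a concrete illustration of the concepts this abstract describes, here is a minimal Python sketch (not taken from the thesis): it computes Shannon entropy, shows that the uniform distribution attains the maximum for a fixed support size, and numerically checks the binomial-to-Poisson convergence. The divergence is measured here with the Kullback-Leibler divergence (relative entropy), on the assumption that this is what the abstract calls the Shannon divergence; the log-gamma formulation is just a stability choice for large n.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log, in nats)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def binom_log_pmf(n, prob, k):
    """log P(X = k) for X ~ Binomial(n, prob), via log-gamma for stability."""
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(prob) + (n - k) * math.log1p(-prob))

def poisson_log_pmf(lam, k):
    """log P(X = k) for X ~ Poisson(lam)."""
    return -lam + k * math.log(lam) - math.lgamma(k + 1)

def kl_binom_poisson(n, lam):
    """Kullback-Leibler divergence D(Binomial(n, lam/n) || Poisson(lam)),
    summed over the binomial support (the binomial pmf is 0 beyond n)."""
    total = 0.0
    for k in range(n + 1):
        lp = binom_log_pmf(n, lam / n, k)
        p = math.exp(lp)
        if p > 0.0:
            total += p * (lp - poisson_log_pmf(lam, k))
    return total

# A uniform distribution maximizes entropy for a fixed support size:
print(shannon_entropy([0.25] * 4))            # log 4 ~ 1.386 nats
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))  # strictly smaller

# The divergence shrinks as n grows, illustrating Binomial -> Poisson:
for n in (10, 100, 1000):
    print(n, kl_binom_poisson(n, 3.0))
```

Running the loop shows the divergence decreasing toward zero as n grows with the mean held fixed at 3, which is the convergence the thesis proves.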
Statistical processing of experimental data
NAVRÁTIL, Pavel
This thesis covers the theory of probability and of statistical sets. It presents solved and unsolved problems on probability, random variables and their distributions, random vectors, statistical sets, and regression and correlation analysis. Answers to the unsolved problems are provided.
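To make the regression and correlation topics from this abstract concrete, here is a minimal Python sketch (illustrative only, not from the thesis) of an ordinary least-squares line fit and the Pearson correlation coefficient; the sample data are invented for the example.

```python
import math

def linear_regression(xs, ys):
    """Ordinary least-squares fit of y ~ a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx           # slope
    a = my - b * mx         # intercept
    return a, b

def pearson_r(xs, ys):
    """Pearson correlation coefficient of the two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Invented sample data, roughly y = 2x plus noise:
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.0, 9.9]
a, b = linear_regression(xs, ys)
print(a, b, pearson_r(xs, ys))  # slope near 2, correlation near 1
```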
