National Repository of Grey Literature: 2 records found
Optimality conditions for maximization of the information divergence from an exponential family
Matúš, František
The information divergence of a probability measure P from an exponential family E over a finite set is defined as the infimum of the divergences of P from Q subject to Q in E. All directional derivatives of the divergence from E are found explicitly. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. First-order conditions for P to be a maximizer of the divergence from E are presented, including new conditions for the case when P is not projectable to E.
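As a toy illustration of the infimum definition (not taken from the paper): for the independence model on a 2x2 table, which is an exponential family over a four-element set, the information projection of P onto E is the product of P's marginals, so the divergence of P from E equals the mutual information of P. A minimal numerical sketch, assuming this standard example:

```python
import numpy as np

def divergence_from_independence(P):
    """D(P||E) = inf over Q in E of D(P||Q), where E is the 2x2
    independence model; the minimizer Q is the product of marginals."""
    px = P.sum(axis=1, keepdims=True)   # row marginals, shape (2, 1)
    py = P.sum(axis=0, keepdims=True)   # column marginals, shape (1, 2)
    Q = px * py                         # the information projection of P onto E
    mask = P > 0                        # 0 * log 0 is taken as 0
    return float(np.sum(P[mask] * np.log(P[mask] / Q[mask])))

# A correlated P has positive divergence from the independence family;
# this value is exactly the mutual information of P.
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])
print(divergence_from_independence(P))  # about 0.1927 nats
```

For a P that already factorizes, the divergence is zero, consistent with P lying in the closure of E.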
On maximization of the information divergence from an exponential family
Matúš, František ; Ay, N.
The information divergence of a probability measure P from an exponential family E over a finite set is defined as the infimum of the divergences of P from Q subject to Q in E. For convex exponential families, the local maximizers of this function of P are found. A general exponential family E of dimension d is enlarged to an exponential family E* of dimension at most 3d+2 such that the local maximizers have zero divergence from E*.
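To make the maximization problem concrete (an illustration under my own assumptions, not the paper's construction): for the independence model on a 2x2 table, the divergence from E equals mutual information, and on diagonal distributions P(t) = [[t, 0], [0, 1-t]] it reduces to the binary entropy H(t), so the maximum log 2 is attained at t = 1/2. A sketch scanning this one-parameter family:

```python
import numpy as np

def mutual_information(P):
    """D(P||E) for E = 2x2 independence model, i.e. mutual information."""
    px = P.sum(axis=1, keepdims=True)
    py = P.sum(axis=0, keepdims=True)
    mask = P > 0
    return float(np.sum(P[mask] * np.log(P[mask] / (px * py)[mask])))

# Scan the diagonal distributions P(t); here D(P(t)||E) = H(t),
# the binary entropy, maximized at t = 0.5 with value log 2.
ts = np.linspace(0.01, 0.99, 99)
vals = [mutual_information(np.array([[t, 0.0], [0.0, 1.0 - t]])) for t in ts]
t_best = ts[int(np.argmax(vals))]
print(t_best, max(vals))  # 0.5 and log 2 ~ 0.6931
```

The maximizer P(1/2) is supported on two points and is not in E, matching the theme that maximizers of the divergence concentrate on small supports.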
