National Repository of Grey Literature
MSAR BTF Model
Havlíček, Michal
The Bidirectional Texture Function (BTF) is the most advanced recent representation of the visual properties of material surfaces. A BTF specifies the changes in visual appearance due to varying illumination and viewing conditions. Such a function may be represented by thousands of images of the surface, taken under given illumination and viewing conditions, per material sample. The resulting BTF size, hundreds of gigabytes, precludes its direct rendering in graphical applications, so some compression of these data is necessary. This paper presents a novel probabilistic-model-based algorithm for realistic multispectral BTF texture modelling. This complex but efficient method combines several multispectral band-limited spatial factors and a corresponding range map to produce the required BTF texture. The proposed scheme enables a very high BTF texture compression ratio and in addition may be used to reconstruct non-measured parts of the BTF space.
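As a rough illustration of the model-based approach (not the authors' MSAR model, which is multispectral and also involves band-limited factors and a range map), the sketch below fits a single-band causal autoregressive (CAR) texture model and resynthesizes a texture of arbitrary size from it; the neighbourhood, the zero boundary, and all names are illustrative assumptions.

```python
import numpy as np

# Hypothetical simplification: a fixed 3-pixel causal neighbourhood.
NEIGHBOURS = ((-1, 0), (0, -1), (-1, -1))

def fit_car(texture):
    """Least-squares fit of CAR parameters; the handful of numbers returned
    replaces the measured image, which is where the compression comes from."""
    rows, cols = texture.shape
    X = [[texture[r + dr, c + dc] for dr, dc in NEIGHBOURS]
         for r in range(1, rows) for c in range(1, cols)]
    y = [texture[r, c] for r in range(1, rows) for c in range(1, cols)]
    X, y = np.asarray(X), np.asarray(y)
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma = np.std(y - X @ theta)            # driving-noise level
    return theta, sigma

def synthesize(theta, sigma, shape, seed=0):
    """Generate a texture of arbitrary size by running the fitted model
    (the zero boundary row/column is a sketch-level simplification)."""
    rng = np.random.default_rng(seed)
    out = np.zeros(shape)
    for r in range(1, shape[0]):
        for c in range(1, shape[1]):
            pred = sum(t * out[r + dr, c + dc]
                       for t, (dr, dc) in zip(theta, NEIGHBOURS))
            out[r, c] = pred + rng.normal(0.0, sigma)
    return out
```

Since only theta and sigma need to be stored instead of the measured images, the compression ratio of such a parametric model grows without bound in the number and size of the measured images.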
Detection of Copies Using Mathematical Processing of Images I: Theoretical Basis and Methodology
Blažek, Jan ; Hradilová, J. ; Zitová, Barbara ; Kamenický, Jan
We propose semi-automatic methods for the registration and comparison of digital images of fine art. The registration algorithm is based on estimating the parameters of a perspective transformation from control points. Image comparison is based on statistical analysis of the images, brightness normalization, and absolute deviation. The suitability of the comparison algorithms is demonstrated on examples; their output takes the form of intelligible difference maps.
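For context, a minimal sketch of the two building blocks as they are commonly implemented: a direct linear transform (DLT) estimate of the perspective transformation from at least four control-point pairs, and a brightness-normalized absolute-deviation map. The DLT construction is the textbook approach, not necessarily the exact estimator used by the authors.

```python
import numpy as np

def estimate_homography(src, dst):
    """Direct linear transform (DLT): estimate the 3x3 perspective
    transformation H mapping control points src -> dst (in homogeneous
    coordinates), from at least 4 point correspondences."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)                 # null vector of A
    return H / H[2, 2]

def difference_map(img_a, img_b):
    """Brightness-normalized absolute deviation between two images that
    have already been registered into the same frame."""
    a = (img_a - img_a.mean()) / img_a.std()
    b = (img_b - img_b.mean()) / img_b.std()
    return np.abs(a - b)
```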
Implementation of the algorithm MIDIA in the Google Spreadsheets
Boček, Pavel ; Vrbenský, Karel
The MIDIA algorithm was adapted for the Google Docs platform, which allows it to be used and applied through Internet sharing.
Third-degree stochastic dominance and DEA efficiency - relations and numerical comparison
Branda, Martin
We propose efficiency tests related to third-degree stochastic dominance (TSD). The tests are based on necessary conditions for TSD and on related mean-risk models. We test pairwise efficiency as well as portfolio efficiency with respect to full diversification of the available assets.
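A minimal numerical check of pairwise TSD, using the standard characterization via twice-cumulated empirical CDFs evaluated on a grid; the paper's tests (necessary conditions and mean-risk models) are different and exact, so this is only an orienting sketch.

```python
import numpy as np

def tsd_dominates(x, y, grid_size=500):
    """Grid check of pairwise third-degree stochastic dominance of sample
    x over sample y (equal scenario probabilities), via the standard
    characterization: E[x] >= E[y] and the twice-cumulated empirical CDF
    of x lies below that of y everywhere."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    grid = np.linspace(min(x.min(), y.min()), max(x.max(), y.max()), grid_size)
    dt = grid[1] - grid[0]
    F3 = []
    for z in (x, y):
        F = (z[:, None] <= grid).mean(axis=0)          # empirical CDF
        F3.append(np.cumsum(np.cumsum(F) * dt) * dt)   # integrate twice
    return x.mean() >= y.mean() and np.all(F3[0] <= F3[1] + 1e-12)
```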
Multifractal Height Cross-Correlation Analysis
Krištoufek, Ladislav
We introduce a new method for the detection of long-range cross-correlations and cross-multifractality: multifractal height cross-correlation analysis (MF-HXA). MF-HXA is a multivariate generalization of height-height correlation analysis. We show that long-range cross-correlations can be caused by a mixture of the following: long-range dependence of the separate processes and additional scaling of the covariances between the processes. A similar separation applies to cross-multifractality: the standard separation between distributional properties and correlations is enriched by dividing the correlations into auto-correlations and cross-correlations. We further apply the method to the returns and volatility of the NASDAQ and S&P500 indices as well as of Crude and Heating Oil futures and uncover some interesting results.
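The core of MF-HXA as described can be sketched in a few lines: compute the q-th order height cross-correlation function K_q(tau) and read the generalized cross-Hurst exponent h_xy(q) off its scaling. Detrending choices and the scaling range are simplified assumptions here.

```python
import numpy as np

def mf_hxa(x, y, q_list=(1, 2, 4), taus=None):
    """Minimal MF-HXA sketch: estimate the generalized cross-Hurst
    exponent h_xy(q) from the scaling of the q-th order height
    cross-correlation function
        K_q(tau) = mean(|x(t+tau)-x(t)|**(q/2) * |y(t+tau)-y(t)|**(q/2))
                 ~ tau**(q * h_xy(q)).
    x, y are integrated series (profiles); detrending and the choice of
    the scaling range are simplified away."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    if taus is None:
        taus = np.unique(np.logspace(0, np.log10(n // 4), 20).astype(int))
    h = {}
    for q in q_list:
        K = [np.mean(np.abs(x[t:] - x[:-t]) ** (q / 2)
                     * np.abs(y[t:] - y[:-t]) ** (q / 2)) for t in taus]
        h[q] = np.polyfit(np.log(taus), np.log(K), 1)[0] / q
    return h
```

A q-dependence of the estimated h_xy(q) then signals cross-multifractality; a constant h_xy(q) indicates monofractal cross-correlation.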
Elliptical Stable Distributions
Omelchenko, Vadym
The elliptical stable distributions form a symmetric subfamily of the stable distributions. Their advantage over general stable distributions lies in their ease of use and their close resemblance to the normal distribution. They allow the dependence structure of the margins to be represented by a matrix Q, just as in the case of the normal distribution. In general, the dependence structure between the margins of a stable distribution is given by a spectral measure, which may even be continuous; the required computations and approximations are so time-consuming that many practitioners avoid using general stable distributions. General stable distributions possess so many additional properties that they bear little resemblance to the multivariate normal distribution. Multivariate elliptical stable distributions, by contrast, can be simulated easily, and their parameters can be estimated by methods whose precision is almost the same as that of the maximum likelihood methodology.
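The easy simulation mentioned above rests on the sub-Gaussian representation X = sqrt(A) * G, with G ~ N(0, Q) and A a totally skewed positive (alpha/2)-stable variable. Below is a sketch using Kanter's representation, assuming the standard normalization E[exp(-t*A)] = exp(-t**(alpha/2)); function names are illustrative.

```python
import numpy as np

def kanter_positive_stable(rho, size, rng):
    """Kanter's representation of a positive stable variable S with
    Laplace transform E[exp(-t*S)] = exp(-t**rho), for 0 < rho < 1."""
    u = rng.uniform(0.0, np.pi, size)
    w = rng.exponential(1.0, size)
    a = (np.sin(rho * u) ** (rho / (1.0 - rho)) * np.sin((1.0 - rho) * u)
         / np.sin(u) ** (1.0 / (1.0 - rho)))
    return (a / w) ** ((1.0 - rho) / rho)

def elliptical_stable(alpha, Q, n, seed=0):
    """Sub-Gaussian alpha-stable sample: X = sqrt(A) * G with G ~ N(0, Q)
    and A positive (alpha/2)-stable, giving the characteristic function
    E[exp(i u.X)] = exp(-(u'Qu / 2)**(alpha/2)).  Requires 0 < alpha < 2."""
    rng = np.random.default_rng(seed)
    A = kanter_positive_stable(alpha / 2.0, n, rng)
    G = rng.multivariate_normal(np.zeros(Q.shape[0]), Q, size=n)
    return np.sqrt(A)[:, None] * G

# Example: 10,000 bivariate draws with alpha = 1.7.
X = elliptical_stable(1.7, np.array([[1.0, 0.5], [0.5, 1.0]]), 10_000)
```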
Bayesian classification of digital images by web application
Talich, M. ; Böhm, O. ; Soukup, Lubomír
The contribution introduces a web application for image classification that has been developed at the Research Institute of Geodesy, Topography and Cartography in the framework of the grant project InGeoCalc (supported by the Ministry of Education of the Czech Republic). The web application is intended to display, examine, and classify digital image data. The data are expected to be obtained from the Internet by means of Web Map Services (WMS) or from other, possibly non-registered, sources. Image data from different sources can be combined and presented as a composition of layers (coverages) with adjustable degrees of transparency. After the data are gathered, Bayesian (supervised) classification is applied to distinguish separate regions in the image. The user can choose between several classification methods and adjust the pertinent parameters. Furthermore, several basic follow-up analytical tools are offered, namely the computation of distances, areas, or perimeters related to the classified regions, and simple statistical summaries of the classification results (e.g. the distribution of classes, the percentage of non-classified regions, etc.). The classification results and registration parameters can be saved for further use. The web application is based on common Internet standards (HTML, JavaScript, SVG); the only requirement for running it is an up-to-date Internet browser supporting SVG (Scalable Vector Graphics). Typical usage involves land cover mapping based on satellite or aerial images. The application is available free of charge to any Internet user.
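A supervised Bayesian classifier of the kind such an application offers can be sketched as follows; Python is used here for brevity even though the application itself runs in the browser, and the multiband Gaussian class-conditional model is an assumption, presumably one of the several methods offered.

```python
import numpy as np

def train(pixels_by_class):
    """Fit a Gaussian class-conditional model (mean, covariance, prior)
    per class; pixels_by_class maps label -> (N, bands) training array,
    with bands >= 2 assumed."""
    total = sum(len(p) for p in pixels_by_class.values())
    return {label: (p.mean(axis=0), np.cov(p, rowvar=False), len(p) / total)
            for label, p in pixels_by_class.items()}

def classify(image, model):
    """Assign each pixel of an (H, W, bands) image to the class with the
    largest posterior (Bayes rule); the shared -0.5*bands*log(2*pi) term
    is omitted since it does not affect the argmax."""
    h, w, bands = image.shape
    X = image.reshape(-1, bands).astype(float)
    scores = []
    for mean, cov, prior in model.values():
        diff = X - mean
        inv = np.linalg.inv(cov)
        scores.append(-0.5 * np.einsum('ij,jk,ik->i', diff, inv, diff)
                      - 0.5 * np.log(np.linalg.det(cov)) + np.log(prior))
    idx = np.argmax(np.stack(scores), axis=0)
    return np.array(list(model.keys()))[idx].reshape(h, w)
```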
Using indicators of ecological stability in stochastic programming
Houda, Michal
When a larger construction project is planned, EU law imposes the so-called EIA process - an evaluation of the possible impacts of the construction on the environment and population health, grouped into several categories. The outputs of the EIA process are recommendations to the investors to compensate for the negative impacts of the construction by additional arrangements. In our contribution we develop an approach to modelling the expenses required to comply with the EIA rules using stochastic programming tools: in particular, we represent uncertainty in the parameters by probability distributions, and the subjective utility function representing the ecological demands is modelled via so-called indicators of ecological stability. The model takes into account budget limitations, several legislative obligations, and other ecological aspects; the goal is to help choose the optimal compensating constructions and arrangements. The resulting stochastic programming model can be seen as parallel to the V@R (value-at-risk) problem.
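One way to read such a model is as a scenario-based chance constraint of V@R type: stay within budget while keeping the ecological-stability indicator above a target in a prescribed fraction of scenarios. The exhaustive search and all names below are illustrative assumptions, not the paper's actual formulation.

```python
import itertools
import numpy as np

def choose_arrangements(cost, gain, budget, target, eps=0.1):
    """Brute-force sketch of a scenario-based, V@R-type model: choose the
    cheapest subset of compensating arrangements that stays within budget
    and keeps the indicator of ecological stability above `target` in at
    least a (1 - eps) fraction of scenarios.

    cost[i]   - cost of arrangement i
    gain[s,i] - uncertain contribution of arrangement i to the indicator
                under scenario s (scenarios taken as equiprobable)"""
    cost, gain = np.asarray(cost), np.asarray(gain)
    best, best_cost = None, np.inf
    for subset in itertools.product([0, 1], repeat=len(cost)):
        x = np.array(subset)
        c = cost @ x
        if c >= best_cost or c > budget:
            continue
        if np.mean(gain @ x >= target) >= 1.0 - eps:   # chance constraint
            best, best_cost = x, c
    return best, best_cost
```

In practice such a model would be solved as a mixed-integer stochastic program rather than by enumeration; the loop above only makes the chance constraint explicit.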
Modeling multivariate volatility using wavelet-based realized covariance estimator
Baruník, Jozef ; Vácha, Lukáš
The study of covariation has become one of the most active and successful areas of research in time series econometrics and economic forecasting in recent decades. Our work provides a complete theory for realized covariation estimation, generalizing current knowledge and bringing the estimation into the time-frequency domain for the first time. The results generalize the popular realized volatility framework by adding robustness to noise as well as to jumps, and the ability to measure realized covariance not only in the time domain but also in the frequency domain. A notable contribution is also made by the application of the presented theory: our time-frequency estimators not only provide more efficient estimates but also decompose the realized covariation into arbitrarily chosen investment horizons. The results thus bring a better understanding of the dynamics of dependence between stock markets.
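A stripped-down version of the scale-by-scale idea, assuming a Haar MODWT (whose circular filtering decomposes covariance exactly across scales); the paper's estimators additionally handle microstructure noise and jumps, which this sketch omits.

```python
import numpy as np

def haar_modwt_details(x, levels):
    """Haar MODWT detail coefficients with circular filtering; returns a
    (levels, N) array of wavelet coefficients, one row per scale."""
    v = np.asarray(x, dtype=float)
    details = []
    for j in range(levels):
        shift = 2 ** j
        w = (v - np.roll(v, shift)) / 2.0   # detail at scale 2**j
        v = (v + np.roll(v, shift)) / 2.0   # smooth passed to next level
        details.append(w)
    return np.vstack(details)

def wavelet_realized_covariance(r1, r2, levels=4):
    """Decompose the realized covariance of two intraday return series into
    per-scale (investment-horizon) contributions:
        sum_t r1_t * r2_t = sum_j sum_t W1_{j,t} * W2_{j,t} + smooth term.
    A noise- and jump-free sketch of the time-frequency idea."""
    W1 = haar_modwt_details(r1, levels)
    W2 = haar_modwt_details(r2, levels)
    return (W1 * W2).sum(axis=1)            # one contribution per scale
```

Summing the returned vector, together with the discarded smooth term, recovers the ordinary realized covariance.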
Tools for Decision Making under Uncertainty
Sečkárová, Vladimíra
In this paper we focus on two often considered, distinct aims: maximizing a utility function (e.g. an investment profit) and obtaining a more reliable global description of the considered situation based on observed data (e.g. the final outcome of merging databases). In both cases we face the problem that the data are unreliable, since they contain uncertainty caused by their source (i.e. a human being). For the former aim, a game-theoretic reformulation of the decision-making task offers a smoother way to reach the optimum; for the latter, a merging procedure (also called fusion) that processes the data can help. This paper describes four recently developed methods dealing with decision making under uncertainty in these two directions, and one tool used to compare the fusion algorithms.
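For the fusion direction, a generic illustration of divergence-based merging of discrete probability vectors: minimizing the two directions of the Kullback-Leibler divergence yields the linear and geometric pools, respectively. This is not any of the four specific methods compared in the paper.

```python
import numpy as np

def merge_sources(P, weights=None, pool="linear"):
    """Merge discrete probability vectors from several unreliable sources
    into one global description.  P is (n_sources, n_outcomes), each row a
    probability vector with strictly positive entries; weights sum to 1.
    Minimizing sum_i w_i * KL(p_i || q) over q gives the linear pool;
    minimizing sum_i w_i * KL(q || p_i) gives the normalized geometric
    pool."""
    P = np.asarray(P, dtype=float)
    w = np.full(len(P), 1.0 / len(P)) if weights is None else np.asarray(weights)
    if pool == "linear":
        return w @ P
    q = np.exp(w @ np.log(P))               # geometric pool
    return q / q.sum()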
