National Repository of Grey Literature

Contribution to Problems of Zero-Sequence Component in Electrical Machines
Schreier, Luděk ; Chomát, Miroslav
For their reliability and relatively low production costs, three-phase induction machines are widely used as drive units in variable-speed drives, with the stator fed from a solid-state frequency converter. During operation of such drive systems, however, part of the feeding converter can fail, e.g. one of its legs. So that the drive can be kept in operation in some of these cases, the converter is reconfigured.

Development of POS system application with support of EET for retail trade
Nguyen Manh, Tho ; Pavlíčková, Jarmila (advisor) ; Bruckner, Tomáš (referee)
The goal of this thesis is to design and develop application software for a POS system. The main requirement on the developed system is support for online registration of sales, meeting the requirements of Act No. 112/2016 Coll., on Registration of Sales. In this thesis the POS system is designed to support retail trade processes. The theoretical part describes the POS system and its components. To address online registration of sales, an analysis of the legislation and of the POS system market was carried out. The thesis also presents an analysis of processes and requirements, a solution design, and the implementation of the application in Java. The functionality of the developed application was verified using test cases. The final chapter summarizes the thesis and outlines possible future development. The output of this work is a functional POS application, which will be available for download under an open-source license. The thesis can also serve as a business guide to POS systems and the online registration of sales.

Management of IS/ICT with a focus on sourcing of information system services
Šebesta, Michal ; Voříšek, Jiří (advisor) ; Havlíček, Zdeněk (referee) ; Příklenk, Oldřich (referee) ; Král, Jaroslav (referee)
Research on outsourcing has been around for several decades, while the recent evolution of the information systems discipline towards ICT service commoditization significantly changes the context of decision-making. Services available on demand via the Internet allow organizations to implement the functions they need in a fraction of the time. This trend represents an opportunity for organizations seeking to use advanced ICT services without major investments. The problem is the current lack of guidelines and tools for managing ICT services and their outsourcing. Given the trends on the ICT service market, it is expected that much of IT management in the future will revolve around ICT services and service-level structures. Currently available methods are either too broad or cover only a small part of the problem. Ad-hoc or unsound decisions in this area can cause major complications in terms of quality, usability, and integration, and consequently influence the total cost of organizational IT. Organizations need to either revise existing models or propose and implement completely new models to manage their IS/ICT. This thesis deals with the management of IS/ICT with a focus on the outsourcing of ICT services. It discusses the sourcing models available in the literature and links them to various interconnected areas. Based on these areas, it presents an integrated view of IT outsourcing strategies. Most importantly, the thesis proposes an original concept for decision-making about the outsourcing of ICT services, named the SOURCER framework. This approach builds on the presented outsourcing strategies and introduces a comprehensive methodology and decision-making criteria that assist organizations in selecting ICT services so as to maintain and manage the most suitable ICT service portfolio. The decision-making is based on four essential viewpoints: function, costs, time, and quality.
These viewpoints are discussed and individually analyzed, and serve as a basis for further research. The whole framework is developed and validated according to the Design Science Research Methodology (DSRM). Individual components are evaluated using a survey among a group of selected IT managers. Proof of concept is then established by a case study of the framework's use in a real organization. This case study covers strategy specification, business-IT alignment, specification of the service architecture and its interconnections, outsourcing, and management of the ICT service portfolio.
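Purely as an illustration of multi-viewpoint decision-making of the kind the abstract describes (this is not the SOURCER framework itself; the weights, candidates, and scores below are invented example values), a sourcing choice over the four viewpoints can be sketched as a weighted score:

```python
# Hypothetical sketch: ranking sourcing options by a weighted sum over the
# four viewpoints named in the abstract. All numbers are made-up examples.

def score_service(scores, weights):
    """Weighted sum of per-viewpoint scores (each on a 0-10 scale)."""
    return sum(scores[v] * weights[v] for v in weights)

# example weights reflecting one organization's priorities (hypothetical)
weights = {"function": 0.35, "costs": 0.25, "time": 0.15, "quality": 0.25}

# example assessments of two sourcing options (hypothetical)
candidates = {
    "in-house":   {"function": 8, "costs": 4, "time": 5, "quality": 8},
    "outsourced": {"function": 7, "costs": 7, "time": 8, "quality": 6},
}

ranked = sorted(candidates,
                key=lambda c: score_service(candidates[c], weights),
                reverse=True)
print(ranked[0])  # the highest-scoring sourcing option
```

A real framework would of course go far beyond a single weighted sum, but the sketch shows how the four viewpoints can be made commensurable for a ranking decision.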

Clustering and regression analysis of micro panel data
Sobíšek, Lukáš ; Pecáková, Iva (advisor) ; Komárek, Arnošt (referee) ; Brabec, Marek (referee)
The main purpose of panel studies is to analyze changes in the values of studied variables over time. In micro panel research, a large number of elements is observed periodically within a relatively short time period of just a few years, and the number of repeated measurements is small. This dissertation deals with contemporary approaches to regression and clustering analysis of micro panel data. One approach to micro panel analysis is to use multivariate statistical models originally designed for cross-sectional data and modify them to take the within-subject correlation into account. The thesis summarizes the available tools for regression analysis of micro panel data. The known and currently used linear mixed-effects models for a normally distributed dependent variable are recapitulated. Besides that, newer approaches for the analysis of a response variable with a non-normal distribution are presented, including the generalized marginal linear model, the generalized linear mixed-effects model, and the Bayesian modelling approach. In addition to describing these models, the thesis also includes a brief overview of their implementation in the R software. A difficulty with regression models adjusted for micro panel data is the ambiguity of their parameter estimation. This thesis proposes a way to improve the estimates through cluster analysis, and therefore also contains a description of methods for clustering micro panel data. Because the supply of such methods is limited, the main goal of this thesis is to devise its own two-step approach for clustering micro panel data. In the first step, the panel data are transformed into a static form using a set of proposed characteristics of dynamics. These characteristics represent different features of the time course of the observed variables.
In the second step, the elements are clustered by conventional spatial clustering techniques (agglomerative clustering and C-means partitioning). The clustering is based on a dissimilarity matrix of the values of the clustering variables calculated in the first step. Another goal of this thesis is to find out whether the suggested procedure improves the quality of regression models for this type of data. By means of a simulation study, the procedure is compared to the procedure implemented in the kml package of the R software, as well as to the clustering characteristics proposed by Urso (2004). The simulation study demonstrated better results for the proposed combination of clustering variables than for the other combinations currently used. A corresponding script written in the R language is another output of this thesis; it is available on the attached CD and can be used for analyses of readers' own micro panel data.
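The two-step idea (collapse each subject's short series into static features, then cluster on a dissimilarity matrix) can be sketched as follows. This is a minimal illustration, not the thesis procedure: the feature set (level, linear trend, volatility) and the toy panel are assumptions, and the thesis proposes its own characteristics of dynamics.

```python
# Hypothetical sketch of a two-step clustering of micro panel data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# toy micro panel: rows = subjects, columns = a few repeated measurements
panel = np.array([
    [1.0, 1.2, 1.4, 1.6],   # slowly rising
    [1.1, 1.3, 1.5, 1.7],   # slowly rising
    [5.0, 4.0, 3.1, 2.0],   # falling
    [5.2, 4.1, 3.0, 2.1],   # falling
])
t = np.arange(panel.shape[1])

def dynamics_features(y):
    """Example 'characteristics of dynamics': level, linear trend, volatility."""
    slope = np.polyfit(t, y, 1)[0]
    return [y.mean(), slope, y.std()]

# Step 1: transform the panel into a static feature matrix.
features = np.array([dynamics_features(row) for row in panel])

# Step 2: agglomerative clustering on the pairwise dissimilarity matrix.
clusters = fcluster(linkage(pdist(features), method="average"),
                    t=2, criterion="maxclust")
print(clusters)  # the rising and the falling subjects separate into two groups
```

With real data, the choice of dynamics characteristics and the linkage method would follow the thesis's own recommendations rather than these defaults.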

Usage of unstructured data in Business Intelligence
Rakhmanova, Malika ; Šperková, Lucie (advisor) ; Karkošková, Soňa (referee)
The aim of the thesis is to identify the main trends in the Business Intelligence market related to unstructured data, to describe the possibilities for integrating unstructured data, to clarify what impact the results obtained with these solutions can have on a company, and to show how an analysis of unstructured data can generally be incorporated into BI. Another aim is to illustrate the current state of unstructured data processing on the example of a BI system. The thesis is divided into several parts. The first part describes the Business Intelligence area and the basic components of Business Intelligence, and identifies market trends. The next part separates data into structured and unstructured, describes how unstructured data can be accessed and analysed, and discusses their place in BI systems. This concludes the block on unstructured data and opens a description of the enhanced version of BI. Finally, the current market situation and BI tools that handle unstructured data are introduced, with an overview of how these tools approach the analysis of unstructured data. Existing literature and professional, freely available Internet resources were used in writing the thesis. Its purpose is to serve as a source of information for quickly getting oriented in the current situation, as a guide to the world of BI solutions, and as a demonstration for potential users of the options and functionality of these BI solutions.

Design of marketing strategy in the company Imperium Finance
Petrovič, Marko ; Čermák, Radim (advisor) ; Sova, Martin (referee)
Internet marketing is currently one of the most important components of marketing. As a whole, however, it is much more than mere web design, search engine optimization (SEO), or simple PPC advertising. Until recently, most companies made do with a website, and that was the whole of their presence on the Internet. Yet the Internet is one of the fastest growing technologies today, and the number of its users grows every year. New technologies constantly appear that enable a better user experience and bring new ways of presentation and promotion. Many companies make mistakes early on: they underestimate preparation and do not know their goals, customers, or competitors. Internet marketing is a process whose content is recognition, analysis, design, implementation, and finally verification that the goals of the organization have been achieved. The theoretical part defines Internet marketing and its advantages over offline marketing, briefly summarizes the history and major milestones of Internet marketing that shaped its present form, describes current trends in Internet marketing, and expands in detail on selected Internet marketing tools and tools for evaluating the success of campaigns. The practical part deals with the design of a marketing strategy for the company Imperium Finance. The main objective is to analyze the requirements of Imperium Finance, draw specific conclusions from the analysis, and devise a marketing strategy that will support the vision and strategy of the company in the following period. All steps were carried out within Imperium Finance on real data from active clients and tested in practice. The result is a proposed marketing strategy and a marketing plan for the company. The biggest benefit of this work is thus the real application of theoretical knowledge in practice.

Didactic Analysis of Civics Textbooks (Media Education)
Novotná, Jana ; Jirásková, Věra (advisor) ; Dvořáková, Michaela (referee)
The thesis is divided into a theoretical and an empirical part. In the theoretical part I deal with textbooks, their didactic analysis, and the ways textbooks can be used by both teachers and students. My main concern was the topic of Media Education. In the empirical part I focused on a didactic analysis of textbooks used for teaching subjects such as Citizenship Education, Civics, and Social Science, concentrating primarily on selected components of the didactic analysis.

Properties of Aerosol Produced by Laser Ablation of Standard Materials for ICP-MS Analysis
Holá, M. ; Nováková, H. ; Ondráček, Jakub ; Vojtíšek, M. ; Kanický, V.
Laser ablation (LA), together with inductively coupled plasma mass spectrometry (ICP-MS) as a detection system, has become a routine method for the direct analysis of various solid samples. The product of laser ablation contains a mixture of vapour, droplets, and solid particles. All components are finally transported to the plasma by a carrier gas as a dry aerosol consisting mainly of agglomerates of primary nanoparticles. In general, characterisation of aerosols by their particle size distribution (PSD) represents an indispensable tool for fundamental studies of the interaction of laser radiation with various materials. The particle size distribution of the dry aerosol originating from laser ablation of a standard material was monitored by two aerosol spectrometers, a Fast Mobility Particle Sizer (EEPS) and a Scanning Mobility Particle Sizer (SMPS), simultaneously with the laser ablation ICP-MS analysis.

New Methods for Increasing Efficiency and Speed of Functional Verification
Zachariášová, Marcela ; Dohnal, Jan (referee) ; Steininger, Andreas (referee) ; Kotásek, Zdeněk (advisor)
In the development of current digital systems, e.g. embedded systems and computer hardware, it is necessary to seek approaches to increase their reliability. One option is to increase the effectiveness and speed of the verification processes performed in early design phases. This dissertation focuses on a verification approach called functional verification. Several challenges and problems concerning the effectiveness and speed of functional verification are identified and then addressed in the goals of the dissertation. The first goal focuses on reducing simulation time during the verification of complex systems. The reason is that simulation of an inherently parallel hardware system takes a very long time compared to its run in real hardware. An optimization technique is therefore proposed that places the verified system into an FPGA accelerator while part of the verification environment still runs in simulation. This relocation significantly reduces the simulation overhead. The second goal deals with manually prepared verification environments, which represent a significant limitation of verification productivity. This overhead is not necessary, because most verification environments have a very similar structure: they use components of standard verification methodologies, and these components are only adapted to the verified system. The second optimization technique therefore analyzes the system description at a higher level of abstraction and automates the creation of verification environments by generating them from this high-level description. The third goal examines how verification completeness can be achieved through intelligent automation. Verification completeness is typically measured by various coverage metrics, and verification ends when a sufficiently high coverage level is reached. The third optimization technique therefore drives the generation of inputs for the verified system so that these inputs activate as many coverage points as possible at once, and so that convergence towards maximum coverage is as fast as possible.
A genetic algorithm, adapted for functional verification and with its parameters tuned for this domain, is used as the main optimization means. It runs in the background of the verification process, analyzes the achieved coverage, and based on it dynamically adjusts the constraints for the input generator. These constraints are represented by probabilities that determine the selection of suitable values from the input domain. The fourth goal discusses whether inputs from functional verification can be reused for regression testing and optimized so that testing is as fast as possible. In functional verification it is common for the inputs to be highly redundant, since they are produced by a generator. For regression tests, however, this redundancy is not needed and can therefore be eliminated, while ensuring that the coverage level achieved by the optimized set is the same as that of the original one. The fourth optimization technique reflects this and again uses a genetic algorithm as the optimization means, though this time it is not integrated into the verification process but applied after its completion. It very quickly removes redundancy from the original input set, and the resulting simulation time is thus considerably reduced.
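The fourth technique (shrinking a recorded input set while preserving its coverage) can be illustrated with a toy genetic algorithm. This is a sketch under invented assumptions, not the thesis implementation: the coverage sets, fitness function, and GA parameters below are all made-up examples.

```python
# Toy GA: find a subset of recorded inputs that keeps full coverage
# while minimizing the subset size. All data here are hypothetical.
import random

random.seed(1)

# coverage points hit by each of 6 recorded inputs (hypothetical)
coverage = [{1, 2}, {2, 3}, {1, 2, 3}, {4}, {4, 5}, {5}]
full = set().union(*coverage)

def covered(mask):
    """Union of coverage points hit by the selected inputs."""
    return set().union(*(coverage[i] for i, on in enumerate(mask) if on), set())

def fitness(mask):
    if covered(mask) != full:
        return sum(mask) - 2 * len(coverage)   # infeasible: always negative
    return len(coverage) - sum(mask)           # feasible: fewer inputs = better

def mutate(mask):
    i = random.randrange(len(mask))
    child = list(mask)
    child[i] = 1 - child[i]
    return tuple(child)

# elitist evolution over subsets encoded as 0/1 selection masks
pop = [tuple(random.randint(0, 1) for _ in coverage) for _ in range(20)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(10)]

best = max(pop, key=fitness)
reduced = [i for i, on in enumerate(best) if on]
print(reduced)  # a small subset that still hits every coverage point
```

The real technique additionally adapts the GA to the verification domain and operates on actual coverage databases; the sketch only shows the subset-selection principle.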

Digital circuits test optimization by multifunctional components
Stareček, Lukáš ; Gramatová, Elena (referee) ; Kubátová, Hana (referee) ; Kotásek, Zdeněk (advisor)
This thesis deals with the possibilities of digital circuit test optimization using multifunctional logic gates. The most important part of the thesis is the explanation of the optimization principle, which is also described by a formal mathematical apparatus. Based on this apparatus, the work presents several options: an optimization of testability analogous to inserting test points, and a simple methodology based on SCOAP. The focus of the work is a methodology created to optimize circuit tests, implemented in the form of software tools. The thesis presents the results of using these tools to reduce test vector volume while maintaining fault coverage on various circuits, including circuits from the ISCAS 85 benchmark set. Part of the work is devoted to the various principles and technologies for creating multifunctional logic gates. Selected gates built in these technologies are the subject of simulations of their electronic properties in SPICE. Based on the principles of the presented methodology and the results of the multifunctional gate simulations, an analysis was also made of various problems, such as the validity of the modified circuit test and the suitability of each multifunctional gate technology for the methodology. The results of the analysis and experiments confirm that multifunctional logic gates make it possible to optimize circuit diagnostic properties so that the required modification of the circuit test parameters is achieved with minimum impact on the quality and credibility of the tests.
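The goal of reducing test vector volume while maintaining fault coverage can be illustrated with a simple greedy compaction over fault-detection data. This is only a generic sketch with invented example data; the thesis's own approach, based on multifunctional gates, is not reproduced here.

```python
# Hypothetical sketch: greedy test-set compaction that preserves fault coverage.

# faults detected by each test vector (invented example data)
detects = {
    "t0": {"f1", "f2"},
    "t1": {"f2"},
    "t2": {"f3", "f4"},
    "t3": {"f1", "f4"},
}

full_coverage = set().union(*detects.values())

# pick vectors greedily, preferring those that detect the most faults,
# and keep a vector only if it detects some still-uncovered fault
kept, remaining = [], set(full_coverage)
for name in sorted(detects, key=lambda n: len(detects[n]), reverse=True):
    if detects[name] & remaining:
        kept.append(name)
        remaining -= detects[name]

# the compacted set detects exactly the same faults as the original one
assert set().union(*(detects[n] for n in kept)) == full_coverage
print(kept)
```

Here two of the four vectors suffice; on real circuits such as the ISCAS 85 benchmarks, fault lists come from a fault simulator, and the greedy pass would only be a baseline against more sophisticated compaction.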