National Repository of Grey Literature : 49 records found
Variants of data quality management within the regulation Solvency II
Pastrňáková, Alena ; Bína, Vladislav (advisor) ; Přibil, Jiří (referee)
The diploma thesis deals with data quality in connection with the legal requirements of the Solvency II regulation, which insurance companies must meet in order to keep their licences. The aim of this thesis is to consider the opportunities and impacts of implementing data quality for Solvency II. All data quality requirements of the regulation were specified and supplemented with possible ways to meet them. Related data quality areas were also described. Sample variants of manual, partially automated and fully automated solutions were compared with regard to cost and time, based on knowledge and acquired information. The benefit of this thesis is an evaluation of the possible positive and negative impacts of implementing data quality for Solvency II, taking into account the possibility of introducing data quality across the entire company. The general solution variants can be used for decision-making on implementing data quality in most companies outside the insurance industry.
Big Data Governance
Blahová, Leontýna ; Pejčoch, David (advisor) ; Kyjonka, Vladimír (referee)
This master thesis is about Big Data Governance and about the software used for this purpose. Because Big Data represent both a huge opportunity and a risk, I wanted to map products that can easily be used for Data Quality and Big Data Governance on a single platform. This thesis does not stay at the level of theoretical knowledge; it also evaluates five key products (from my point of view). I defined requirements for each domain and then set up the weights and points. The main objective is to evaluate the capabilities of the software products and compare them.
IBM Cognos Report Studio jako efektivní nástroj reportingu v oblasti lidského kapitálu (IBM Cognos Report Studio as an Effective Tool for Human Capital Reporting)
Zinchenko, Yulia ; Pour, Jan (advisor) ; Obolenskiy, Vladimir (referee)
The main topic discussed in this diploma thesis is corporate reporting on Human Capital using Business Intelligence tools, specifically IBM Cognos Report Studio. One of the objectives is to show a step-by-step methodology for creating a complex dynamic report, which includes data structure modeling, layout design and quality checking. Another objective is to conduct a Cost-Benefit Analysis for a real-life project focused on recreating an Excel-based report in a Cognos-based environment in order to automate information flows. An essential part of the diploma thesis is the theoretical background of the Business Intelligence aspects of data quality and visualization, as well as the purposes of human capital reporting and a description of appropriate KPIs. The objectives are addressed by analyzing and researching resources related to the topics described above, and by using IBM Cognos Report Studio provided by one of the major companies in the financial advisory field. This diploma thesis represents relevant reading for those who are interested in the real-life application of data quality improvement and information flow automation using Business Intelligence reporting tools.
Towards Complex Data and Information Quality Management
Pejčoch, David ; Rauch, Jan (advisor) ; Máša, Petr (referee) ; Novotný, Ota (referee) ; Kordík, Pavel (referee)
This work deals with the issue of Data and Information Quality. It critically assesses the current state of knowledge within the various methods used for Data Quality Assessment and Data (Information) Quality improvement. It proposes new principles where this critical assessment revealed gaps. The main idea of this work is the concept of Data and Information Quality Management across the entire universe of data. This universe represents all data sources with which the respective subject comes into contact and which are used within its existing or planned processes. For all these data sources, this approach considers setting a consistent set of rules, policies and principles with respect to the current and potential benefits of these resources, while also taking into account the potential risks of their use. An imaginary red thread running through the text is the importance of additional knowledge within the process of Data (Information) Quality Management. The introduction of a knowledge base oriented to supporting Data (Information) Quality Management (QKB) is therefore one of the fundamental principles of the set of best practices proposed by the author.
Master Data Quality and Data Synchronization in FMCG
Tlučhoř, Tomáš ; Chlapek, Dušan (advisor) ; Kučera, Jan (referee)
This master thesis deals with the quality of master data at retailers and suppliers of fast moving consumer goods. The objective is to map the flow of product master data in the FMCG supply chain and identify the causes of its poor quality. Emphasis is placed on analyzing the listing process for new items at retailers. Global data synchronization is one of the tools for increasing the efficiency of the listing process and improving master data quality. Therefore, another objective is to clarify the causes of the low adoption of global data synchronization on the Czech market. The thesis also suggests measures leading to better master data quality in FMCG and to the expansion of global data synchronization in the Czech Republic. The thesis consists of a theoretical and a practical part. The theoretical part defines several terms and explores supply chain operation and communication; it also covers the theory of data quality and its governance. The practical part focuses on the objectives of the thesis; their accomplishment is based on the results of a survey among FMCG suppliers and retailers in the Czech Republic. The thesis contributes to the enrichment of academic literature, which currently pays little attention to master data quality in FMCG and global data synchronization. Retailers and suppliers of FMCG can use the results of the thesis as an inspiration for improving the quality of their master data; a few methods of achieving better data quality are introduced. The thesis was assigned by the non-profit organization GS1 Czech Republic, which can use the results as one of the supporting materials for the development of its next global data synchronization strategy.
Data Quality Tools Benchmark
Černý, Jan ; Pejčoch, David (advisor) ; Máša, Petr (referee)
Companies all around the world are wasting their funds due to poor data quality. As the volume of processed data increases, the volume of erroneous data increases too. This diploma thesis explains what data quality is about, what the causes of data quality errors are, what the impact of poor data is and how it can be measured. If you can measure it, you can improve it. This is where data quality tools are used. Some vendors offer commercial data quality tools, while others offer open-source solutions. By comparing DataCleaner (an open-source tool) with DataFlux (a commercial tool) using defined criteria, this diploma thesis shows that the two tools are comparable in terms of data profiling, data enhancement and data monitoring. DataFlux is slightly better in standardization and data validation. Data deduplication is not included in the tested version of DataCleaner, although DataCleaner's vendor claimed it should be. One of the biggest obstacles to companies buying data quality tools could be their price. At this moment, DataCleaner can be considered an inexpensive solution for companies looking for a data profiling tool. If Human Inference added data deduplication to DataCleaner, it could also be considered an inexpensive solution covering the whole data quality process.
An analysis and implementation of Dashboards within SAP Business Objects 4.0/4.1
Kratochvíl, Tomáš ; Pour, Jan (advisor) ; Šedivá, Zuzana (referee)
The diploma thesis focuses on the analysis and distribution of dashboards and their subsequent implementation in the SAP Dashboards and Web Intelligence tools. The main goal of this thesis is an analysis of dashboards for different areas of company management according to the chosen solution architecture. Another goal is to take into account the principles of dashboards within the company; the thesis deals with indicator comparison as well. In the theoretical part, the author further defines the data life cycle within Business Intelligence and decomposes the particular dashboard types. The theoretical part closes with an important chapter on data quality, the data quality process and data quality improvement, as well as on the use of SAP Best Practices and KBAs for the BI tools published by SAP. The implementation of dashboards backs up the theoretical part. The implementation is divided into three chapters according to the selected architecture: using multi-source systems, using SAP InfoSets/Query, and using a Data Warehouse or Data Mart as the architecture for reporting purposes. The detailed implementation section should help readers form their own opinion on the different architectures, and especially on the differences between the BI tools within SAP Business Objects. At the end of each section on an architecture and its solution, the pros and cons are defined.
Impact of the process and data integration on reporting efficiency
Sys, Bohuslav ; Šebesta, Michal (advisor) ; Bruckner, Tomáš (referee)
Nowadays, when the difference between failure and success is the amount of available information, the exponential growth of information on the web leads to a rising need to track data quality. This trend is not only global; it affects individuals and companies in particular. Compared with the past, these companies produce a larger amount of data, which is at the same time more complex, all to get a better idea about the real world. This leads us to the main problem: we not only need to gather the data, we also have to present it in such a way that it can serve the purpose for which it was gathered. The purpose of this thesis is therefore to focus on the processes following data gathering: data quality and transformation processes. In the first part of the thesis we define the basic concepts and issues, followed by the methods necessary for acquiring the requested data in the expected quality, including the required infrastructure. In the second part we define a real-life example and use the knowledge from the previous part to design a usable solution and deploy it into use. In conclusion, we evaluate the design against the results acquired from its real-life utilization.
Data quality in the context of open and linked data
Tomčová, Lucie ; Chlapek, Dušan (advisor) ; Kučera, Jan (referee)
The master thesis deals with data quality in the context of open and linked data. One of the goals is to define the specifics of data quality in this context. The specifics are perceived mainly with respect to data quality dimensions (i.e. the data characteristics studied in data quality) and the possibilities of their measurement. The thesis also defines the effect on data quality connected with the transformation of data to linked data; the effect is defined with consideration of the possible risks and benefits that can influence data quality. A list of metrics, verified on real data (open linked data published by a government institution), is composed for the data quality dimensions considered relevant in the context of open and linked data. The thesis points to the need to recognize the differences specific to this context when assessing and managing data quality. At the same time, it offers possibilities for further study of this question and presents subsequent directions for both the theoretical and practical evolution of the topic.
MDM produktových dat (MDM of Product Data)
Čvančarová, Lenka ; Pour, Jan (advisor) ; Holes, David (referee)
This thesis is focused on Master Data Management of product data. At present, most publications on the topic of MDM deal with customer data, and a very limited number of sources focus solely on product data. Even those publications that do attempt to cover MDM in full depth are typically very customer-oriented. The lack of literature oriented to Product MDM became one of the motivations for this thesis. Another motivation was to outline and analyze the specifics of Product MDM in the context of its implementation and of the software requirements placed on a vendor of MDM application software. For this purpose, I chose to create and describe a methodology for implementing MDM of product data. The methodology was derived from personal experience on projects focused on MDM of customer data, applied to the findings of the theoretical part of this thesis. By analyzing the characteristics of product data and their impact on MDM implementation, as well as their requirements for application software, this thesis helps vendors of Customer MDM understand the challenges of Product MDM and thus embark on the Product MDM domain. Moreover, this thesis can also serve as an information resource for enterprises considering adopting MDM of product data into their infrastructure.
