Národní úložiště šedé literatury

Lease of an Apartment by Spouses and Use of a Cooperative Apartment by Spouses under the New Regulation Effective 1 January 2014
Prantlová, Soňa ; Kadlecová, Eva (supervisor) ; Pavla, Pavla (opponent)
This diploma thesis addressed the lease of an apartment by spouses and its use as enshrined in the new statutory regulation of the Civil Code, Act No. 89/2012 Coll., which replaced the Civil Code of 1964 in force until then. The new legal regulation introduces a number of new institutes whose main purpose is to protect the weaker party, in this case the tenant. The thesis was divided into a theoretical and a practical part. The theoretical part defined the basic terms, characterised the rights of the tenant and the landlord, and analysed the legal regulation of housing under the new Civil Code. The practical part was devoted to interpreting the results of a questionnaire survey in which tenants of several apartment buildings in the town of Kralupy nad Vltavou were approached. Based on the findings, several recommendations were proposed to raise awareness of tenants' rights and obligations and of the legal aspects of housing in general.

Design of Experiments for Non-stationary Production Processes
Jadrná, Monika ; Macák, Tomáš (supervisor)
This dissertation focuses on the service sector and on mass production, specifically on optimising the product portfolio of a travel agency and optimising the production of ammunition cartridges. The literature review explains the terminology of decision making and describes the methods used for decision support; it provides an up-to-date overview of the topic and defines the basic concepts. The theoretical background of the research concerns, in the area of services, the choice of suitable input variables, and in the area of production, the choice of a specific material and suitable equipment for the given production. Together, the literature review and the theoretical background form the basis for the practical part of the thesis. In the practical part, a specific company operating in each sector is selected. In the service sector, the product portfolio is optimised using fuzzy logic and fuzzy sets so that the company can compete and operate on today's highly competitive market. In the production sector, the optimal composition of the product is determined so that its required properties are achieved. The main goal of the dissertation is to propose a methodological approach for managing selected business processes with a non-stationary time course. In the practical implementation, the goal is to verify the functionality of the proposed methodological approach both in services and in mass production.
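As a rough illustration of the fuzzy-set ranking mentioned above: the thesis does not publish its membership functions, criteria or weights, so everything in the following Python sketch (the two criteria, the triangular memberships, the weights and the sample tours) is a hypothetical stand-in, not the author's model.

# Illustrative sketch only: criteria, membership functions and weights are invented.
def triangular(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def portfolio_score(margin_pct, demand_index, weights=(0.6, 0.4)):
    """Aggregate fuzzy memberships of two criteria into one ranking score."""
    good_margin = triangular(margin_pct, 5, 20, 35)        # "good margin" fuzzy set
    high_demand = triangular(demand_index, 0.3, 0.8, 1.0)  # "high demand" fuzzy set
    w_m, w_d = weights
    return w_m * good_margin + w_d * high_demand

tours = {"Croatia bus tour": (12, 0.9), "Alps ski week": (28, 0.5)}
ranked = sorted(tours, key=lambda t: portfolio_score(*tours[t]), reverse=True)
print(ranked)  # products ordered by fuzzy suitability for the portfolio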

Community Care in Psychiatry
BÍNOVÁ, Romana
This diploma thesis deals with community care in psychiatry and attempts to capture the role the nurse plays in providing it. Community psychiatric care is a very broad area of intermediate support for the patient, touching essentially every area of his or her life. Although it has not yet reached adequate development in the Czech Republic, its benefit to patients is already significant and its importance continues to grow. The theoretical part, after a short introduction to community care, discusses its history, its principles, and its connection with nursing care. Attention is then paid to the individual areas of community care relevant to the mentally ill, and the role of the nurse in this field is described. This is followed by a discussion of attitudes toward the mentally ill, the problem of stigmatisation, and the organisation of psychiatric care. Finally, the theoretical part reviews the psychiatric disorders that may be encountered in community care and explains how community care can benefit patients with these disorders. The aim of the practical part was to find out how aware psychiatric nurses are of community care and what importance they attach to it in psychiatry, and also to map the areas of community care in which nurses can work. In a quantitative survey, nurses' answers to the investigated hypotheses were processed statistically. The hypotheses examined whether nurses with more than ten years of practice more often believe that community psychiatric care is more beneficial for patients than hospitalisation; whether nurses with higher than secondary education are better informed about the provision of community services; whether nurses consider housing-support services to be the most common form of community care in the Czech Republic; and whether nurses over thirty are more aware of the importance of the nurse's position in community care. None of these hypotheses was confirmed. The hypothesis that nurses obtain information about community care more from the internet and literature than from seminars was confirmed. The survey involved psychiatric nurses working in psychiatric hospitals in the South Moravian Region and the Vysočina Region. The practical part also includes an analysis of community services in these regions; a total of 13 civic associations and community centres were mapped in the two regions.
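For readers unfamiliar with how such survey hypotheses are evaluated, the sketch below shows a chi-square test of independence on a contingency table; the thesis does not state which test was applied, and the counts are invented for illustration only.

# Hypothetical illustration: the test choice and the contingency counts are assumptions,
# not the survey's real data.
from scipy.stats import chi2_contingency

# Rows: practice <= 10 years vs. > 10 years
# Columns: "community care more beneficial than hospitalisation" yes / no
observed = [[18, 22],
            [25, 15]]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
# The hypothesis would be supported only if p fell below the usual 0.05 threshold.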

Automata in Infinite-state Formal Verification
Lengál, Ondřej ; Jančar, Petr (opponent) ; Veith, Helmut (opponent) ; Esparza, Javier (opponent) ; Vojnar, Tomáš (supervisor)
The work presented in this thesis focuses on finite-state automata over finite words and finite trees and on the use of such automata in the formal verification of infinite-state systems. First, we focus on extensions of a previously introduced tree-automata-based framework for the verification of heap-manipulating programs, in particular programs with complex dynamic data structures. We propose several extensions to the framework, such as making it fully automated or extending it to consider ordering over data values. Further, we propose novel decision procedures for two logics that are often used in formal verification: separation logic and weak monadic second-order logic of one successor. These decision procedures are based on translating the problem into the domain of automata and subsequently manipulating it in the target domain. Finally, we have also developed new approaches for efficient manipulation of tree automata, mainly for testing language inclusion and for handling automata with large alphabets, and implemented them in a library for general use. The developed algorithms serve as the key technology that makes the above-mentioned techniques feasible in practice.
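To make the inclusion-testing problem concrete, the following sketch checks L(A) ⊆ L(B) for plain finite word automata by an on-the-fly subset construction; the thesis itself targets tree automata and far more efficient antichain- and simulation-based algorithms, and the NFA encoding used here is only an illustrative assumption.

# Minimal sketch of language-inclusion checking for word automata (not the thesis's algorithm).
from collections import deque

def inclusion(nfa_a, nfa_b):
    """nfa = (initial_states, transitions {(state, symbol): {states}}, accepting_states)."""
    init_a, delta_a, acc_a = nfa_a
    init_b, delta_b, acc_b = nfa_b
    alphabet = {s for (_, s) in delta_a} | {s for (_, s) in delta_b}
    start = [(q, frozenset(init_b)) for q in init_a]
    visited, work = set(start), deque(start)
    while work:
        qa, macro_b = work.popleft()
        if qa in acc_a and not (macro_b & acc_b):
            return False                      # counterexample word found
        for sym in alphabet:
            for qa2 in delta_a.get((qa, sym), set()):
                macro2 = frozenset(q2 for qb in macro_b
                                   for q2 in delta_b.get((qb, sym), set()))
                if (qa2, macro2) not in visited:
                    visited.add((qa2, macro2))
                    work.append((qa2, macro2))
    return True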

New Methods for Increasing Efficiency and Speed of Functional Verification
Zachariášová, Marcela ; Dohnal, Jan (opponent) ; Steininger, Andreas (opponent) ; Kotásek, Zdeněk (supervisor)
In the development of current hardware systems, e.g. embedded systems or computer hardware, new ways to increase their reliability are being intensively investigated. One way to tackle the reliability issue is to increase the efficiency and speed of the verification processes performed in the early phases of the design cycle. This Ph.D. thesis focuses on the verification approach called functional verification. Several challenges and problems connected with the efficiency and speed of functional verification are identified and reflected in the goals of the thesis. The first goal focuses on reducing the simulation runtime when verifying complex hardware systems, since the simulation of inherently parallel hardware systems is very slow compared to the speed of real hardware. An optimization technique is proposed that moves the verified system onto an FPGA acceleration board while the rest of the verification environment runs in simulation; by this single move, the simulation overhead can be significantly reduced. The second goal deals with manually written verification environments, which represent a huge bottleneck in verification productivity. This manual effort is largely unnecessary, because almost all verification environments have the same structure: they utilize libraries of basic components from the standard verification methodologies and are only adjusted to the system being verified. Therefore, the second optimization technique takes a high-level specification of the system and automatically generates a comprehensive verification environment for it. The third goal elaborates how the completeness of the verification process can be achieved using intelligent automation. Completeness is measured by different coverage metrics, and verification usually ends when a satisfying level of coverage is achieved. Therefore, the third optimization technique drives the generation of input stimuli in order to activate multiple coverage points in the verified system and to increase the overall coverage rate. The main optimization tool is a genetic algorithm, adapted for functional verification purposes and with its parameters tuned for this domain. It runs in the background of the verification process, analyses the coverage, and dynamically changes the constraints of the stimuli generator; the constraints are represented by the probabilities with which particular values from the input domain are selected. The fourth goal discusses the reusability of verification stimuli for regression testing and how these stimuli can be further optimized in order to speed up the testing. It is quite common in verification that, until a satisfying level of coverage is achieved, many redundant stimuli are evaluated, as they are produced by pseudo-random generators. However, when creating optimal regression suites, this redundancy is no longer needed and can be removed, while it is important to retain the same level of coverage in order to check all the key properties of the system. The fourth optimization technique is also based on the genetic algorithm, but it is not integrated into the verification process; it works offline after the verification has ended. It removes the redundancy from the original suite of stimuli quickly and effectively, so the resulting verification runtime of the regression suite is significantly improved.
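A rough sketch of the coverage-driven loop behind the third technique is given below; the chromosome encoding (a probability vector over input values), the toy coverage measure and all parameters are assumptions for illustration, not the thesis's actual implementation.

# Hedged sketch: a genetic algorithm tuning stimulus-generator probabilities from coverage feedback.
import random

VALUES = list(range(8))          # hypothetical input domain of the stimuli generator

def coverage(prob_vector, trials=200):
    """Toy fitness: fraction of distinct coverage points hit (here, point = input value seen)."""
    hit = set()
    for _ in range(trials):
        hit.add(random.choices(VALUES, weights=prob_vector)[0])
    return len(hit) / len(VALUES)

def mutate(p, rate=0.2):
    q = [max(1e-3, w + random.uniform(-rate, rate)) for w in p]
    s = sum(q)
    return [w / s for w in q]

def crossover(p1, p2):
    cut = random.randrange(1, len(p1))
    child = p1[:cut] + p2[cut:]
    s = sum(child)
    return [w / s for w in child]

population = [mutate([1 / len(VALUES)] * len(VALUES)) for _ in range(10)]
for generation in range(20):
    population.sort(key=coverage, reverse=True)
    parents = population[:4]                      # elitist selection
    population = parents + [mutate(crossover(*random.sample(parents, 2)))
                            for _ in range(6)]

best = max(population, key=coverage)
print("best constraint vector:", [round(w, 2) for w in best])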

Packet Classification Algorithms
Puš, Viktor ; Lhotka, Ladislav (opponent) ; Dvořák, Václav (supervisor)
This thesis deals with packet classification in computer networks. Classification is a key task in many networking devices, most notably in packet filters (firewalls), so the thesis concerns the area of computer security. It focuses on high-speed networks with bandwidths of 100 Gb/s and beyond. General-purpose processors cannot be used in such cases because their performance is not sufficient; therefore, specialized hardware is used, mainly ASICs and FPGAs. Many packet classification algorithms designed for hardware implementation have been presented, yet these approaches are not ready for very high-speed networks. This thesis addresses the design of new high-speed packet classification algorithms targeted at implementation in dedicated hardware. An algorithm is proposed that decomposes the problem into several easier subproblems. The first subproblem is the longest prefix match (LPM) operation, which is also used in IP packet routing. As LPM algorithms with sufficient speed have already been published, they can be used in our context. The following subproblem is mapping the prefixes to rule numbers. This is where the thesis brings innovation, by using a specifically constructed hash function. This hash function allows the mapping to be done in constant time and requires only one memory with a narrow data bus. The algorithm throughput can be determined analytically and is independent of the number of rules and of the network traffic characteristics. With available parts, a throughput of 266 million packets per second can be achieved. Three additional algorithms (PFCA, PCCA, MSPCCA) presented in this thesis are designed to lower the memory requirements of the first one without compromising its speed. The second algorithm lowers the memory size by 11 % to 96 %, depending on the rule set. Its disadvantage of low stability is removed by the third algorithm, which reduces the memory requirements by 31 % to 84 % compared to the first one. The fourth algorithm combines the third one with an older approach and, thanks to the use of several techniques, lowers the memory requirements by 73 % to 99 %.
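The two-stage decomposition can be illustrated in software as an LPM step followed by a prefix-to-rule lookup; the thesis implements the second stage with a specially constructed hash function in hardware, so the binary trie and plain dictionary below are only a stand-in showing the interface between the stages.

# Software illustration of the decomposition; not the hardware hash-based mapping from the thesis.
def build_trie(prefixes):
    """prefixes: {'1101': rule_id, ...} keyed by bit strings."""
    trie = {}
    for bits, rule in prefixes.items():
        node = trie
        for b in bits:
            node = node.setdefault(b, {})
        node['rule'] = rule
    return trie

def longest_prefix_match(trie, addr_bits):
    """Walk the trie and remember the last node that carried a rule number."""
    best, node = None, trie
    for b in addr_bits:
        if 'rule' in node:
            best = node['rule']
        if b not in node:
            break
        node = node[b]
    else:
        if 'rule' in node:
            best = node['rule']
    return best

rules = {'1101': 7, '11': 3, '0': 1}
print(longest_prefix_match(build_trie(rules), '110111'))  # -> 7 (the most specific match)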

Harnessing Forest Automata for Verification of Heap Manipulating Programs
Šimáček, Jiří ; Abdulla, Parosh (opponent) ; Křetínský, Mojmír (opponent) ; Vojnar, Tomáš (supervisor)
This work addresses the verification of infinite-state systems, more specifically the verification of programs manipulating complex dynamic linked data structures. Many different approaches have emerged to date, but none of them provides a sufficiently robust solution that would succeed in all possible scenarios appearing in practice. Therefore, in this work, we propose a new approach that aims at improving the current state of the art in several dimensions. Our approach is based on using tree automata, but it is also partially inspired by some ideas taken from methods based on separation logic. Apart from that, we also present multiple advancements in the implementation of various tree automata operations that are crucial for our verification method to succeed in practice. Namely, we provide an optimised algorithm for computing simulations over labelled transition systems, which then translates into a more efficient computation of simulations over tree automata. We also give a new algorithm for checking inclusion over tree automata, and we provide experimental evaluation demonstrating
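For orientation, the following sketch computes a simulation preorder over a small labelled transition system by naive fixpoint refinement; the thesis contributes a considerably more efficient algorithm, and the toy LTS here is invented purely to illustrate what the relation is.

# Naive fixpoint computation of a simulation preorder over an LTS (illustration only).
def simulation(states, labels, delta):
    """delta: {(p, a): {successor states}}.  Returns {(p, q) | q simulates p}."""
    sim = {(p, q) for p in states for q in states}
    changed = True
    while changed:
        changed = False
        for (p, q) in list(sim):
            for a in labels:
                for p2 in delta.get((p, a), set()):
                    # q must match every a-move of p by an a-move to a simulating state
                    if not any((p2, q2) in sim for q2 in delta.get((q, a), set())):
                        sim.discard((p, q))
                        changed = True
                        break
                else:
                    continue
                break
    return sim

states, labels = {'s', 't', 'u'}, {'a'}
delta = {('s', 'a'): {'t'}, ('t', 'a'): {'u'}, ('u', 'a'): {'u'}}
print(('s', 'u') in simulation(states, labels, delta))  # True: u simulates s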

Optimization of Gaussian Mixture Subspace Models and Related Scoring Algorithms in Speaker Verification
Glembek, Ondřej ; Brummer, Niko (opponent) ; Campbell, William (opponent) ; Burget, Lukáš (supervisor)
This thesis deals with Gaussian mixture subspace modeling in automatic speaker recognition and consists of three parts. In the first part, Joint Factor Analysis (JFA) scoring methods are studied. The methods differ mainly in how they deal with the channel of the tested utterance. The general JFA likelihood function is investigated and the methods are compared both in terms of accuracy and speed. It was found that a linear approximation of the log-likelihood function gives results comparable to the full log-likelihood evaluation while simplifying the formula and dramatically reducing the computation time. In the second part, i-vector extraction is studied and two simplification methods are proposed. The motivation for this part was to allow the state-of-the-art technique to be used on small-scale devices and to set up a simple discriminative-training system. It is shown that, for long utterances, very fast and compact i-vector systems can be obtained at the cost of some accuracy, while on a short-utterance (5-second) task the results of the simplified systems are comparable to the full i-vector extraction. The third part deals with discriminative training in automatic speaker recognition. Previous work in the field is summarized and, based on the knowledge from the earlier chapters of this work, discriminative training of the i-vector extractor parameters is proposed. It is shown that discriminative re-training of the i-vector extractor can improve the system if the initial estimate is computed using the generative approach.
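For context, the generative i-vector extraction step that the proposed simplifications start from is commonly written as follows; the notation is the usual one from the total-variability literature and is not taken verbatim from the thesis.

% Standard i-vector extraction (common notation); the thesis's simplifications are not reproduced here.
\[
  \mathbf{w}(u) = \bigl(\mathbf{I} + \mathbf{T}^{\top}\boldsymbol{\Sigma}^{-1}\mathbf{N}(u)\,\mathbf{T}\bigr)^{-1}\,
                  \mathbf{T}^{\top}\boldsymbol{\Sigma}^{-1}\tilde{\mathbf{F}}(u),
\]
where \(\mathbf{T}\) is the low-rank total-variability matrix, \(\boldsymbol{\Sigma}\) the UBM covariance, and \(\mathbf{N}(u)\), \(\tilde{\mathbf{F}}(u)\) are the zero-order and centered first-order Baum-Welch statistics of utterance \(u\).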

Network-wide Security Analysis
de Silva, Hidda Marakkala Gayan Ruchika ; Šafařík, Jiří (opponent) ; Šlapal, Josef (opponent) ; Švéda, Miroslav (supervisor)
The objective of this research is to model and analyze the effects of dynamic routing protocols. The thesis addresses the analysis of service reachability, configurations, routing and security filters in dynamic networks in the event of device or link failures. The research contains two main sections, namely modeling and analysis. The first section consists of modeling the network topology, protocol behaviors, device configurations and filters; graph algorithms, routing redistribution theory, relational algebra and temporal logics were used in the modeling. For the analysis of reachability, a modified topology table was introduced: a unique centralized table for a given network that is invariant across network states. For the analysis of configurations, a constraint-based analysis was developed using XSD Prolog. Routing and redistribution were analyzed using routing information bases, and for analyzing the filtering rules a SAT-based decision procedure was incorporated. Part of the analysis was integrated into a simulation tool in the OMNeT++ environment. Several innovations are introduced in this thesis: the filtering network graph, the modified topology table, a general state that reduces the state space, the modeling of devices as filtering nodes, and the constraint-based analysis are the key innovations. The abstract network graph, the forwarding device model and redistribution with routing information are extensions of existing research. Finally, it can be concluded that this thesis presents novel approaches, modeling methods and analysis techniques in the area of dynamic networks. Integrating these methods into a simulation tool will make them highly valuable to network designers and administrators.
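The idea of reachability analysis over a filtering network graph can be sketched as a graph search in which each device may drop a flow according to its filter; the topology, flow and filters below are invented, and the thesis's modified topology table and SAT-based filter analysis are not reproduced.

# Toy reachability check on a network graph whose nodes carry filter predicates.
from collections import deque

def reachable(graph, filters, src, dst, flow):
    """graph: {node: [neighbours]}, filters: {node: predicate(flow) -> bool}."""
    seen, work = {src}, deque([src])
    while work:
        node = work.popleft()
        if node == dst:
            return True
        for nxt in graph.get(node, []):
            # the flow traverses nxt only if nxt's filter permits it
            if nxt not in seen and filters.get(nxt, lambda f: True)(flow):
                seen.add(nxt)
                work.append(nxt)
    return False

graph = {'h1': ['r1'], 'r1': ['fw', 'r2'], 'fw': ['srv'], 'r2': ['srv']}
filters = {'fw': lambda f: f['dport'] != 23}          # firewall drops telnet
flow = {'src': 'h1', 'dst': 'srv', 'dport': 23}
print(reachable(graph, filters, 'h1', 'srv', flow))   # True: the r2 path bypasses the firewall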

Relational Verification of Programs with Integer Data
Konečný, Filip ; Bouajjani, Ahmed (opponent) ; Jančar, Petr (opponent) ; Vojnar, Tomáš (supervisor)
This work presents novel methods for the verification of reachability and termination properties of programs that manipulate unbounded integer data. Most of these methods are based on acceleration techniques, which compute transitive closures of program loops. We first present an algorithm that accelerates several classes of integer relations and show that the new method performs up to four orders of magnitude better than the previous ones. On the theoretical side, our framework provides a common solution to the acceleration problem by proving that the considered classes of relations are periodic. Subsequently, we introduce a semi-algorithmic reachability analysis technique that tracks relations between variables of integer programs and applies the proposed acceleration algorithm to compute summaries of procedures in a modular way. Next, we present an alternative approach to reachability analysis that integrates predicate abstraction with our acceleration techniques to increase the likelihood of convergence of the algorithm. We evaluate these algorithms and show that they can handle a number of complex integer programs where previous approaches failed. Finally, we study the termination problem for several classes of program loops and show that it is decidable. Moreover, for some of these classes, we design a polynomial-time algorithm that computes the exact set of program configurations from which nonterminating runs exist. We further integrate this algorithm into a semi-algorithmic method that analyzes termination of integer programs, and show that the resulting technique can verify termination properties of several non-trivial integer programs.
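As a toy illustration of the acceleration idea (not a relation class treated verbatim in the thesis): composing a simple loop relation k times and existentially quantifying over k yields its transitive closure in closed form,

\[
  R(x, x') \equiv x' = x + 2
  \quad\Longrightarrow\quad
  R^{+}(x, x') \equiv \exists k \geq 1.\; x' = x + 2k,
\]

which is definable in Presburger arithmetic; roughly speaking, the periodicity established in the thesis is what guarantees that such a closed form of the transitive closure exists for the considered relation classes.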