National Repository of Grey Literature: 4 records found
Multi-layered neural networks and visualization of their structure
Drobný, Michal ; Mrázová, Iveta (advisor) ; Kukačka, Marek (referee)
Multi-layered neural networks of the back-propagation type are well known for their universal approximation capability, and even the standard back-propagation training algorithm used to adjust them often yields results applicable to real-world problems. The present study deals with multi-layered neural networks. It describes selected variants of training algorithms, chiefly the standard back-propagation algorithm and the scaled conjugate gradient algorithm, which ranks among the fast second-order methods. Part of the study is an application for visualising the structure of multi-layered neural networks, designed with its potential use in teaching artificial intelligence in mind. The first part of the study introduces the subject matter and formally describes both algorithms, followed by a short description and analysis of other variants of the algorithms. The next part discusses the choice of a suitable programming language for implementing the application, specifies the goals, and describes the implementation work. The conclusion summarises the results of speed and implementation tests against the selected non-commercial software ENCOG.
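To make the standard back-propagation algorithm named in the abstract concrete, the following is a minimal sketch of training a one-hidden-layer sigmoid network by gradient descent. It is an illustrative example, not the thesis implementation; the XOR task, layer sizes, learning rate, and epoch count are assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR training set: the classic task a single-layer network cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# weights for input->hidden and hidden->output layers
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

def forward(X):
    h = sigmoid(X @ W1 + b1)    # hidden activations
    out = sigmoid(h @ W2 + b2)  # network output
    return h, out

def mse(out):
    return float(np.mean((out - y) ** 2))

_, out0 = forward(X)
loss0 = mse(out0)  # error before training

lr = 1.0
for _ in range(5000):
    h, out = forward(X)
    # backward pass: propagate the error signal layer by layer
    d_out = (out - y) * out * (1 - out)   # dE/d(net) at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)    # dE/d(net) at the hidden layer
    # gradient-descent weight updates
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

_, out1 = forward(X)
loss1 = mse(out1)  # error after training
```

The derivative factors `out * (1 - out)` and `h * (1 - h)` are the sigmoid's derivative expressed through its own output, which is what makes the backward pass cheap for this activation.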
Knowledge Extraction with BP-networks
Reitermanová, Zuzana
Title: Knowledge Extraction with BP-networks Author: Zuzana Reitermanová Department: Department of Software Engineering Supervisor: Doc. RNDr. Iveta Mrázová, CSc. Supervisor's e-mail address: mrazova@ksi.ms.mff.cuni.cz Abstract: Multi-layered neural networks of the back-propagation type are well known for their universal approximation capability. Already the standard back-propagation training algorithm used for their adjustment often provides applicable results. However, efficient solutions to the complex tasks currently dealt with require quick convergence and a transparent network structure. This supports both an improved generalization capability of the formed networks and an easier interpretation of their function later on. Various techniques used to optimize the structure of the networks, such as learning with hints, pruning, and sensitivity analysis, are expected to improve generalization, too. One of the fast learning algorithms is the conjugate gradient method. In this thesis, we discuss, test and analyze the above-mentioned methods. Then, we derive a new technique combining their advantages. The proposed algorithm is based on the rapid scaled conjugate gradient technique. This classical method is enhanced with the enforcement of a transparent internal knowledge...
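The conjugate gradient method that this abstract builds on owes its speed to a simple property: on an n-dimensional quadratic error surface it reaches the minimum in at most n steps, whereas plain gradient descent may need many more. The sketch below illustrates this on a small quadratic; the matrix and vector are stand-ins for a network's local curvature and gradient (assumptions for illustration, not part of the thesis).

```python
import numpy as np

# Minimize f(x) = 0.5 x^T A x - b^T x for a symmetric positive-definite A.
# The unique minimum is the solution of A x = b.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

def conjugate_gradient(A, b, x0, iters):
    x = x0.copy()
    r = b - A @ x   # residual = negative gradient of f at x
    d = r.copy()    # first search direction: steepest descent
    for _ in range(iters):
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)        # exact line search on the quadratic
        x = x + alpha * d
        r_new = r - alpha * Ad
        beta = (r_new @ r_new) / (r @ r)  # Fletcher-Reeves coefficient
        d = r_new + beta * d              # new direction, A-conjugate to the old
        r = r_new
    return x

# Two iterations suffice for this 2-dimensional problem.
x = conjugate_gradient(A, b, np.zeros(2), iters=2)
```

The scaled conjugate gradient variant used in the thesis replaces the exact line search with a scaling mechanism based on a Levenberg-Marquardt-style damping term, which is what makes it practical for non-quadratic network error surfaces.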
