
Modular neural networks applied to pattern recognition tasks

Gherman, Bogdan George (2016) Modular neural networks applied to pattern recognition tasks. Doctor of Philosophy (PhD) thesis, University of Kent. (KAR id:57814)

Language: English


Pattern recognition has become an accessible tool for developing advanced adaptive products. The need for such products is not diminishing; on the contrary, requirements for systems that are increasingly aware of their environmental circumstances are constantly growing. Feed-forward neural networks learn patterns from their training data without the relationships present in the data having to be discovered by hand. However, the problem of estimating the required size of a neural network is still unsolved. If we choose a network that is too small for a given task, it is unable to "comprehend" the intricacies of the data. On the other hand, if we choose a network that is too large, there are too many parameters to tune, we may fall into the "curse of dimensionality", or, worse still, the training algorithm can easily become trapped in local minima of the error surface. We therefore investigate possible ways to find the 'Goldilocks' size for a feed-forward neural network (one that is just right in some sense) given a training set. Furthermore, we apply the "divide et impera" strategy, a paradigm familiar from the Roman Empire and employed on a wide scale in computer programming: we divide a given dataset into multiple sub-datasets, solve the problem for each sub-dataset, and fuse the results of all the sub-problems to form the result for the initial problem as a whole. To this end, we investigated modular neural networks and their performance.
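The divide-et-impera scheme described in the abstract can be sketched in a few lines: split a dataset into sub-datasets, train one module per subset, then fuse the module outputs. This is a minimal illustrative sketch, not the thesis's actual method — the "module" here is a trivial nearest-centroid classifier standing in for a small feed-forward network, and all names (`nearest_centroid_module`, `fuse`, etc.) are hypothetical.

```python
import random

def nearest_centroid_module(subset):
    """Train one module on a 1-D sub-dataset: one centroid per class."""
    sums = {}
    for x, y in subset:
        s, n = sums.get(y, (0.0, 0))
        sums[y] = (s + x, n + 1)
    return {y: s / n for y, (s, n) in sums.items()}

def predict(module, x):
    """A module's decision: the class whose centroid is nearest."""
    return min(module, key=lambda y: abs(module[y] - x))

def fuse(modules, x):
    """Fuse the sub-problem results by majority vote over modules."""
    votes = [predict(m, x) for m in modules]
    return max(set(votes), key=votes.count)

# Toy 1-D dataset: class 0 clustered near 0.0, class 1 near 1.0.
random.seed(0)
data = ([(random.gauss(0.0, 0.1), 0) for _ in range(30)]
        + [(random.gauss(1.0, 0.1), 1) for _ in range(30)])
random.shuffle(data)

# Divide: three sub-datasets. Impera: one module per sub-dataset.
k = 3
subsets = [data[i::k] for i in range(k)]
modules = [nearest_centroid_module(s) for s in subsets]

print(fuse(modules, 0.05), fuse(modules, 0.95))  # prints: 0 1
```

The fusion rule (majority voting) is only one of many possible combiners; the same skeleton accommodates weighted voting or a trained gating network, which is closer to how modular neural networks are usually combined.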

Item Type: Thesis (Doctor of Philosophy (PhD))
Thesis advisor: Sirlantzis, Konstantinos
Thesis advisor: Deravi, Farzin
Thesis advisor: Fairhurst, Michael
Uncontrolled keywords: modular artificial neural networks; pattern recognition; meta-measurements; classifier systems; polynomial fitting; decision boundary; statistical goodness-of-fit; fine-tuning; architecture; modules; hidden nodes; synthetic data generation; tangent lines
Subjects: Q Science
T Technology
Divisions: Divisions > Division of Computing, Engineering and Mathematical Sciences > School of Engineering and Digital Arts
Date Deposited: 07 Oct 2016 15:00 UTC
Last Modified: 09 Dec 2022 18:54 UTC

