Novel Algorithms for Noisy Minimisation Problems with Applications to Neural Networks Training

Liu, Steve Wenbin and Sirlantzis, Konstantinos and Lamb, John D. (2006) Novel Algorithms for Noisy Minimisation Problems with Applications to Neural Networks Training. Journal of Optimization Theory and Applications, 129 (2). pp. 325-340. ISSN 0022-3239.

The full text of this publication is not available from this repository.
Official URL: http://dx.doi.org/10.1007/s10957-006-9066-z

Abstract

The supervisor and searcher cooperation (SSC) framework, introduced in Refs. 1 and 2, provides an effective way to design efficient optimization algorithms that combine the desirable features of two existing algorithms. This work develops efficient algorithms for a wide range of noisy optimization problems, including those arising in feedforward neural network training. It introduces two basic SSC algorithms: the first appears well suited to generic problems, while the second is motivated by neural network training problems. It also introduces inexact variants of both algorithms, which appear to possess desirable properties. General theoretical results on the convergence and speed of SSC algorithms are established, and their appealing attributes are illustrated through numerical tests on deterministic, stochastic, and neural network training problems.
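Only the abstract is available from this record, so the paper's actual SSC algorithms cannot be reproduced here. The sketch below is a toy illustration of the general supervisor/searcher idea on a simple noisy quadratic objective, not the article's method: a fast "searcher" proposes gradient steps from noisy gradient estimates, while a "supervisor" judges progress against a noise-robust moving average of accepted objective values and shrinks the step size when progress stalls. All function and parameter names (noisy_objective, ssc_like_minimise, window, and so on) are hypothetical and are not taken from the article or from Refs. 1 and 2.

    # Illustrative sketch only; the SSC algorithms of the paper are not reproduced here.
    import numpy as np

    rng = np.random.default_rng(0)

    def noisy_objective(x, sigma=0.1):
        """True objective ||x||^2 observed with additive Gaussian noise."""
        return float(x @ x) + sigma * rng.standard_normal()

    def noisy_gradient(x, sigma=0.1):
        """True gradient 2x observed with additive Gaussian noise."""
        return 2.0 * x + sigma * rng.standard_normal(x.shape)

    def ssc_like_minimise(x0, iters=200, step=0.2, window=10):
        """Toy cooperation loop: a 'searcher' proposes aggressive gradient steps,
        while a 'supervisor' accepts or rejects them by comparing against a
        moving average of previously accepted noisy objective values."""
        x = np.asarray(x0, dtype=float)
        history = [noisy_objective(x)]
        for _ in range(iters):
            # Searcher: fast, possibly unreliable step using a noisy gradient.
            proposal = x - step * noisy_gradient(x)
            f_prop = noisy_objective(proposal)
            # Supervisor: compare against a noise-robust moving average rather
            # than a single noisy sample.
            recent_avg = float(np.mean(history[-window:]))
            if f_prop < recent_avg:
                x = proposal              # accept the searcher's proposal
                history.append(f_prop)
            else:
                step *= 0.5               # rein in the searcher, keep current point
        return x

    if __name__ == "__main__":
        x_final = ssc_like_minimise(np.array([3.0, -2.0]))
        print("approximate minimiser:", x_final)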

Item Type: Article
Subjects: H Social Sciences > H Social Sciences (General)
Divisions: Faculties > Social Sciences > Kent Business School > Management Science
Depositing User: Jennifer Knapp
Date Deposited: 20 Aug 2010 11:20
Last Modified: 11 Jul 2014 13:06
Resource URI: http://kar.kent.ac.uk/id/eprint/25369