Kent Academic Repository

Comments on "Pruning error minimization in least squares support vector machines"

Kuh, A., De Wilde, Philippe (2007) Comments on "Pruning error minimization in least squares support vector machines". IEEE Transactions on Neural Networks, 18 (2). pp. 606-609. ISSN 1045-9227. (doi:10.1109/TNN.2007.891590) (The full text of this publication is not currently available from this repository. You may be able to access a copy if URLs are provided) (KAR id:93360)

Official URL:
https://doi.org/10.1109/TNN.2007.891590

Abstract

In this letter, we comment on "Pruning Error Minimization in Least Squares Support Vector Machines" by B. J. de Kruif and T. J. A. de Vries. The original paper proposes a way of pruning training examples for least squares support vector machines (LS-SVM) using no regularization (γ = ∞). This causes a problem, as the derivation involves inverting a matrix that is often singular. We discuss a modification of this algorithm that prunes with regularization (γ finite and nonzero) and is also computationally more efficient.
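The core point of the comment can be illustrated with a minimal sketch. In LS-SVM, training reduces to solving a linear system in the kernel matrix; with finite γ a ridge term I/γ is added, making the system matrix positive definite and hence always invertible, whereas γ = ∞ drops that term and the bare kernel matrix may be singular. The snippet below (bias term omitted, RBF kernel, and a naive smallest-|α| pruning loop — all illustrative assumptions, not the exact algorithm of the paper or the comment) shows the regularized fit and prune-retrain cycle:

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    # Gaussian (RBF) kernel matrix between row sets X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # LS-SVM dual (bias omitted for brevity): solve (K + I/gamma) alpha = y.
    # Finite gamma keeps the system matrix positive definite, so it is
    # always invertible; with gamma = infinity the bare kernel matrix K
    # can be singular -- the issue raised in the comment.
    K = rbf_kernel(X, X, sigma)
    return np.linalg.solve(K + np.eye(len(y)) / gamma, y)

def prune_smallest_alpha(X, y, n_keep, gamma=10.0, sigma=1.0):
    # Naive pruning loop (hypothetical criterion, for illustration only):
    # drop the training point with the smallest |alpha| and retrain,
    # until n_keep points remain. Returns the indices that are kept.
    idx = np.arange(len(y))
    while len(idx) > n_keep:
        alpha = lssvm_fit(X[idx], y[idx], gamma, sigma)
        idx = np.delete(idx, np.argmin(np.abs(alpha)))
    return idx

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=40)
kept = prune_smallest_alpha(X, y, n_keep=10)
print(len(kept))  # 10
```

The retrain-from-scratch loop above is deliberately simple; the papers under discussion instead update the solution incrementally when a point is removed, which is where the computational-efficiency claim comes from.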

Item Type: Article
DOI/Identification number: 10.1109/TNN.2007.891590
Uncontrolled keywords: Least squares kernel methods, Online updating, Pruning, Regularization
Subjects: Q Science > QA Mathematics (inc Computing science) > QA 76 Software, computer programming
Divisions: Divisions > Division of Computing, Engineering and Mathematical Sciences > School of Computing
Depositing User: Philippe De Wilde
Date Deposited: 20 Dec 2022 09:43 UTC
Last Modified: 09 Jan 2023 11:23 UTC
Resource URI: https://kar.kent.ac.uk/id/eprint/93360 (The current URI for this page, for reference purposes)
